July 29, 2006

other: Sex could be "killer app" for space

Sex would be "the killer app of space tourism ... because every couple who goes up there, or threesome or whatever their personal choice is, is going to want to try this." However, there are difficulties to be overcome.



Vanna Bonta's concept for the "2suit" garment includes Velcro strips, zippers and diaphanous inner material that would be designed for intimacy in the near-weightless environment of space.

Bigelow Aerospace is planning to launch a space hotel.

July 28, 2006

Nanoparticle infrared detector is ultrasensitive, cheap

Canadian researchers have developed an inexpensive and highly sensitive infrared chip that could improve night-vision goggles and medical imaging. Made by spin-coating a glass slide or silicon chip with a solution of conducting nanoparticles called quantum dots, the detector is 10 times more sensitive than traditional infrared detectors.

The team that designed the chip is led by Edward Sargent, who holds the Canada Research Chair in Nanotechnology at the University of Toronto. The chip picks up the near-infrared and short-wave infrared (SWIR) bands. SWIR light is abundant at night, even when it's cloudy or moonless. In such conditions, conventional night-vision goggles, which work by amplifying starlight from the redder near-infrared band, are ineffectual.

Sargent says infrared cameras based on InGaAs chips now cost $40,000 to $60,000, whereas his technology could lead to much cheaper cameras. The cost of coating a square meter with the quantum dot solution is $17, he says, and speculates that infrared cameras might one day cost as little as today's digital cameras.

Singularity/AI related: neuron responses watched in living animals

My reaction to this advance: holy crap. It is a very creative piece of work, and it will massively and rapidly increase detailed understanding of brain function.

Thanks to a new imaging system, researchers at MIT's Picower Institute for Learning and Memory have gotten an unprecedented look into how genes shape the brain in response to the environment. This is the first study to demonstrate the ability to directly visualize the molecular activity of individual neurons in the brains of live animals at single-cell resolution, and to observe changes in the activity of the same neurons in response to changes in the environment, daily, over the course of a week.

This advance, coupled with other brain disease models, could "offer unparalleled advantages in understanding pathological processes in real time, leading to potential new drugs and treatments for a host of neurological diseases and mental disorders," said Nobel laureate Susumu Tonegawa, a co-author of the study.

Tonegawa, director of the Picower Institute and the Picower Professor of Biology and Neuroscience at MIT, Wang and colleagues found that visual experience induces a protein that works as a molecular "filter" to enhance the overall selectivity of the brain's responses to visual stimuli.

The protein, called "Arc," was previously detected in the hippocampus, where it is believed to help store lasting memories by strengthening synapses, the connections between neurons. The Picower Institute's unexpected finding was that Arc also blocks the activity of neurons with low orientation selectivity that are not well "tuned" to vertical and horizontal lines, while keeping neurons with high orientation selectivity.

To come up with a better way to investigate this, the MIT team developed a state-of-the-art imaging system in which transparent cranial windows were implanted over the primary visual cortex, allowing the researchers to monitor over time the expression of proteins in the brains of live mice.

The study exploited the power of two-photon microscopy (so-called because it uses two infrared photons to emit fluorescence in tissue), which allows imaging of living tissue up to 1 millimeter deep, enough for researchers to see proteins expressed within individual neurons within the brain.

They then created a mouse model in which a coding portion of the Arc gene was replaced with a jellyfish gene encoding a green fluorescent protein (GFP). Neural activities that normally activate the Arc gene then activated the GFP, leaving a fluorescent trace detectable by two-photon microscopy. This allowed the researchers to image neuronal activation patterns induced by visual experience, thus uncovering the Arc protein's role in orchestrating neurons' reactions to natural sensory stimuli.

The genetically engineered mice were let loose in an environment containing a cylinder covered with stripes of vertical or horizontal lines, and the proteins in their brains were monitored as the mice saw the cylinders daily.

Cheap ceramic thin-film radar

A thin magnetic ceramic film eliminates the need for circulators -- heavy and expensive devices integral to much radar technology. Traditionally, circulator designs have relied on large and heavy magnets that add significant cost to the technology.

Thousands of the magnets are required for the most advanced radar systems and, as a result, radar platforms can weigh several tons and take up an inordinate amount of space.

The new material, a millimeter-thick film of barium hexaferrite, possesses a spontaneous magnetic moment sufficient to eliminate the need for magnets. It is produced using a screen-printing process that meets all the necessary specifications for radar performance and is, in addition, highly cost-effective.

Quantum computer scenario from Fortune Magazine

Fortune Magazine offers a projection of life with more advanced computers in 2030. It claims to be projecting quantum computers but mixes in spintronics, brain-computer interfaces and other technologies.

A hatband computer serves as a communication center and intelligent assistant, scanning and sorting 500,000 e-mails overnight and then sending the results directly to the brain. Quantum-computer weather simulations forecast accurately out to five years.

Spintronics would be used for computation, not just memory. A team at the University of California at Santa Barbara, led by David Awschalom, has made big progress in this direction by controlling electron spins in semiconductors and other materials a few nanometers in size. In 2004, Dan Rugar of IBM performed what the American Institute of Physics dubbed the most important experiment of the year by using a magnet to control the spin of a single electron.

The article has forecasts about ubiquitous computing (computers everywhere), human level Artificial intelligence, and brain computer interfaces.

Transparent fiber optic grid cameras

From MIT Technology Review: light-sensing optical fibers could enable transparent cameras, uniforms that detect laser sightings, and screens that allow laser-pointer control of a computer.



A system using these fibers could lead to transparent cameras that need no lenses.


Each fiber is made of a semiconducting glass core, lined along its full length by wires that act as positive and negative electrodes, and surrounded by a transparent polymer. When light hits the photosensitive core, an electrical current in the fiber changes, registering the hit. A mesh of these fibers can then be used to identify the location of the light on a surface.

For direction sensing, the researchers formed a grid of fibers into a sphere. A light beam from a flashlight first hits one side of the sphere and the grid registers the location. The light then passes through the sphere and out the other side, where it is detected again. An integrated circuit then compares the entrance and exit points to calculate the path of the light.
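
A minimal sketch of that entrance/exit comparison, assuming the grid reports the two crossing points as 3-D coordinates on the sphere (the function name and coordinates here are illustrative, not from the actual system):

import numpy as np

def light_direction(entry_point, exit_point):
    # Estimate the direction of an incoming beam from the two points
    # where it crosses the spherical fiber grid (x, y, z on the surface).
    entry = np.asarray(entry_point, dtype=float)
    exit_ = np.asarray(exit_point, dtype=float)
    chord = exit_ - entry                 # straight-line path of the beam
    return chord / np.linalg.norm(chord)  # unit direction vector

# A beam entering the "front" of a unit sphere and leaving at the "back"
# is traveling along the +x axis:
print(light_direction((-1, 0, 0), (1, 0, 0)))   # -> [1. 0. 0.]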

The resolution of the images is limited by the need to space the fibers within the grid far enough apart that the first grid does not distort the image received by the second. The grids themselves also need to be separated, which could make the current system difficult to incorporate into some applications, such as on the skin of a car, where keeping the grids at a distance wouldn't be practical. But the researchers say work is currently being done that could overcome these limitations.

Nanomaterials and metamaterials could be applied to further enhance the capabilities once the technology is more refined.

July 27, 2006

London case study of widespread video monitoring

The main center of the City of London covers two and a half square kilometers. In 1998 a network of cameras was installed that provides comprehensive video coverage of a large part of the City. Every vehicle entering the area is photographed, its license plate checked against a national police database, and an image of its driver stored for posterity. Earlier this year, the New York City Police Department announced that it was installing more than 500 cameras around the city and pushing for its own ring of steel to protect lower Manhattan.

As part of the original system, they reduced traffic coming into the city and reduced the number of streets for entry. Unexpectedly, the introduction of cameras had a big effect on the environment. The traffic-channeling measures not only slowed traffic but also reduced the number of vehicles entering the area, substantially improving air quality. It also allowed city planners to turn many roads that were no longer accessible into pedestrian malls. The result: a more pleasant working environment for many Londoners.

Today, the accuracy of automatic license plate recognition approaches 100 percent for cars traveling at ordinary city speeds in a wide range of lighting conditions. One major challenge for surveillance officials is handling the data the London cameras produce. The system consists of over 200 cameras, each sending a 3.8-megabit-per-second MPEG video feed to the control room of a police station in the heart of the City. Processing this data in real time requires 122 IBM xSeries servers with a total storage capacity of 200 terabytes.
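
A back-of-the-envelope check on what those servers are handling (my arithmetic on the quoted figures, not the system's actual provisioning):

cameras = 200
feed_mbps = 3.8                           # MPEG feed per camera
total_mbps = cameras * feed_mbps          # aggregate into the control room
print(total_mbps, "Mbit/s")               # -> 760.0 Mbit/s

storage_tb = 200
bytes_per_day = total_mbps / 8 * 1e6 * 86400     # Mbit/s -> bytes per day
print(round(storage_tb * 1e12 / bytes_per_day))  # ~24 days of raw video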

Last year, the cameras recorded 38 million vehicle entries into the area. Of these, 91,000 were listed for infractions on the national computer; 4,161 warranted police action, leading to 539 arrests. Many serious crimes were uncovered as a result of stopping a vehicle for a minor violation. “It gives us a way in,” says Mellor. “With good police work, a traffic offense is just the beginning.”

For example, a black Porsche Cayenne was flagged by the computer last February 13 because the driver had not paid the car's leasing bills. The police stopped the vehicle, searched it, and found US $20,000 in the glove compartment, triggering a major money-laundering investigation.

One thing that hasn't been much of a public concern is privacy. The London terror attacks were not deterred by the cameras, but the system did accelerate the investigation.

Understanding experts, the expert mind and expertise

An interesting article at Scientific American examines how experts differ from novices.

To a beginner, a position with 20 chessmen on the board may contain far more than 20 chunks of information, because the pieces can be placed in so many configurations. A grandmaster, however, may see one part of the position as "fianchettoed bishop in the castled kingside," together with a "blockaded king's-Indian-style pawn chain," and thereby cram the entire position into perhaps five or six chunks. By measuring the time it takes to commit a new chunk to memory and the number of hours a player must study chess before reaching grandmaster strength, Simon estimated that a typical grandmaster has access to roughly 50,000 to 100,000 chunks of chess information. A grandmaster can retrieve any of these chunks from memory simply by looking at a chess position, in the same way that most native English speakers can recite the poem "Mary had a little lamb" after hearing just the first few words.

Experts can also easily tap long-term memory. All expertise theorists agree that it takes enormous effort to build these structures in the mind. Simon coined a psychological law of his own, the 10-year rule, which states that it takes approximately a decade of heavy labor to master any field. Even child prodigies, such as Gauss in mathematics, Mozart in music and Bobby Fischer in chess, must have made an equivalent effort, perhaps by starting earlier and working harder than others.

What matters is not experience per se but "effortful study," which entails continually tackling challenges that lie just beyond one's competence. That is why it is possible for enthusiasts to spend tens of thousands of hours playing chess or golf or a musical instrument without ever advancing beyond the amateur level and why a properly trained student can overtake them in a relatively short time.

Motivation appears to be a more important factor than innate ability in the development of expertise. It is no accident that in music, chess and sports--all domains in which expertise is defined by competitive performance rather than academic credentialing--professionalism has been emerging at ever younger ages, under the ministrations of increasingly dedicated parents and even extended families.

Furthermore, success builds on success, because each accomplishment can strengthen a child's motivation. A 1999 study of professional soccer players from several countries showed that they were much more likely than the general population to have been born at a time of year that would have dictated their enrollment in youth soccer leagues at ages older than the average. In their early years, these children would have enjoyed a substantial advantage in size and strength when playing soccer with their teammates. Because the larger, more agile children would get more opportunities to handle the ball, they would score more often, and their success at the game would motivate them to become even better.

US would not need to go nuclear in overwhelming response to terrorist nuke

Various places online speculate about an overwhelming nuclear response by the United States to a terrorist nuclear bombing of a US city.

This may not be necessary and may not be the response. Right up there in death toll with the Hiroshima (140,000 dead) and Nagasaki (70,000 dead) bombs was the two-day firebombing of Tokyo (100,000+ dead).

Modern conventional weapons have advanced with fuel-air explosives and other thermobaric weapons. There is also Napalm B, called super napalm, which burns 20 to 40 times longer than regular napalm.

Cluster bombs and large conventional bombs are also available in the arsenal.

If the medical, utility and transportation infrastructure of an enemy is destroyed using conventional weapons then disease and starvation would rack up a big toll.

Clearly this is not something that anyone would want. Therefore, terrorists would be miscalculating if they thought positive objectives would result from the nuking of a US or western city.

The thing to understand is that whatever death toll is desired, overwhelming conventional force can get you there if you have air superiority. Is there any question as to which country will have air superiority?

Against the Soviets, the response to nuclear attack was nuclear counterattack. The reason is that they had far more weapons available, so you needed a fast destructive response to try to limit the damage being inflicted upon you. If a terrorist enemy has mostly expended its arsenal in the first wave, then you can take your time, over days, weeks and months, extracting your price with conventional weapons.

A better Ion space propulsion engine

NASA is making a more powerful ion engine (NEXT). NEXT can generate a force of 236 millinewtons, compared to NSTAR's maximum of 92 mN. This corresponds to 6.9 kilowatts of engine power, compared to NSTAR's 2.3 kilowatts. A mission to Titan would take 20 kilowatts of engine power; an array of three NEXT engines plus a spare would be used. The engine was built by Aerojet, an aerospace company based in Sacramento, California, US.
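
A quick comparison using the quoted figures (simple arithmetic on the numbers above, nothing more):

next_thrust_mn, next_power_kw = 236, 6.9     # NEXT: thrust (mN), power (kW)
nstar_thrust_mn, nstar_power_kw = 92, 2.3    # NSTAR: thrust (mN), power (kW)

print(next_thrust_mn / nstar_thrust_mn)      # ~2.6x the thrust
print(next_thrust_mn / next_power_kw)        # ~34 mN per kW for NEXT
print(nstar_thrust_mn / nstar_power_kw)      # ~40 mN per kW for NSTAR
print(3 * next_power_kw)                     # ~20.7 kW: three NEXT engines
                                             # cover the Titan mission's 20 kW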

related articles:
Past article on space technology, including the DS4G European ion engine project

Further reading:
A second-phase NASA Institute for Advanced Concepts study is further developing the idea of using Scalable Flat-Panel Nanoparticle MEMS/NEMS Propulsion Technology. It is very efficient across a broad range of specific impulses (100-10,000 seconds).

July 26, 2006

Fiber Optic array for finding Kuiper belt objects

A fiber optic telescope array is being used to look for Kuiper belt objects, and there might be 5 to 10 times the 100 billion objects previously estimated. The researchers looked for split-second 'winking', or darkening, of stars, which suggests a Kuiper belt object is passing in front of, or occulting, the star. They found evidence of many objects ranging in size from 300 metres to one kilometre across using the 6dF instrument on the UK Schmidt telescope at Siding Spring.

The 6dF, which uses fibre optics, monitored 100 stars simultaneously over two weeks, the equivalent of 7000 star-hours, or watching a single star every night for 3 years.
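
The star-hours arithmetic roughly checks out if you assume a few hours of useful observing per star per night (my assumption; the article doesn't state the nightly window):

star_hours = 7000                      # quoted total
stars, nights = 100, 14                # 100 fibres over two weeks
print(star_hours / (stars * nights))   # -> 5.0 hours per star per night

# The same total as one star watched nightly for about 3 years:
print(star_hours / 6.4 / 365)          # ~3 years at ~6.4 hours a night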

"We've got 100 fibres, each one of which is positioned on a star and then we feed the fibres into a high speed camera," he says. They saw at least 100 very definite occultations and 1000 less significant events.

The Kuiper belt community has greeted the news with some scepticism. Some critics say that the apparent dimming of the stars may be due to effects in the Earth's atmosphere. About 1000 large bodies, including Pluto and the recently discovered Xena, have been located in the Kuiper belt so far. Smaller objects have evaded detection, as they are about 15 billion kilometres from the Sun, making it impossible to see them even with a powerful instrument like the Hubble Space Telescope.


Ashley says the scientists took pains to rule out other possible causes for dips in stars, including moths in the telescope.

Other tech: superconducting 2G wire rolling out, 140 amps over 300+ feet

The breakthrough announced yesterday, American Superconductor Corp said, is that 2G wire in lengths in excess of 300 feet can now conduct 140 amperes of electric current, which is over the commercial threshold of 120 amps. It is the first-ever commercial-grade 2G wire produced by a high-volume, low-cost, scalable industrial process, and it meets the standards for integration into commercial power grids. The company's stock was up 25%, so investors think this will have an impact.

To put the size and power of 2G wire into perspective, the company said that more than 100 copper wires of the same dimension would be needed to conduct as much current as one 2G wire, and that in a high-voltage power system, one 2G wire, which is 4 millimeters wide and about as thick as two human hairs, would be able to carry enough power to serve 1,000 homes.
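
The 1,000-homes figure is plausible on a back-of-the-envelope check (the line voltage and per-home load below are my assumptions, not the company's):

current_a = 140              # demonstrated 2G wire current
voltage_v = 13800            # assumed medium-voltage distribution line
power_w = current_a * voltage_v
print(power_w / 1e6)         # ~1.9 MW

avg_home_w = 2000            # assumed average household draw
print(power_w / avg_home_w)  # ~966 homes, i.e. roughly 1,000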

The company said it is on track to meet its previously announced benchmark of shipping 10,000 meters of 2G wire this fiscal year. “As production grows, we expect we will be able to lower the unit cost, while at the same time we expect to continue to increase the performance of the 2G wire,” Mr. Yurek said in an e-mail message. The company expects that by the end of the decade, the price-performance ratio of 2G wire will be equivalent to that of copper wire, he said.

The Devens plant is scaling up to its December 2007 operational target of 720,000 meters, or about 2.4 million feet, he said.

Atom Camera proposed

Better tools and sensing at the nanoscale help enable progress toward molecular manufacturing.

A newly devised nozzle fitted with a pinhole-sized capillary has allowed the scientists to scatter helium atoms, which have X-ray-like wavelengths, off randomly shaped surfaces. The researchers say the technique could power development of a new microscope for nanotechnology, allowing a non-invasive, high-resolution approach to studying both organic and inorganic materials.

Physics Professor Stephen Kevan said all that is needed is a camera-like detector, which is now being pursued, to quickly capture images that offer nanometer resolution.

Kevan, the study's principal investigator, said if the project is successful, the approach would build on advances already achieved with emerging X-ray-diffraction techniques.

Follow up: Printing UAVs, 3D printing industry

Open the Future and CRNano have pointed out the leading edge of the shift in military capability resulting from cheap yet capable UAVs. This site has also noted the importance of rapid prototyping (RP) techniques being used to build capable end products. The new application is called Rapid Manufacturing.

From the 2006 Wohlers Associates RP industry report:
- Rapid Manufacturing is 9.6% of the activity of Rapid Prototyping.
- 42% of installed systems are in the USA, 29.6% in Asia, 25.5% in Europe and 2.6% elsewhere.
- 5,254 machines were installed as of the end of 2005, with growth to 15,000 expected by 2010.
- Rapid Manufacturing is still inferior to established processes such as blow molding, die casting, injection molding, sand casting and investment casting.
- Rapid Manufacturing is an additive process and does not require tooling.

Rapid Manufacturing can become more useful through low-cost integration with processes that provide more flexibility, and by increasing the range of materials.

Something to take note of: almost all of the leading work and almost all of the leading capabilities in the enabling technologies are in the US or with its close allies, China and Russia being the main exceptions.

Best aeronautics - USA, Europe, Russia
Printer technology - USA (HP in particular)
Advanced lasers - USA, Europe, Russia
Making planes mostly from composites - USA (Boeing; this is not trivial, notice the delays and problems that occur here), Europe, Russia

The 3D printing industry is covered at Castle Island. The leaders are mostly US companies, but there are also leaders in Europe and Israel, and some companies in Japan, Korea, China and Singapore.

A table that compares the different technologies and companies is here

3D printing (developed at MIT), laser sintering, laser engineered net shaping

Current limitations and performance of RP are summarized

Having more capital and a technological lead that is probably widening still matters.

Everyone has guns and bombs, but those with better guns, gear and precision bombs win. Everyone will have UAVs, but they will not be equal in quality or quantity.


Related reading:
Reprap (Replicating Rapid-Prototyper) information

Z Corp, one of the leading RP companies

German laser sintering company

Another list of RP companies and associations and other links

RP industry reports from Wohlers Associates (registration required)

Rapid Manufacturing Research Group, UK

July 25, 2006

Nanofactory collaboration website

The nanofactory is a proposed compact molecular manufacturing system, possibly small enough to sit on a desktop, that could build a diverse selection of large-scale molecularly precise diamondoid products. The nanofactory is potentially a high quality, extremely low cost, and very flexible manufacturing system.

The long-term goal of the Nanofactory Collaboration is to design, and ultimately to build, a working diamondoid nanofactory.

The most important new information on the site is a draft list of the remaining challenges to creating a nanofactory.


Introduction to the nanofactory concepts

The list of participants

A description of Diamond Mechanosynthesis

Publications of participants in the collaboration.

A Code Beyond Genetics in DNA

The genetic code specifies all the proteins that a cell makes. The second code, superimposed on the first, sets the placement of the nucleosomes, miniature protein spools around which the DNA is looped. The spools both protect and control access to the DNA itself. There are about 30 million nucleosomes in each human cell. So many are needed because the DNA strand wraps around each one only 1.65 times, in a twist containing 147 of its units, and the DNA molecule in a single chromosome can be up to 225 million units in length.
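
The "so many are needed" arithmetic, using the figures above plus an assumed ~6 billion base pair diploid human genome (the genome size is my addition, not from the article):

nucleosomes = 30e6            # per human cell (quoted)
bp_per_spool = 147            # DNA units wrapped per nucleosome (quoted)
wrapped_bp = nucleosomes * bp_per_spool
print(wrapped_bp / 1e9)       # ~4.4 billion units held on spools

genome_bp = 6e9               # assumed diploid genome size
print(wrapped_bp / genome_bp) # ~0.74: roughly three-quarters of the DNA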

Follow-up on AI/transhumanism

Velcro City posted about the original AGI/loads-of-abundance article.

I had cross-posted it to Betterhumans.

One commenter talked some about AI emotions and human poverty.

My response:
I do not consider AI emotions at all.

The AGI with any potential for danger is one that can rapidly improve its own technology and is doing so. My assumption is that in order to be more effective at this task, the AGI must be very good at math, resource allocation, all sciences, the scientific method and cost-benefit analysis.

If it outclasses all people in these abilities, then I would submit that it would be trivial for the AGI to create better access to space for itself. It could create a means to tap all of the Sun's energy and the materials of the asteroid belts, etc. It creates supernanotech and other tech for a Dyson shell of energy collectors.

Killing us for everything on Earth and everything that we have built or for the space we take up seems to be a waste of time and effort. Good/Evil/whatever. It is just a waste of time. It is like killing a toddler for its sand castle and access to its sand box. Sure you could do it easily, but why? If you want sand you can get it. The sand castle is useless to you because you can make something better.

Poverty is a human problem. I do not make any assumptions about whether AGI would help us on that. I would think that we should try to build better tech and create our own abundance and take care of it ourselves.

Rescaling the relative value of things.

When I talk about abundance, it is not about the "price of things", where we have some kind of boomtown and, even though you make a bunch of money, they start charging you more so you cannot really buy more. I am saying that if you have the tech, you can tap all of the power of the Sun. Then the Saudi oil fields are like a thimble of energy.

=====
A new comment of mine. Not choosing to upgrade in the early days is like not using the best technology, computers and software. Then it is like not going to graduate school. Then like not going to college. Then like not going to high school. Then like not going to grade school. The pool of jobs available to you shrinks as your skills and capability lag. It would be more extreme than the choice to be unemployed and uneducated. If the upgraded human becomes the baseline and average choice, then the choice to fall behind becomes also the choice to become extremely handicapped. Someone who is 10-100 times weaker than average (like say the current very elderly) gets the blue parking pass. It would be a choice, just a very bad and stupid choice.

China Superconducting tokamak fusion device starting Aug 15

Hot fusion may not be the best way to solve our energy problems (in terms of cost or the timeliness of making a difference). Making solar cheaper and better and creating space-based solar power should be better. But with enough breakthroughs, advanced fusion power could become very important.

The Institute of Plasma Physics at Hefei, under the Chinese Academy of Sciences (CAS), has successfully completed the first tests of the Experimental Advanced Superconducting Tokamak (EAST) fusion experiment. The final assembly of the device is complete, and it is now undergoing vacuum, cooling, and electrical charging tests. The first plasma discharge is scheduled for August 15.

EAST started overall assembly in 2003, and was developed as an upgrade from HT-7, China's first superconducting tokamak, completed in 1990. The budget for the device was just 300 million yuan ($37 million), a small fraction of the multi-hundred-million-dollar price tag of similar devices being developed elsewhere.

If the device succeeds, China will become the first country to build and successfully demonstrate a superconducting tokamak fusion device. The goals for EAST include exploring and demonstrating steady-state operation of a tokamak and generating plasma currents of 1 MA. With a capability for pulse times as long as 1,000 seconds, the device will also be used to investigate particle and heat flux handling on a time scale much longer than the wall equilibration time.

other tech: Fast military cargo ship holds 17 times the cargo of a C-17

The Navy High-Speed Vessel Swift (HSV 2) has a 28,000-square-foot mission deck, the ability to traverse littoral (shallow) waters, speed in excess of 40 knots, maneuverability that doesn't require tugboat assistance when arriving at or departing the pier, and cargo space 17 times that of a C-17. A sustained 40+ knot speed would mean an Atlantic crossing in less than three days, possibly 2 to 2.5 days. If a ship or airship could sustain 50 to 60 knots across the Atlantic, then we could have cheap second-day delivery of packages. The specs are here; the ship might not be able to sustain 40+ knots for an entire Atlantic crossing.
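
The crossing-time claim is easy to check; a New York to England great-circle run is roughly 3,000 nautical miles (the distance is my assumption):

distance_nm = 3000                    # assumed transatlantic distance
for knots in (40, 50, 60):
    days = distance_nm / knots / 24   # nautical miles / knots = hours
    print(knots, "kt:", round(days, 1), "days")
# 40 kt: ~3.1 days; 50 kt: ~2.5 days; 60 kt: ~2.1 days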

July 24, 2006

from IEEE Spectrum: Metcalfe's Law is Wrong

It is important to know that Metcalfe's Law is wrong. Telecommunications is a huge business, as are social networking sites and other internet systems. Proper valuation of a network is important for making correct business decisions. When you get it wrong, you end up with a dot bomb.

Metcalfe's Law says that the value of a communications network is proportional to the square of the number of its users. IEEE Spectrum explains why it is wrong. In March 2005, Andrew Odlyzko and Benjamin Tilly published a preliminary paper which concludes that Metcalfe's Law significantly overestimates the value of adding connections. The rule of thumb becomes: "the value of a network with n members is not n squared, but rather n times the logarithm of n." Their primary justification is the idea that not all potential connections in a network are equally valuable. For example, most people call their families a great deal more often than they call strangers in other countries, and so do not derive the full value n from the phone service.

Metcalfe's original point (from a 35mm slide circa 1980) was to establish the existence of a cost-value crossover point—critical mass—before which networks don't pay. The trick is to get past that point, to establish critical mass.

David P Reed proposed that the value of networks that allow the formation of groups, such as AOL's chat rooms or Yahoo's mailing lists, grows proportionally with 2**n.

There are common-sense arguments that suggest Metcalfe's and Reed's laws are incorrect. For example, Reed's Law says that every new person on a network doubles its value. Adding 10 people, by this reasoning, increases its value a thousandfold (2**10). But that does not even remotely fit our general expectations of network values—a network with 50,010 people can't possibly be worth a thousand times as much as a network with 50,000 people.
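
Making the ratios concrete (a quick comparison of the three rules; only the ratios matter, the absolute values are meaningless):

import math

n1, n2 = 50000, 50010                              # add just ten members
print(n2**2 / n1**2)                               # Metcalfe: ~1.0004
print((n2 * math.log(n2)) / (n1 * math.log(n1)))   # n log n: ~1.0002
print(2 ** (n2 - n1))                              # Reed: 2**10 = 1024x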

The n log n rule is related to Zipf's Law and the long tail.

To understand how Zipf's Law leads to the log(n) law, consider the relative value of a network near and dear to you—the members of your e-mail list. Obeying, as they usually do, Zipf's Law, the members of such networks can be ranked in the same sort of way that Zipf ranked words—by the number of e-mail messages that are in your in-box. Each person's e-mails will contribute 1/k to the total "value" of your in-box, where k is the person's rank.
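
That per-person total is just the harmonic series, which grows like the natural logarithm of n; with each of n members deriving value proportional to log n, the whole network is worth on the order of n log n. A quick check:

import math

def inbox_value(n):
    # Total "value" of n contacts ranked by e-mail volume,
    # each contributing 1/k per Zipf's Law.
    return sum(1 / k for k in range(1, n + 1))

for n in (10, 100, 1000, 10000):
    print(n, round(inbox_value(n), 2), round(math.log(n), 2))
# The harmonic sum tracks ln(n) plus a constant (~0.577, Euler's gamma).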

Biometric authentication systems for credit cards could put identity thieves out of business

This is a survey from IEEE Spectrum about identity theft and the new biometric technologies that should be used to reduce the problem.

According to data from the Aberdeen Group, Boston, the cumulative ID theft losses suffered by tens of millions of individuals and businesses worldwide registered at an estimated $221 billion in 2003. Aberdeen, which assumed an enormous 300 percent compound annual growth rate, projected that losses would rise to an almost unfathomable $2 trillion in 2005. More recent numbers from Javelin Strategy and Research, based in Pleasanton, Calif., indicate a much lower growth rate, at least in the United States, where total losses rose from about $48 billion in 2003 to $56.6 billion in 2005.
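
Backing out the implied growth rates shows how far apart the two estimates are (simple arithmetic on the quoted figures; note that Aberdeen's "300 percent growth" is read here as tripling each year, which is what matches its roughly $2 trillion projection):

javelin_cagr = (56.6 / 48) ** (1 / 2) - 1   # $48B (2003) -> $56.6B (2005)
print(round(javelin_cagr * 100, 1))          # ~8.6% per year

print(221 * 3 ** 2)                          # Aberdeen: $221B tripling for
                                             # two years -> ~$1,989B (~$2T)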

Clearly, it is far too easy to steal personal information these days—especially credit card numbers, which are involved in more than 67 percent of identity thefts, according to a U.S. Federal Trade Commission study.

The sensors, processors, and software needed to make secure credit cards that authenticate users on the basis of their physical, or biometric, attributes are already on the market. But so far, the credit card industry hasn't seen fit to integrate even basic fingerprint-sensing technology with its enormous IT systems. Concerned about biometric system performance, customer acceptance, and the cost of making changes to their existing infrastructure, the credit card issuers apparently would rather go on eating an expense equal to the 0.25 percent of Internet transaction revenues and the 0.08 percent of off-line revenues that are now lost to stolen credit card numbers.

Even if thieves fashioned a latex glove molded in a slab of gelatin containing a nearly flawless print of your right index finger, painstakingly transferred from a cocktail glass, the effort would fail, thanks to new applications that test the vitality of the biometric signal. One identifies sweat pores, which are just 0.1 millimeter across, in the ridges using high-resolution fingerprint sensors. Spoofs could also be detected by measuring the conduction properties of the finger using electric field sensors from AuthenTec Inc., of Melbourne, Fla.

Software-based spoof detectors aren't far behind. One of the article's authors (Jain) is currently leading an effort at Michigan State University, in East Lansing, in which researchers are differentiating the way a live finger deforms the surface of a sensor from the way a dummy finger does. With software that applies the deformation parameters to live scans, they can automatically distinguish between a real and a dummy finger 85 percent of the time—enough to make your average identity thief think twice before fashioning a fake finger.

Biometric authentication systems based on available technology would be a major improvement over conventional authentication techniques. If widely implemented, such systems could put thousands of ID thieves out of business and spare countless individuals the nightmare of trying to get their good names and credit back. Though the technology to implement these systems already exists, ongoing research efforts aimed at improving the performance of biometric systems in general and sensors in particular will make them even more reliable, robust, and convenient.

singularity related: Protein-Nanoparticle Material Mimics Human Brain Tissue

A composite material consisting of a horse protein and metallic nanoparticles displays magnetic properties very similar to those of human brain tissue, scientists have found. The work, published in the June 20 online edition of Physical Review B, may help lead to a more thorough understanding of the magnetic behavior of brain tissue and other complex natural materials. Better understanding of the brain may lead to better Artificial Intelligence. The technological singularity is about creating intelligence greater than human to start an explosive rate of technological progress.

nanoparticles: Fullerenes make MRI imaging agent 40 times better

Fullerenes are being used to make a better, more stable MRI imaging agent. Investigators from the National Cancer Institute's Cancer Nanotechnology Platform Partnership at Virginia Commonwealth University have developed a new imaging agent that is 40 times more potent at boosting magnetic resonance imaging (MRI) signals than agents currently approved for human clinical use.

New scanning mass spectrometry nanoprobe created for better cell analysis

"Nano probe may open new window into cell behavior," at Physorg and EurekAlert. Georgia Tech researchers have created a nanoscale probe, the Scanning Mass Spectrometry (SMS) probe, that can capture both the biochemical makeup and topography of complex biological objects in their normal environment -- opening the door for the discovery of new biomarkers and improved gene studies, leading to better disease diagnosis and drug design at the cellular level.



Georgia Tech's SMS Probe gently pulls biomolecules precisely at a specific point on the cell/tissue surface, ionizes these biomolecules and produces "dry" ions suitable for analysis and then transports those ions to the mass spectrometer.