
November 02, 2007

Ultrasensitive prototype device approaches gold standard for magnetic field detection

A tiny sensor that can detect magnetic field changes as small as 70 femtoteslas, equivalent to the brain waves of a person daydreaming, has been demonstrated at the National Institute of Standards and Technology (NIST). The sensor could be battery-operated and could reduce the costs of non-invasive biomagnetic measurements such as fetal heart monitoring. The device also may have applications such as homeland security screening for explosives.

Described in the November issue of Nature Photonics,* the prototype device is almost 1000 times more sensitive than NIST's original chip-scale magnetometer demonstrated in 2004 and is based on a different operating principle. Its performance puts it within reach of matching the current gold standard for magnetic sensors, so-called superconducting quantum interference devices, or SQUIDs. These devices can sense changes in the 3- to 40-femtotesla range but must be cooled to very low (cryogenic) temperatures, making them much larger, more power-hungry, and more expensive.

The NIST prototype consists of a single low-power (milliwatt) infrared laser and a rice-grain-sized container with dimensions of 3 by 2 by 1 millimeters. The container holds about 100 billion rubidium atoms in gas form. As the laser beam passes through the atomic vapor, scientists measure the transmitted optical power while varying the strength of a magnetic field applied perpendicular to the beam. The amount of laser light absorbed by the atoms varies predictably with the magnetic field, providing a reference scale for measuring the field. The stronger the magnetic field, the more light is absorbed.
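In other words, the measurement reduces to inverting a calibration curve of transmitted power versus applied field. A minimal sketch of that step, with entirely made-up numbers (not NIST's data or analysis):

```python
import numpy as np

# Hypothetical calibration curve (illustrative values only, not NIST's):
# applied magnetic field in picoteslas vs. transmitted optical power in
# microwatts. Stronger field -> more absorption -> less transmitted power.
field_pT = np.linspace(0.0, 100.0, 11)
power_uW = 50.0 - 0.2 * field_pT

def field_from_power(measured_uW):
    """Read an unknown field off the calibration curve by interpolation."""
    order = np.argsort(power_uW)  # np.interp needs increasing x values
    return float(np.interp(measured_uW, power_uW[order], field_pT[order]))

print(field_from_power(43.0))  # -> 35.0 pT for this made-up linear curve
```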

The new NIST mini-sensor could reduce the equipment size and costs associated with some non-invasive biomedical tests. (The body's electrical signals that make the heart contract or brain cells fire also simultaneously generate a magnetic field.) The NIST group and collaborators have used a modified version of the original sensor to detect magnetic signals from a mouse heart.** The new sensor is already powerful enough for fetal heart monitoring; with further work, the sensitivity can likely be improved to a level in the 10 femtotesla range, sufficient for additional applications such as measuring brain activity, the designers say.

To make a complete portable magnetometer, the laser and vapor cell would need to be packaged with miniature optics and a light detector. The vapor cell can be fabricated and assembled on semiconductor wafers using existing techniques for making microelectronics and microelectromechanical systems (MEMS). This design, adapted from a previously developed NIST chip-scale atomic clock, offers the potential for low-cost mass production.


New Scientist also has coverage

The new sensor is not just much smaller than a SQUID; it also operates at much higher temperatures, around 150 °C. Currently the complete device is a few millimetres on each side. "The small size and high performance of this sensor will open doors to applications that we could previously only dream of," says NIST physicist John Kitching.

Kitching and colleagues made the new magnetometers through photolithography, the same process used to make computer chips. "You can make very large numbers of the devices in parallel on a single wafer [of silicon]," Kitching says. "That will reduce the cost."


FURTHER READING
The new devices are cheaper, easier-to-operate alternatives to SQUIDs. SQUIDs have been used for brain imaging.


Most of the magnetoencephalography (MEG) helmet array bulk is for cooling

MEG has been in development since the 1960s but has been greatly aided by recent advances in computing algorithms and hardware. It promises good spatial resolution and extremely high temporal resolution (better than 1 ms); since MEG measures the magnetic fields produced directly by the activity of the neurons themselves, its temporal resolution is comparable with that of intracranial electrodes. MEG's strengths complement those of other brain activity measurement techniques such as electroencephalography (EEG), positron emission tomography (PET), and fMRI, whose strengths in turn complement MEG. Other important strengths of MEG are that its biosignals do not depend on head geometry as much as EEG does (unless ferromagnetic implants are present) and that it is completely non-invasive, as opposed to PET and possibly MRI/fMRI.


OTHER RELATED
There are cheap brain wave readers (mind-machine interface devices) that cost $200-600, use electroencephalography (EEG, detecting the electrical activity of the brain) or electromyography (EMG) sensors, and work with the Xbox and PS3.

Climate change bill has passed a Senate panel

UPDATE CORRECTION: Thanks to reader Kurt9, who pointed out that I had been too hasty in reading the Forbes article. The bill has only passed a panel and has not been passed by the Senate. Current handicapping is that it will not get 60 votes in the full Senate.

The UK Guardian has their take on it


The Lieberman/Warner climate change bill passed a Senate panel today. This is the one which, if passed, could triple nuclear and renewables by 2030. This is good, and hopefully it can pass the full Senate and House. Bush might not veto it because it has business support.

Here is my past coverage on the bill



My coverage of the similar Lieberman/McCain climate bill which had EIA analysis


The climate change bill would massively boost nuclear and renewables and reduce coal usage

Transitioning from oil

What if we had to transition from oil in a hurry? The goal would be to get to electrified transportation, some biofuels, and an energy-efficient economy.

So the transition would be: conservation; drilling in ANWR and elsewhere; more oil from the oil sands and shale; temporarily ramping up biofuels and alternatives; ramping up nuclear, wind, and solar; converting to more efficient cars, trucks, rail, planes, ships, and industry; and shifting from liquid fuels to electrification. Shift more freight from trucks to barges and rail.

Conservation measures can be initiated and strengthened when necessary. The 55 mph speed limit saved about 380,000 gallons of fuel per day in the USA back in the 1970s.

Transportation fuel use is now around double the 1970s level, so a 55 mph speed limit in the USA today would save about 750,000 gallons per day.

Delaware has emergency fuel shortage measures



Carpooling, transit, odd-even rationing, and other measures can reduce fuel usage by 8-15% right away, and several of them can be sustained without harming the economy. A mid-term transition would be to require and set up satellite offices and Wi-Fi-equipped buses and trains (so that people could be productive while travelling on transit).

Florida has general energy saving tips

If peak oil hit, ANWR would be drilled. It is basically an emergency fuel reserve of perhaps 10.3 billion barrels (mean estimate). It would take a few years to bring online and would then supply about 1.4 million barrels per day (roughly 20 years of supply at that rate).

More nuclear power plants are being licensed now in the USA and around the world. The past peak was 12 reactors completed in one year in the USA, and the US economy is over twice as large now. Full (non-emergency) production of nuclear plants in the USA would be twenty-four 1.5 GW reactors completed per year, and that level could be reached by 2020. Business-as-usual production could be far higher if a climate change bill is passed, which would make coal more expensive. Legislation is already in place to support about 32 new nuclear reactors being built by 2024.


EIA projection based on climate change bill passage

Diesel and truck engine efficiency work is well under way, and could double diesel and truck efficiency over a 2010-2020 rollout period.

http://www1.eere.energy.gov/vehiclesandfuels/resources/proceedings/2007_deer_presentations.html
http://www1.eere.energy.gov/vehiclesandfuels/resources/proceedings/2006_deer_presentations.html
http://www.theautochannel.com/news/2005/08/31/141727.html

Electrification of vehicles
There are 60 million electric scooters and bicycles in China already. By 2015 there could be 500 million in China. This would be the faster and easier route for the rest of the world as well. The vehicles can go at 55 mph (72-volt versions), and there could be GPS enforcement of 55 mph limits. Folding electric bikes are compatible with transit. A major conversion to these vehicles would be possible by 2015 if needed.

Alternative fuels will help as well

In the US, total biodiesel production shot up from 25 million gallons in 2004 to 250 million gallons last year. Worldwide biofuel production is at 51 billion liters (about 13 billion gallons) in 2007. US production is projected at 2.5 to 3.5 billion gallons by 2010 and 7.5 billion gallons of biofuels per year by 2012.
http://money.cnn.com/2007/09/25/technology/biodieselboom.biz2/

Thermoelectrics will start to be significantly rolled out by 2010



Superconducting motors in 2010 for industrial efficiency

Superconducting power grids would save up to 5% of the electricity in the USA; about 8% of US electricity is currently lost in transmission.

If it were required, a draconian conservation scheme could be instituted for 5 years to reduce fuel usage by 50% or more (WW2-style conservation and rationing), followed by a mobilized effort to convert to electrification. After that, the economy would be completely sustainable and able to grow again.

November 01, 2007

High endurance and enhanced longevity mouse created

One of my readers, Mav, pointed out that physorg.com has announced: Case Western Reserve University researchers have bred a line of “mighty mice” (PEPCK-Cmus mice) that have the capability of running five to six kilometers at a speed of 20 meters per minute on a treadmill for up to six hours before stopping.

This announcement is related to myostatin-blocking drugs, which are four times more effective at building muscle than high doses of steroids.

“They are metabolically similar to Lance Armstrong biking up the Pyrenees; they utilize mainly fatty acids for energy and produce very little lactic acid,” said Richard W. Hanson, the Leonard and Jean Skeggs Professor of Biochemistry at Case Western Reserve and the senior author of the cover article that appeared in the Journal of Biological Chemistry, entitled “Over Expression of the Cytosolic Form of Phosphoenolpyruvate Carboxykinase (GTP) in Skeletal Muscle Repatterns Energy Metabolism in the Mouse.”

These genetically engineered mice also eat 60 percent more than controls, but remain fitter and trimmer, and live and breed longer than wild mice in a control group. Some female PEPCK-Cmus mice have had offspring at 2.5 years of age, an amazing feat considering most mice do not reproduce after they are one year old. According to Hanson, the key to this remarkable alteration in energy metabolism is the over-expression of the gene for the enzyme phosphoenolpyruvate carboxykinase (PEPCK-C).

As part of this study, the researchers determined oxygen consumption, the production of carbon dioxide and changes in the lactate concentrations in the blood of the PEPCK-Cmus mice and controls during strenuous exercises on a treadmill, which was set at a 25-degree incline. The treadmill speed was increased by 2m/min every minute until the mice stopped running. The PEPCK-Cmus mice ran an average of 31.9 minutes, compared to 19 minutes for the control animals.

This new mouse line also has an increased content of mitochondria and high concentrations of triglycerides in their skeletal muscles, which also contributed to the increased metabolic rate and longevity of the animals.

Scanning tunneling microscope made 100 to 1000 times faster

From physorg.com, using an existing technique in a novel way, Cornell physicist Keith Schwab and colleagues at Cornell and Boston University have made the scanning tunneling microscope (STM) -- which can image individual atoms on a surface -- at least 100 times faster.


The simple adaptation, based on a method of measurement currently used in nano-electronics, could also give STMs significant new capabilities -- including the ability to sense temperatures in spots as small as a single atom.

But while current can change in a nanosecond, measurements with the STM are painfully slow. And the limiting factor is not in the signal itself: It's in the basic electronics involved in analyzing it. A theoretical STM could collect data as fast as electrons can tunnel -- at a rate of one gigahertz, or 1 billion cycles per second of bandwidth. But a typical STM is slowed down by the capacitance, or energy storage, in the cables that make up its readout circuitry -- to about one kilohertz (1,000 cycles per second) or less.

Researchers have tried a variety of complex remedies. But in the end, said Schwab, an associate professor of physics at Cornell, the solution was surprisingly simple. By adding an external source of radio frequency (RF) waves and sending a wave into the STM through a simple network, the researchers showed that it's possible to detect the resistance at the tunneling junction -- and hence the distance between the probe and sample surface -- based on the characteristics of the wave that reflects back to the source.

The technique, called reflectometry, uses the standard cables as paths for high-frequency waves, which aren't slowed down by the cables' capacitance.
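The underlying idea can be captured in a toy transmission-line calculation (illustrative only, not the Cornell electronics): a wave traveling down a cable of characteristic impedance Z0 reflects off the load at the end with a coefficient that depends on the load impedance, so changes in the tunneling resistance show up directly in the reflected amplitude.

```python
# Toy reflectometry calculation (a sketch, not the Cornell setup): the
# fraction of an RF wave reflected by a load of impedance Z at the end
# of a 50-ohm line is gamma = (Z - Z0) / (Z + Z0). As the STM tip moves
# and the tunneling resistance changes, |gamma| changes -- and the
# reflected wave is read out at RF speed, bypassing the slow cabling.
Z0 = 50.0  # characteristic impedance of the readout cable, ohms

def reflection_coefficient(z_load):
    return (z_load - Z0) / (z_load + Z0)

for r_tunnel in (1e4, 1e5, 1e6, 1e7):  # example tunneling resistances, ohms
    print(f"R = {r_tunnel:10.0e} ohm -> gamma = {reflection_coefficient(r_tunnel):.6f}")
```

In practice a matching network is needed so that megohm-scale junction resistances produce a usable change in the reflection; the sketch only shows why resistance maps to reflected amplitude.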

"There are six orders of magnitude between the fundamental limit in frequency and where people are operating," said Schwab. With the RF adaptation, speeds increase by a factor of between 100 and 1,000. "Our hope is that we can produce more or less video images, as opposed to a scan that takes forever."

The setup also offers potential for atomic resolution thermometry -- precise measurements of temperature at any particular atom on a surface -- and for motion detection so sensitive it could measure movement of a distance 30,000 times smaller than the size of an atom.

October 31, 2007

Superlative forecasters exist, and how to scout for good forecasters

Richard Jones of softmachines.org commented on how forecasting is unreliable because of bad forecasts. He cites a prediction by Glenn Seaborg, then chair of the Atomic Energy Commission, who predicted in 1971 that nuclear power in the USA would generate 2,100 billion kWh in 2000. The actual figure was 780 billion kWh. It was a relatively linear prediction based upon the buildout rate in 1971 and the projection of known license applications out to 1976. This was part of a discussion in which Richard tries to show that futurism cannot be reliable and that superlative technology predictions should be abandoned. I completely disagree with him.


Of course, in 1971 there was only about 50 billion kWh of nuclear generation. So the right prediction would have been roughly 15-fold growth (about 1,500%) over those 29 years rather than the 42-fold growth (about 4,100%) that Seaborg projected. In annualized terms, that is about a 10% growth rate instead of 13.75%.
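Those annualized rates follow from the standard compound-growth formula; a quick check:

```python
# Compound annual growth rate implied by the nuclear numbers above,
# taking 1971 -> 2000 as 29 years.
def cagr(start, end, years):
    return (end / start) ** (1.0 / years) - 1.0

print(cagr(50.0, 780.0, 29))   # actual outcome:     ~0.099 (about 10%/year)
print(cagr(50.0, 2100.0, 29))  # Seaborg's forecast: ~0.138 (about 13.75%/year)
```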

Most forecasters are not very good. However, it is a relative thing, like batting in baseball: hit 0.400 or higher and you are a hall of famer; hit 0.150 or less and you do not make the major leagues. And then there is the quality of the swings.

The sport of spotting seemingly high-profile bad predictions is a mostly useless endeavor. It is like a scout picking someone for the Yankees because they played well in the government civil service leagues, and then people marveling at the inadequate performance. "I don't understand, man; Glenn Seaborg led the Atomic Energy Commission league. He was following in the grand tradition of expert forecasters like... hmm, actually no one from the Atomic Energy Commission has a good track record of forecasting. The Census Bureau does all right, but its predictions amount to saying that most people alive will stay alive and get older while they are alive, with birth and immigration rates somewhat correlated with those of preceding years. Man, I thought Glenn had the stuff to be a good predictor of energy markets. He had that long history of being a politician and a bureaucrat. What a disappointment. I am shocked, shocked, that his prediction was not better."

The vast majority of the impact is from those who are very good.

Plus, the better predictions come from those who stand to make or lose money based upon the accuracy of their predictions. What were the best commodities traders predicting?


Billionaire Jim Rogers, a legendary commodities trader, picked the 1999 bottom that started the current commodities bull market. With George Soros, Rogers co-founded the Quantum Fund in 1970. I had bought his 2004 book and knew that he had a good record on commodities. He identified the impact of the rise of China on commodities well in advance.

A Google search turns up a lot of discussion about Jim Rogers picking the 1999 bottom

Jim Rogers' book on the long commodity boom he predicted starting in 1999

So the flaw is in looking to regulatory bodies, where people with government jobs put together forecasts, and expecting accuracy. Sources that are consistently wrong should not be turned to again and again for another prediction.

Do you also look to a Securities and Exchange Commission civil servant to give you stock picks? Would you then cite a book on how most people, even "experts", underperform the indexes in stock portfolio performance?

The better course of action is to look and find the consistent winners in picks and predictions and strategy.

Celebrating forecasting losers who have some kind of claim to authority but a record of inaccurate predictions is bad strategy. All I respect is proven accuracy on predictions and the ability to select the correct high-impact factors.

The list of losers is long. It is useful to know why they were losers and what the flaw was in thinking that they should have been right. Learn the lessons for identifying winners. Accept the unbiased feedback of the facts and results.

Forecasting is another area to seek out those who are Superlative. Superlative forecasters: they exist too.

A good forecaster also needs to be a scout of other forecasters, and a scout of forecasters has to have the skills to identify what quality predictions look like. It is seeing the ability of a forecaster to spot the right big trends from the root cause: knowing that an earthquake of a certain size will generate tsunamis, and where they will hit. Calling someone who throws a dart from 29 paces (the nuclear prediction) a bad forecaster because they hit the wall 4 feet above the board is not correct; throwing in the right direction and hitting the barn from 29 paces was actually a decent throw. It is also fair to judge that the forecaster was not very good for using a simple linear projection without a more sophisticated model.

It was actually not a horrible prediction. But it was inferior because it was only a linear projection, without identification of key factors against which the projection could be updated. It was also inferior for not identifying key factors such as the potential development of nuclear fusion, vastly superior wind and solar, lower natural gas prices, and so on.

Jim Rogers was good not just for picking the bottom but for getting the reasons right for why there was a bottom and why there would be a commodity boom. Plus he figured out how he and those who believed him could make a lot of money by his being right.

Superlative scouts and identifiers of superlative forecasters and forecasting methods: they exist too.

It is useful to know that there can be a track record for technological forecasting, just as there is a track record for sports forecasters. I would not turn to a university professor in some field related to sports, or to a government official in charge of a department overseeing sports. I would turn to sports handicappers: people who have a track record of picking winners and whose track record has remained good over recent years (not resting on past glory). Also, I would look at the specific record for the specific sport; don't ask the college football whiz about horse racing. There are plenty of publicly available forecasts on different aspects of technology, and the more useful and profitable exercise is looking at who has a good record with technology forecasting. Likewise, industry types who predict Microsoft will maintain operating system market share are less useful than, say, Steve Jurvetson or Peter Thiel types who pick startup winners that become multi-billion-dollar companies.

90% future prediction accuracy for Game theory model

Bruce Bueno de Mesquita has led a shift in political science toward quantitative models. Analyses of his model of political decision-making show that it has a 90 percent accuracy rate.

Like a 0.1 version of the fictional character Hari Seldon of the Asimov Foundation series.

The elements of the model are players standing in for the real-life people who influence a negotiation or decision. At each round of the game, players make proposals to one or more of the other players and reject or accept proposals made to them. Through this process, the players learn about one another and adapt their future proposals accordingly. Each player incurs a small cost for making a proposal. Once the accepted proposals are good enough that no player is willing to go to the trouble to make another proposal, the game ends. The accepted proposals are the predicted outcome.

To accommodate the vagaries of human nature, the players are cursed with divided souls. Although all the players want to get their own preferred policies adopted, they also want personal glory. Some players are policy-wonks who care only a little about glory, while others resemble egomaniacs for whom policies are secondary. Only the players themselves know how much they care about each of those goals. An important aspect of the negotiation process is that by seeing which proposals are accepted or rejected, players are able to figure out more about how much other players care about getting their preferred policy or getting the glory.
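Bueno de Mesquita's production model is proprietary, but the simplest member of this family of expected-utility models forecasts the outcome as an influence-weighted average of player positions, where each player's weight is capability times salience. A toy sketch under that assumption, with invented players and numbers:

```python
# Toy influence-weighted forecast in the spirit of expected-utility
# negotiation models. This is NOT Bueno de Mesquita's proprietary model,
# just the standard simplified baseline: each player has a position on
# a 0-100 issue scale, a capability (power), and a salience (how much
# they care about this issue). All names and values are invented.
players = [
    # (name, position, capability, salience)
    ("Hardliner",  90.0, 0.8, 0.9),
    ("Moderate",   50.0, 0.6, 0.7),
    ("Reformer",   20.0, 0.4, 1.0),
    ("Bureaucrat", 60.0, 0.3, 0.2),
]

def forecast(players):
    weights = [cap * sal for _, _, cap, sal in players]
    positions = [pos for _, pos, _, _ in players]
    return sum(w * p for w, p in zip(weights, positions)) / sum(weights)

print(round(forecast(players), 1))  # -> 60.9, the predicted negotiated outcome
```

The full model layers the iterative proposal rounds described above on top of a baseline like this, which is what lets it track how dozens of players influence one another.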

The main reason that the model generates more reliable predictions than experts do is that "the computer doesn't get bored, it doesn't get tired, and it doesn't forget," he says. In the analysis of nuclear technology development in Iran, for example, experts identified 80 relevant players. Because no individual can keep track of all the possible interactions between so many players, human analysts focus on five or six key players. The lesser players may not have a lot of power, Bueno de Mesquita says, but they tend to be knowledgeable enough to influence how key decision-makers understand the issues. His model can keep track of those influences when a human can't.

"Given expert input of data for the variables for such a model, it would not surprise me in the least to see that it would perform well," says Branislav L. Slantchev, a political scientist and game theorist at the University of California at San Diego.

He points out that the model relies on having a considerable amount of expert input. "Honestly, if you had all this information," Slantchev says, "you should be able to predict fairly well how the issue would be resolved." The main reason that the model does this better than experts is that it "strips ideological blindfolds, cultural prejudice, and normative commitments that very often color the view of experts."


So this suggests that by training oneself to learn what the proper inputs are for determining a future outcome, rigorously reducing biases, and focusing solely on an accurate assessment and prediction, an expert person could also achieve near 90% accuracy in predictions.

October 30, 2007

Infrared light used as better optical tweezers on silicon

What's new in the optical tweezer from MIT's Matt Lang and David Appleyard is that they used infrared light to move particles on silicon, the basis of microchips. (Unlike visible light, the infrared does not bounce off the silicon.) That means that MIT's optical tweezer can be used not just for study but to build structures on the surface of chips.


16 live E. coli cells to spell out "MIT" on a chip

Lang and Appleyard proved their technique by getting 16 live E. coli cells to spell out "MIT" on a chip. The long-term potential is more practical: Lang envisions using the system to cram high-resolution sensors in very small spaces — for disease detectors, for example — and to connect silicon-based electronics to living tissues and other "biological interfaces."

Arthur Ashkin, a retired Bell Laboratories scientist who is considered the father of optical tweezers, cautioned that the MIT work could not be considered a breakthrough, since no devices using the technology have yet been built.


Functional integration of optical trapping techniques with silicon surfaces and environments can be realized with minimal modification of conventional optical trapping instruments offering a method to manipulate, track and position cells or non-biological particles over silicon substrates.
This technique supports control and measurement advances including the optical control of silicon-based microfluidic devices and precision single molecule measurement of biological interactions at the semiconductor interface. Using a trapping laser in the near infra-red and a reflective imaging arrangement enables object control and measurement capabilities comparable to trapping through a classical glass substrate. The transmission efficiency of the silicon substrate affords the only reduction in trap stiffness. We implement conventional trap calibration, positioning, and object tracking over silicon surfaces. We demonstrate control of multiple objects including cells and complex non-spherical objects on silicon wafers and fabricated surfaces.
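As one concrete example of the "conventional trap calibration" the abstract mentions, trap stiffness is often estimated with the equipartition method: a trapped bead's thermal position fluctuations satisfy k<x^2> = kB*T. A minimal sketch with simulated tracking data (not data from the MIT paper):

```python
import numpy as np

# Equipartition calibration of optical trap stiffness: for a harmonic
# trap, k * <x^2> = kB * T, so the stiffness follows from the variance
# of the tracked bead position. Positions here are simulated, not taken
# from the MIT experiment.
kB = 1.380649e-23  # Boltzmann constant, J/K
T = 298.0          # room temperature, K
k_true = 1e-5      # an assumed trap stiffness, N/m

rng = np.random.default_rng(0)
x = rng.normal(0.0, np.sqrt(kB * T / k_true), size=100_000)  # tracked positions, m

k_est = kB * T / np.var(x)
print(f"estimated stiffness: {k_est:.2e} N/m")  # close to the assumed 1e-5 N/m
```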

Programmable-metallization-cell (PMC) memory, or nano-ionic memory, could start replacing flash memory in 18 months

A new type of memory technology could lead to thumb drives or digital-camera memory cards that store a terabyte of information--more than most hard drives hold today. The first examples of the new technology, which could also slash energy consumption by more than 99 percent, could be on the market within 18 months.

UPDATE: I have more details on the PMC memory in a new article.

The new type of memory, called programmable-metallization-cell (PMC) memory, or nano-ionic memory, has been under development at Arizona State University and at companies such as Sony and IBM. Nano-ionic memory is significantly faster than flash memory, and the speed of some experimental cells has rivaled that of DRAM, which is orders of magnitude faster than flash.

The memory could also prove easy to make. Recently, the Arizona group published work demonstrating that nano-ionic memory can be made from materials conventionally used in computer memory chips and microprocessors. That could make it easier to integrate with existing technologies, and it would mean less retooling at factories, which would appeal to manufacturers.

Another reason that ionic memory is attractive is that it uses extremely low voltages, so it could consume as little as a thousandth as much energy as flash memory. In theory, it could also achieve much higher storage densities--bits of information per unit of surface area--than current technologies can.

Each memory cell consists of a solid electrolyte sandwiched between two metal electrodes. The electrolyte is a glasslike material that contains metal ions. Ordinarily, the electrolyte resists the flow of electrons. But when a voltage is applied to the electrodes, electrons bind to the metal ions, forming metal atoms that cluster together. These atoms form a virus-sized filament that bridges the electrodes, providing a path along which electrical current can flow. Reversing the voltage causes the wire to "dissolve," Kozicki says. The highly resistive state of the electrolyte and the other, low-resistance, state can be used to represent zeroes and ones. Because the metal filament stays in place until it's erased, nano-ionic memory is nonvolatile, meaning that it doesn't require energy to hold on to information, just to read it or write it.
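The switching behavior described above can be summarized as a simple state machine. A toy model with invented thresholds and resistances (not real device parameters):

```python
# Toy state model of a programmable-metallization cell, following the
# description above. Thresholds and resistances are invented for
# illustration, not real device parameters.
class PMCCell:
    R_ON, R_OFF = 1e3, 1e9        # filament present vs. dissolved, ohms
    V_WRITE, V_ERASE = 0.3, -0.3  # assumed switching thresholds, volts

    def __init__(self):
        self.filament = False  # starts in the high-resistance "0" state

    def apply(self, volts):
        if volts >= self.V_WRITE:
            self.filament = True    # ions reduce to metal; filament bridges
        elif volts <= self.V_ERASE:
            self.filament = False   # reversed bias dissolves the filament
        # voltages between the thresholds (e.g. a small read bias) change
        # nothing -- which is why the cell is nonvolatile

    def resistance(self):
        return self.R_ON if self.filament else self.R_OFF

    def read(self):
        return 1 if self.filament else 0

cell = PMCCell()
cell.apply(0.5)            # write
assert cell.read() == 1
cell.apply(-0.5)           # erase
assert cell.read() == 0
```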

William Gallagher, a senior manager for exploratory nonvolatile-memory research at IBM Research, says that nano-ionic memory is one of several promising next-generation memory technologies. These include MRAM, which stores information using magnetic fields, and phase-change memory, which stores information in a way similar to that used to store bits on DVDs. Gallagher says that ionic memory's competitors have a head start on it. MRAM chips are already sold for some special applications, such as devices that will be exposed to harsh environments. But MRAM may also prove better for high-speed memory applications than as a replacement for flash, so it may not compete directly with nano-ionic memory. Samsung, however, could be selling a phase-change-based flash-replacement memory within a year.

Still, nano-ionic memory may not be far behind. A few companies have licensed nano-ionic-memory technology developed at Arizona State University. These include Qimonda, based in Germany; Micron Technology, based in Boise, ID; and a Bay Area stealth-mode startup. The startup is well on the way to producing its first memory devices, which Kozicki says could be available within 18 months. These first chips, however, won't rival hard drives in memory density, he says.

Engineers develop world's most complex silicon phased-array chip

The phased array chip will drastically lower the cost of phased array radar and of wireless super-high-speed communication.

Some phased arrays are larger than highway billboards and the most powerful – used as sophisticated radar, surveillance and communications systems for military aircraft and ships – can cost hundreds of millions of dollars. The high cost has prevented significant spread beyond military and high-end satellite communication applications. Engineers are now working to miniaturize them and fully integrate them into silicon-based electronic systems for both military and commercial applications.


The new UCSD chip packs 16 channels into a 3.2 by 2.6 mm die. The input signal is divided on-chip into 16 different paths with equal amplitude and phase using an innovative design, and the phase and gain of each of the 16 channels is controlled electronically to direct the antenna pattern (beam) in a specific direction.

By manipulating the phase, you can steer the beam electronically in nanoseconds. With the amplitude, you control the width of the beam, which is critical, for example, when you send information from one satellite to another but you don't want the signal to reach any nearby satellites. And with both amplitude and phase control, you can synthesize deep nulls in the antenna pattern so as to greatly reduce the effect of interfering signals from neighboring transmitters.
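The steering itself is just phase arithmetic. A minimal sketch of the textbook uniform-linear-array math (a generic illustration, not the UCSD chip's actual design; the element spacing is assumed):

```python
import numpy as np

# Textbook beam steering for an N-element uniform linear array. To steer
# the beam to angle theta0, element n is given a phase that cancels its
# geometric path difference n * d * sin(theta0).
N = 16                # channels, as in the UCSD chip
d_over_lambda = 0.5   # element spacing in wavelengths (assumed)

def array_factor(theta_deg, steer_deg):
    """Far-field response magnitude of the array at angle theta_deg."""
    n = np.arange(N)
    delta = np.sin(np.radians(theta_deg)) - np.sin(np.radians(steer_deg))
    phase = 2 * np.pi * d_over_lambda * n * delta
    return abs(np.sum(np.exp(1j * phase)))

# Steered to 30 degrees, the response peaks there and falls off elsewhere.
for angle in (0, 15, 30, 45):
    print(f"{angle:3d} deg -> {array_factor(angle, steer_deg=30):.2f}")
```

Tapering the per-element amplitudes is the amplitude control the article describes: it trades peak gain for beam width and lower sidelobes, and choosing particular complex weights is how nulls are synthesized toward interferers.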

“If you take the same design and move it to the 24 or 60 GHz range, you can use it for commercial terrestrial communications,” said Rebeiz, who is also a lead on a separate project, funded by Intel and a UC Discovery Grant, to create silicon CMOS phased array chips that could be embedded into laptops and serve as high-speed data transfer tools.

“If you wanted to download a large movie file, a base station could find you, zoom onto you, and direct a beam to your receiver chip. This could enable data transfer of hundreds of gigabytes of information very quickly, and without connecting a cable or adhering to the alignment requirements of wireless optical data transfer,” explained Rebeiz, who estimated that this kind of system could be available in as little as three years.

October 29, 2007

Wealthy people still motivated to be highly productive

A 2005 post in the NY Times discusses the thinking and motivation of wealthy entrepreneurs. It shows that even if great wealth is created in the future, there will be significant numbers of people who will still be highly motivated to continue to try and create the next big thing.

One interesting development from Facebook being valued at $15 billion is that Peter Thiel likely owns 5-7% of Facebook.

Peter Thiel has put $3.5 million of support into SENS life extension research and prizes.

Peter made that donation when he had a net worth of about $100 million. A successful IPO for Facebook in 2008 would likely place his net worth north of one billion dollars, which would likely increase the odds of further support for SENS.

Gene therapy radiation protection

University of Pittsburgh researchers injected a therapy previously found to protect cells from radiation damage into the bone marrow of mice, then dosed them with some 950 roentgens of radiation in just five hours -- nearly twice the amount needed to kill a person. Nine in 10 of the therapy-receiving mice survived, compared to 58 percent of the control group.

Between 30 and 330 days, there were no differences in survival rates between experiment and control group mice, indicating that systemic MnSOD-PL treatment was not harmful to survival.

The researchers will need to verify whether this treatment would work in humans.

October 28, 2007

The lunar lander challenge was held yesterday and today

Here is a blog post from the xprize organization with pictures

UPDATE: There appears to be no winner in the lunar lander competition for 2007, as Armadillo's entry failed and burned on its fourth attempt.

Wired magazine is blogging the lunar lander xprize cup competition as well


Armadillo Aerospace had one of the two good flights needed for the Level 1 prize. The second flight fell a few seconds short of the required 90 seconds.

Level 1 requires a rocket to take off from a designated launch area, rocket up to 150 feet (50 meters) altitude, and then hover for 90 seconds while landing precisely on a landing pad nearly 330 feet (100 meters) away. The flight must then be repeated in reverse - and both flights, along with all of the necessary preparation for each, must take place within a two and a half hour period.


They have another craft that will try for Level 2 later on Oct 28, 2007.
Level 2 requires the rocket to hover for twice as long before landing precisely on a simulated lunar surface, packed with craters and boulders to mimic actual lunar terrain. The hover times are calculated so that the Level 2 mission closely simulates the power needed to perform a real lunar mission.

More on the DARPA urban grand challenge robotic car competition

TG Daily has Darpa urban grand challenge coverage (pictures, video and articles).

There is official coverage at the DARPA site

Links to all of the teams are here

UPDATE:
There have been some crashes in the semi-final competitions

Wired magazine is also covering the Darpa urban challenge

UPDATE Monday October 29:
A slow start to the competition, in this Sunday report from the Ventura County Star

A Monday preview from the NY Times bits blog on the Stanford entry, Junior

Wired has coverage of a runaway vehicle during qualifying that was stopped and repaired