
May 05, 2007

Better economy and government budgets by getting rid of coal





Wendell Cox has an analysis of using more rail instead of trucks to move freight. This can be achieved without building more rail by switching away from coal power to nuclear power, efficiency programs and renewables, both to meet increasing energy demand and to eliminate the need for existing coal plants. Savings from reduced future highway spending and reduced oil imports could be used to offset a major buildout of alternatives to coal energy.

UPDATE: I spoke with Wendell Cox. Unfortunately, there is less additional justification from tying in the transportation aspect than I had hoped.
1. Traffic congestion is only helped by intermodal rail, so only the congestion that overlaps with long-haul freight would be relieved by any shift from truck to rail freight.
2. The highway budget has already been largely repurposed for spending other than building highways. A small percentage might be justified, but more likely separate bond measures or new appropriations would be needed to raise new funds. Highway funding to build actual highways is already lagging what is needed.
3. So what is left is that rail is 3-9 times more fuel efficient than trucks, so shifting some long-haul freight onto the rail capacity freed by displaced coal would save fuel in proportion to the ton-miles shifted.

In 2005, railways moved 417 ton-miles per gallon of diesel, while trucks moved about 60 ton-miles per gallon.

A pdf with rail freight statistics

Ton mile statistics for different kinds of transportation

In 2002, coal accounted for 562.5 billion ton-miles of rail freight, and the average shipment was 112 miles.
If that coal could be displaced, and freight that would otherwise go by truck were shifted onto rail in its place, then about 8 billion gallons of fuel would be saved each year. At $3 per gallon, that is $24 billion.
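Here is a quick back-of-envelope check of that number (a minimal Python sketch using only the figures quoted above):

```python
# Fuel saved by carrying coal's 2002 ton-miles by rail instead of truck.
COAL_TON_MILES = 562.5e9    # ton-miles of coal moved by rail in 2002
RAIL_TM_PER_GALLON = 417    # rail ton-miles per gallon of diesel (2005)
TRUCK_TM_PER_GALLON = 60    # truck ton-miles per gallon of diesel
DIESEL_PRICE = 3.00         # dollars per gallon

truck_gallons = COAL_TON_MILES / TRUCK_TM_PER_GALLON  # ~9.4 billion gallons
rail_gallons = COAL_TON_MILES / RAIL_TM_PER_GALLON    # ~1.3 billion gallons
saved_gallons = truck_gallons - rail_gallons          # ~8.0 billion gallons
print(f"saved: {saved_gallons / 1e9:.1f} billion gallons, "
      f"${saved_gallons * DIESEL_PRICE / 1e9:.0f} billion per year")
```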



$23 billion is being budgeted at the federal level to reduce congestion on highways in 2008. Each state also spends money on highways; California has budgeted about $11 billion for highways in 2007.

If we were not using coal for energy, we would not need to move 1.17 billion tons of coal each year to the power plants. This is 40% of freight rail traffic. With the extra freight rail capacity, state and federal governments could use policy to shift truck traffic onto the rail capacity made available; 4.5 million trucks could be taken off the road during rush hour.

This is 200 million barrels of oil, or about 10 days of total USA oil consumption (from the rail efficiency), roughly 3% of total oil consumption. It would be 16 days of oil imports, or 5% of total imported oil, and would reduce the trade deficit ($58 billion per month) by about 2-3%.
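The oil-supply framing follows from the same 8 billion gallons (a sketch; 42 gallons per barrel is the standard conversion, while the roughly 21 million barrels/day consumption and 12 million barrels/day import rates are my assumptions for the 2005-2007 period):

```python
# Express the ~8 billion gallon/year diesel savings in oil-supply terms.
GALLONS_PER_BARREL = 42
US_CONSUMPTION_BPD = 21e6   # assumed total US oil use, barrels/day (~2005-2007)
US_IMPORTS_BPD = 12e6       # assumed US oil imports, barrels/day

barrels = 8.0e9 / GALLONS_PER_BARREL   # ~190 million barrels, ~200M rounded
print(f"{barrels / 1e6:.0f} million barrels/year")
print(f"{barrels / US_CONSUMPTION_BPD:.0f} days of consumption, "
      f"{barrels / (US_CONSUMPTION_BPD * 365):.1%} of annual use")
print(f"{barrels / US_IMPORTS_BPD:.0f} days of imports")
```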

Note: I recognize that more work on policy and infrastructure would be needed to shift the truck traffic even as we reduced coal usage, so more work would be needed to generate the savings. However, I believe that savings would result. Also, a highly detailed analysis of traffic and freight patterns, specific coal plant locations, and actual coal flows would be needed to determine the extent and value of the savings.

Coal supplies 50% of America's electricity and 20% of California's electricity.

For California: the state uses a total of about 294 billion kWh of electricity, so the roughly 50 billion kWh coming from coal would need to be replaced.

California's four nuclear reactors provided 36.1 billion kWh in 2005. Six more reactors of the same size would provide 54 billion kWh.
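The reactor count follows directly (a minimal sketch; the per-reactor output is just the 2005 total divided evenly across the four reactors):

```python
# How many additional reactors of the same size would replace California's coal power.
coal_kwh = 50e9                    # kWh/year from coal that needs replacing
per_reactor = 36.1e9 / 4           # ~9.0 billion kWh per existing reactor (2005)
needed = coal_kwh / per_reactor    # ~5.5, so six new reactors
print(f"{per_reactor / 1e9:.1f} billion kWh/reactor, {needed:.1f} reactors needed")
print(f"six reactors would supply {6 * per_reactor / 1e9:.0f} billion kWh")  # ~54
```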
California has no coal of its own and has to bring it in, mainly from Utah, an average haul of about 700 miles. The state brings in about 5-8 million tons of coal (depending upon the year) and also imports electricity from out-of-state coal plants. The benefit to California would be only $500 million to $1 billion in annual transportation savings, since more of the coal activity is out of state. But that, together with health benefits and reduced health costs, would justify converting away from coal as fast as possible.

Here is a link with the coal usage by region in the United States


The case is strongest for states in the East North Central (231.7 million tons of coal), South Atlantic (178.9 million tons), West South Central (152 million tons), and East South Central (117 million tons) regions. The case is also pretty good for the Mountain and Middle Atlantic states. However, as I illustrated, even California has a reasonable case in spite of low direct coal usage.

Those states have high coal usage, high population density or high population growth, and high transportation demands.

Here is the 2006 federal transportation analysis and the 2008 transportation budget.


Highway congestion is worsening. To ease gridlock, the 2006 Budget proposed highway and transit infrastructure spending of $283.9 billion over six years. This marks a 35-percent increase over the TEA 21 six-year spending totals.

For 2001, Amtrak received $520 million in Federal funding. For 2005, Amtrak received $1.2 billion. It requires hundreds of millions of dollars in operating subsidies annually, particularly for its long distance trains, to remain solvent.


The U.S. population is expected to grow by approximately 82.2 million people between 2000 and 2030 [I think this is a low estimate]. Seven states (Florida, Georgia, North Carolina, South Carolina, Tennessee, Texas, and Virginia) account for over 47 percent (38.7 million) of the population growth in the United States by 2030. Three southern states (Florida, North Carolina, and Texas) will account for approximately 36 percent of all U.S. population growth. Florida alone will account for 15.5 percent of the U.S. population growth between 2000 and 2030.

Current nuclear plant distribution


Here is a link to a study that shows that good transportation means a good economy

Addendum: More specifics about a proposed way to get rid of coal usage.
If one third of the $268 billion in federal highway money were given as energy incentives, then $89 billion could be used for incentives (about $15 billion per year over six years). Policies would also need to be adjusted to encourage alternatives to coal.
I do not think the following projection of the replacement energy mix, and how long it might take, should distract from the core idea: energy, transportation and health policy need to be merged. Coal usage impacts the current and future budgets for transportation and health, so it makes sense to spend some of the transportation and health money to displace coal and thereby buy better transportation and health.



Coal generates about 2 trillion kWh per year.

GE's 1.55 GW ESBWR nuclear plants are projected at $2.48 billion each. Each new plant would generate about 12.6 billion kWh per year.
There will be 30-34 applications for new nuclear plants under current regulations by 2009.
If, as part of a unified transportation-energy policy, more money were spent to eliminate coal usage while shifting traffic to rail, then the number of nuclear plants could be increased, and policies and incentives could be made to encourage up-rating existing nuclear plants. Uprating has an 18-24 month approval process.

$60 billion could be assigned to build three nuclear reprocessing plants, each handling 2,400 tons of waste per year. There would have to be a reversal of the policies (like those Jimmy Carter introduced) that made reprocessing illegal. The cost would be spread over 15-25 years.

Up to 50% power uprates are possible. MIT has research on making donut-shaped fuel and adding nanoparticles for higher operating temperatures.

Wind power could also be given incentives to increase its 29% growth rate, reaching 30 GW by 2010 (a total of 68 billion kWh, or an added 32 billion kWh).

Policy and monetary incentives should be created to switch to the more efficient superconducting motors that will be available in 2010. They could save about 34 billion kWh.

Over 20 years, a replacement energy mix scenario is:
80 new 2GW nuclear plants (up-rated 1.55 GW reactors) with 16.25 billion kWh each. 1300 billion kWh
200 billion kWh from up-rating of existing nuclear reactors
400 billion kWh from wind
200 billion kWh from solar
34 billion kWh from superconductor motor industrial efficiency
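A quick tally (a minimal Python sketch) shows the mix slightly exceeds the roughly 2 trillion kWh coal supplies; the per-plant figure is consistent with a 2 GW plant running at about a 93% capacity factor:

```python
# Does the proposed 20-year replacement mix cover coal's ~2,000 billion kWh/year?
per_plant = 2.0 * 8766 * 0.93 / 1000   # GW * hours/year * capacity factor, in billion kWh
mix_billion_kwh = {
    "80 new 2GW nuclear plants": 80 * 16.25,   # 1,300 billion kWh
    "uprates of existing reactors": 200,
    "wind": 400,
    "solar": 200,
    "superconducting motors": 34,
}
total = sum(mix_billion_kwh.values())          # 2,134 billion kWh
print(f"per-plant check: {per_plant:.1f} billion kWh (vs 16.25 assumed)")
print(f"total: {total:,.0f} billion kWh vs ~2,000 from coal")
```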

May 04, 2007

The brain scan that can read people's intentions

A team of world-leading neuroscientists has developed a powerful technique that allows them to look deep inside a person's brain and read their intentions before they act. The team used high-resolution brain scans to identify patterns of activity before translating them into meaningful thoughts, revealing what a person planned to do in the near future. It is the first time scientists have succeeded in reading intentions in this way.

During the study, the researchers asked volunteers to decide whether to add or subtract two numbers they were later shown on a screen.

Before the numbers flashed up, they were given a brain scan using a technique called functional magnetic resonance imaging. The researchers then used software designed to spot subtle differences in brain activity to predict the person's intentions with 70% accuracy.

Brain computer interfaces and brain implants



Following up on my article about cheap brainwave sensors, I will look at more advanced brain-computer interfaces and brain-scanning technology.


A chip connected to live rat brain tissue

Ted Berger, USC scientist, has demonstrated a computer chip able to converse with live cells. It is a step towards an implantable machine that fluently speaks the language of the brain—a machine that could restore memories in people with brain damage or help them make new ones.

Within four years, the team aims to wire a chip beneath the skulls of monkeys, whose brains are even closer to those of humans. Berger predicts that human trials of a prosthetic device that can actually replace impaired memory cells are less than 15 years away.

Wikipedia covers all of the other methods for brain-computer interfaces

Brain implants are reviewed here

Neuroprosthetics are covered here

Functional magnetic resonance imaging (fMRI) has been used as a game controller for Pong

IBM has developed MRI with 60,000 times better resolution

The BrainGate™ Neural Interface System is currently the subject of a pilot clinical trial to allow handicapped people to control robot arms and computers

Nanowires have been connected to neurons, which could allow far more connections for a next generation BrainGate

Benefits of using more trains instead of trucks for freight

Consultant Wendell Cox shows that shifting 25 percent of the goods moved by truck onto trains could save commuters more than 40 hours per year by reducing traffic, save $44 billion in fuel, and take 3 million trucks off the road during rush hours across the country. Depending on the type of train, between 280 and 500 trucks can be eliminated per trip. This would also save on wear and tear on the roadways and reduce the number of accidents as well.

40% of rail freight is the movement of 1.17 billion tons of coal each year in the USA. Therefore, switching from coal power to, say, nuclear power would provide the capacity to save about $70 billion in fuel by freeing rail capacity to supplant trucks.

There is also the benefit of $20 billion per year in health cost savings.

Cleaning up about half of the coal air pollution would save about $9 billion per year in health costs. Complete elimination would remove not only the air pollution from coal itself but also the pollution from the rail and truck traffic that moves and mines the coal, along with the damage from blowing up mountaintops to get at it.


There is a global boom in coal usage.

2008 Presidential Candidates and nuclear power

Pretty much all of the 2008 candidates for president are favorable to nuclear power.

Each of the top contenders for the Republican nomination and all but one of the major Democratic hopefuls support nuclear power to some extent. Most cite the prospect that atomic energy could help reduce climate change by supplanting power produced by fossil fuel sources such as coal and natural gas.

The two leading Democratic presidential candidates, Senators Clinton and Obama, have joined one of the top Republicans in the race, Senator McCain of Arizona, to sponsor the Climate Stewardship and Innovation Act of 2007. The measure includes more than $3.6 billion in funding and loan guarantees for the planning and construction of nuclear plants using new reactor designs.

The only major candidate opposed to increased reliance on nuclear power is a former senator from North Carolina, John Edwards.

Mrs. Clinton has openly embraced nuclear power in the current campaign.

"I think nuclear power has to be a part of our energy solution," the New York senator said during a town hall meeting in Aiken, S.C., in February. "We've got to be very careful about the waste and about how we run our nuclear plants, but I don't have any preconceived opposition. I just want to be sure that we do it right, as carefully as we can because obviously it's a tremendous source of energy. We get about 20% of our energy from nuclear power in our country. … Other countries like France get, you know, much, much more. So, we do have to look at it because it doesn't put greenhouse gas emissions into the air."

Mr. Obama's camp gave a somewhat more reserved answer when asked about the Illinois senator's views on atomic energy. "Barack Obama feels we must address three key issues before ramping up nuclear power, including the public's right to know, security, and waste storage," a campaign spokeswoman, Jennifer Psaki, said. "Nuclear power represents the majority of non-carbon generated electricity, therefore making it unlikely that it will be taken off the table."

One critical part of the nuclear calculus for Democrats these days is the negative sentiment of Nevada residents to the federal government's plan to store high-level nuclear waste at a site there known as Yucca Mountain. The clout of Nevada voters is magnified in this cycle by plans to stage the state's Democratic presidential caucuses on January 19, 2008, prior to New Hampshire's primary.

The four senators in the Democratic race also have another good reason not to get crosswise with Nevadans: the Senate majority leader, Harry Reid, hails from that state.

Rudy Giuliani supports building new nuclear plants and energy independence

John McCain's energy platform is pro-nuclear

McCain said the U.S. should build more nuclear plants, which emit no greenhouse gases, after a 25-year building hiatus. "The barriers to nuclear energy are political, not technological," and political squabbles over where to store spent radioactive fuel have "made it virtually impossible to build a single new plant," he said.

Shifts to support nuclear power and to avoid increased climate change deaths, illness and costs

The Asian Development Bank may end its long-standing rejection of nuclear energy and embrace it as a green power source for rapidly expanding Asia, the bank's energy chief said Friday.

The ADB, which was founded four decades ago to fight poverty through economic growth, has a standing policy of not advocating atomic power out of concerns of safety and possible conversion to weapons use.

But under increased pressure to promote alternatives to the fossil fuels that fan global warming, the ADB is considering the use of nuclear power under a new energy policy to be adopted in three months, WooChong Um, ADB director of energy, told The Associated Press in an interview at the ADB's annual meeting.


Global warming could lead to a return of insect-borne diseases in Britain, such as malaria, and an increased incidence of skin cancer caused by exposure to the sun, a UK government report warns today.

The nuclear industry hails the IPCC report (pdf for policymakers). The report from scientists with pro-renewable energy biases also admits the necessary and beneficial role of nuclear power.

Climate change and the rise in allergies and asthma have been linked

20 million Americans suffer from asthma. Even though the air in many cities is much cleaner than in the past, the prevalence of hay fever has increased in the U.S. over the past few decades. In 2004, asthma affected more than 6 percent of the American population, up from a little over 3 percent in 1980, according to the U.S. Centers for Disease Control and Prevention in Atlanta.
Childhood asthma is increasing at an even faster rate. The percentage of children with asthma jumped to 9 percent in 2005 from 3.6 percent in 1980, according to the CDC.
In 2004, a Harvard Medical School study linked the childhood asthma "epidemic" among inner-city youth to climate change. Stating that higher carbon dioxide levels in cities promote pollen production in plants, fungal growth, and opportunistic weeds, the study noted that asthma among preschool children grew 160 percent between 1980 and 1994, more than double the increase for the overall U.S. population.

May 03, 2007

Nanoparticles are 400% more active catalysts: could make cheaper fuel cells

Nanoparticles with a completely new shape may lead to cheaper catalysts that could make many experimental-energy technologies more practical.


Nano geometry: This 24-sided platinum nanoparticle could lead to cheaper alternative energy. Credit: Zhong Lin Wang, Georgia Tech


The 24-sided platinum nanoparticles have surfaces that show up to four times greater catalytic activity compared with commercial catalysts.

If researchers can make even smaller nanoparticles with this same efficient shape, it could significantly reduce the amount of platinum used. Reducing the amount of this expensive metal--it currently sells for about $1,300 per ounce--would make applications such as fuel cells more affordable. Reducing the cost of platinum catalysts could also be critical in other applications, such as synthesizing alternative fuels and converting waste materials like carbon dioxide into useful products.

The new work is important, says Francesco Stellacci, professor of materials science and engineering at MIT, because it involves platinum, which he says is "by far the most interesting metal" for catalysis. The work could also advance the basic understanding of how changing the shape of particles affects catalysis, he says.
A professor of chemical engineering at the University of California, Berkeley, says that while the work is interesting because it addresses one of the particular challenges of creating catalysts--controlling the surface structure--the new nanoparticles are in fact not small enough. Existing commercial platinum catalysts can be less than five nanometers wide; the Georgia Tech and Xiamen researchers made particles between 50 and 200 nanometers. Being larger, the new nanoparticles have a larger proportion of the expensive platinum locked beneath the surface, where it cannot catalyze reactions. As a result, for now, the new nanoparticles are actually worse catalysts than the commercial catalysts available today.

According to Wang, the goal is ultimately to use the new nanoparticles and the methods for making them to help find ways of transforming much cheaper materials into useful catalysts. If that can be done, some technologies limited to the lab bench today could be applied to meeting growing worldwide energy needs.

Indeed, notes Daniel Feldheim, professor of chemistry and biochemistry at the University of Colorado at Boulder, in a commentary accompanying the Science article, researchers have long known that changing particle shape and size can make even seemingly inert materials such as gold into valuable catalysts. The methods used by the Georgia Tech and Xiamen researchers, Feldheim says, provide a new level of control that could lead to improved mixed-metal and metal-oxide catalysts, which are cheaper than precious metals such as platinum.

Mice experiments offer possible Alzheimer's treatment

The Gladstone Institute of Neurological Disease in San Francisco has found that slashing levels of the tau protein, which helps regulate the internal skeleton of brain cells, can prevent seizures, memory loss, and defects related to Alzheimer's disease.

The finding could lead to complementary treatments for the most common form of dementia.

"It appears that reducing tau has a protective effect on the brain," said Lennart Mucke, director of the Gladstone Institute, which led the study.

Researchers cut tau production in mice brains in half by inactivating one gene that produces the protein. In other mice, all tau production was eliminated by inactivating both genes.

Even when tau production was only halved, mice genetically engineered to develop Alzheimer's lived a normal lifespan and retained their memory function.


Defeating Alzheimer's and dementia is especially important because, if there is radical life extension via the defeat of heart disease and cancer and the creation of effective rejuvenation, then keeping the mind fit will be critical.

Solar cells make OLEDs luminous enough for mobile devices

Organic light-emitting diodes (OLEDs) offer an alternative to conventional electronic displays (such as LCDs) and have advantages generally including wide viewing angles, rapid response and thin shapes. However, one area that scientists would like to improve is achieving better contrast, especially under strong lighting environments. A low-power, high-contrast OLED could provide higher quality displays for mobile electronics, and also result in longer battery lifetimes.

When the scientists placed a solar cell in the back of the OLED, the solar cell absorbed the incident light and internal OLED emission and then converted the light to electrical power for reuse via photovoltaic action. Although the power recycling efficiency achieved in this trial was a modest 0.26%, the scientists explained that there is much room for improvement beyond this initial demonstration, simply by using more efficient OLEDs and solar cells.

“The solar cell stack put behind the OLED actually play two roles,” said Wu. “First, it functions as a black absorbing material. Second, it also functions as an optical coating to induce destructive interference of incident ambient light, so that the ambient light reflection can be much suppressed.”

Overall, this arrangement could reduce the reflectance of the OLED from 70% in a conventional OLED to about 1.4%, without compromising the electroluminescence efficiency. The device even improves upon the OLED polarizer approach, which has a reflectance of about 5%. By getting the reflectance down to that level, the scientists are helping to prepare OLEDs to be highly competitive with current light displays.


OLED costs are higher than high volume LCDs but they are catching up.

Ray Kurzweil's AI-powered hedge fund

Ray Kurzweil is running a hedge fund guided by an artificial intelligence program. Billionaire Vinod Khosla is a very happy investor in the fund.

Canada making pro-nuclear moves

Canada is making stronger pro-nuclear moves. I believe this is a good thing.

In Ontario, the Liberal government's controversial electricity plan, tabled last June, calls for two new reactors and the refurbishing of old ones, projects expected to cost up to $40 billion over two decades. John Tory, the provincial Conservative leader, is calling for more new nuclear plants faster, accusing Premier Dalton McGuinty of downplaying the need out of fear of the anti-nukes backlash. Ontario's program is central to federal plans. Although the provincial Liberals have expressed a preference for sticking with Canadian technology, they haven't ruled out going to one of AECL's French or U.S. rivals if the price was better. Lunn has declared it "imperative" that the province buy its new reactors from AECL.


The nuclear power plants for Alberta's oilsands:
Henuset's optimum timeline: secure regulatory approval within four years, start construction in 2011, throw the switch to begin using nuclear power to separate sand from up to 500,000 barrels of oil a day in 2016. "We've got the federal government onside, the provincial government onside, and two local communities that want us," he says. Lunn has predicted it's only a matter of time before nuclear reactors begin playing "a very significant role in the oil sands."

First, AECL executives created "Team CANDU," an alliance with big private companies, which Lunn duly applauded, saying the participation of players like Hitachi and SNC-Lavalin boosts his confidence that any future Canadian reactor projects will be completed without risk that taxpayers would be on the hook for cost overruns. Second, last fall, AECL struck its deal with Energy Alberta to push the oil sands concept that carries such obvious appeal for Harper and his Alberta base.

It remains unclear how aggressive the Conservatives will be about openly touting nuclear power as a core element of their climate-change strategy.


I would note that the IPCC report indicating a pro-nuclear position could help countries like Canada to shift to a more openly pro-nuclear position.

"More than two-thirds of Canada's coal-fired generating capacity will need to be replaced by 2020 and more new generating capacity will be required," says the draft. "Some $150 billion in capital investments will need to be made." Options for investing in new generating capacity that won't spew CO2 are, to say the least, limited.

Environmental groups call for unprecedented investments in renewable sources like solar and wind. But the Conservatives make little or no distinction between nuclear power and those so-called "soft" renewables.


It would be fantastic if Canada were to replace coal power with nuclear power.

Foreign markets also beckon. Some observers expect China to build as many as 40 reactors in the next two decades. Westinghouse has locked down the first piece of that huge expansion, and there was fear AECL might be frozen out. But Lunn said he and other cabinet ministers worked during visits to China to persuade the Chinese to put AECL back in their plans. "They have now said they are open to CANDU technology," he said. AECL is trying to build on a track record, having delivered two reactors in China in 2002 and 2003, both on budget and ahead of schedule.

Britain might buy up to four of its reactors as it adds up to 12,000 megawatts of generating capacity in the next 20 years.

Polling by Ipsos Reid for the Canadian Nuclear Association found that support for nuclear power has risen over two years to 44 per cent from 35 per cent nationally, and jumped to 63 per cent from 48 per cent in the key battleground of Ontario.



Expanding renewables is also part of the energy plan, which is good too.
Ontario is banking on doubling the electricity it draws from renewables by 2025 to 15,700 megawatts. That would outstrip nuclear power's projected 14,000-megawatt contribution under the McGuinty plan, up from about 11,400 megawatts - or half the province's electricity - today. Ottawa has no say in how the provinces plan for supplying their power needs, and Lunn is careful not to impose. "It is absolutely essential," he says, "that provinces make their own choices about energy mix."

But he also expresses informed admiration for a perhaps unexpected model - France, where 58 reactors supply 80 per cent of the country's electricity. "They have the cleanest air shed of all the industrialized countries," Lunn notes. "They made this decision 20 years ago, ahead of their time, and it has proven to be very successful."

Computer analysis of metamaterial invisibility

A unique computer model designed by a mathematician at the University of Liverpool has shown that it is possible to make objects, such as aeroplanes and submarines, appear invisible at close range.

Scientists predict that metamaterials could be of use in military technology, such as in the construction of fighter jets and submarines, but it will be some years before invisibility cloaks can be developed for human beings.

"Using this new computer model we can prove that light can bend around an object under a cloak and is not diffracted by the object. This happens because the metamaterial that makes up the cloak stretches the metrics of space, in a similar way to what heavy planets and stars do for the metrics of space-time in Einstein's general relativity theory.

"In order for the cloaking device to work in the first place light has to separate into two or more waves resulting in a new wave pattern. Within this pattern we get light and dark regions which are needed in order for an object to appear invisible.

"Until now, however, it was not clear whether photons -- particles that make up all forms of light -- can split and form new waves when the light source is close to the object. If we use ray optic techniques -- where light travels in beams - photons break down at close range and the object does not appear invisible. If we study light as it travels in waves however, invisibility is maintained."

Scientists predict that invisibility will be possible for objects of any shape and size within the next decade. The research findings are published in Optics Letters.

NEC, JST and RIKEN successfully demonstrate world's first controllably coupled qubits

More competition for scalable quantum computer solutions.

UPDATE: A D-Wave scientific paper indicates that they achieved this goal first.
D-Wave's Geordie Rose indicates that the 16-qubit demo machine had 42 such couplers.

NEC Corporation, Japan Science and Technology Agency (JST) and the Institute of Physical and Chemical Research (RIKEN) have together successfully demonstrated the world's first quantum bit (qubit) circuit that can control the strength of coupling between qubits. Technology achieving control of the coupling strength between qubits is vital to the realization of a practical quantum computer, and has been long awaited in the scientific field.


NEC, JST, and RIKEN have already announced successful development of key technologies for the world's first solid-state qubit and the world's first two-qubit logic gate, based on solid-state technology that excels in its ability to integrate qubits. Following these achievements, the research group addressed the controllable coupling of qubits as the next logical step in realization of a practical quantum computer. Their new research result represents the world's first successful demonstration of controllably coupled qubits.

To date, the coupling of qubits has been difficult to control. In order to realize this control, the research group devised an original mechanism that employs another qubit in between the two qubits for coupling. The coupling qubit functions as a non-linear transformer that is able to turn on and off the magnetic coupling between the two qubits, and on/off control is achieved simply by inputting a microwave. Moreover, coupling operation has been achieved without shortening the lifetime of each qubit. Scalability is also realized through the repetition of coupled two-qubit units - a feature necessary for future quantum computers.

To demonstrate the operation feasibility of the controllable coupling scheme, the research group employed a coupled two-qubit system, the smallest quantum logic unit, to carry out a multi-quantum control experiment involving the turning on and off of the coupling. As a result, a simple quantum protocol has been successfully demonstrated, allowing controllable coupling for the execution of quantum algorithms.

In the near future, NEC, JST, and RIKEN, plan to implement a larger-scale, more elaborate quantum computation, aiming for the realization of a practical quantum computer.

May 02, 2007

Self-assembly used in standard computer chip making by 2009

Self-assembly will be used in some steps of the standard computer chip-making process by 2009. IBM will use it to self-assemble air-gap insulators that can increase the speed of a chip by 35 percent or allow it to consume 15 percent less power than chips without the air-gap insulator. The company expects that the new process will be implemented in semiconductor facilities by 2009.


This microprocessor cross section shows empty space in between the chip’s copper wiring. Wires are usually insulated with a glasslike material, but IBM has used self-assembly techniques, which can be employed in chip-making facilities, to create air gaps that insulate the wires.
Credit: IBM


The new self-assembly approach ushers chip making into an era of nanotechnology, says Daniel Edelstein, IBM fellow and chief scientist for the self-assembly air-gap project. Importantly, Edelstein says, IBM's process is designed to be compatible with current manufacturing facilities and materials.

IBM researchers used a new type of polymer to help them create the air gaps. The polymer is poured onto copper wires that are embedded in an insulating material. When the polymer is heated, the molecules pull away from each other to form a regular array of nanoscale holes. These holes are used as a template to etch hollow columns into the insulating material that surrounds the wires. Engineers then pump plasma, an electrically charged gas, through the holes to blast away the remaining insulating material. A quick chemical rinse leaves behind clear gaps of air on either side of the copper wires.

IBM's Edelstein says that because the new process adds manufacturing steps to the overall chip-making process, there will be a slight increase in cost. There are 10 layers of wiring in a chip, and he estimates that the cost will increase 1 percent per layer, or roughly 10 percent overall.

Revolutionizing resuscitation



Perfecting new procedures could enable the safe resuscitation of someone with no heart beat and no brain activity even after one hour or more.

In an article from Newsweek, researchers who analyzed the cells of people who have no heartbeat and whose brain has shut down to conserve oxygen (normally considered clinically dead) report a remarkable discovery. According to Dr. Lance Becker, an authority on emergency medicine at the University of Pennsylvania: "After one hour," he says, "we couldn't see evidence the cells had died. We thought we'd done something wrong." In fact, cells cut off from their blood supply died only hours later.

But if the cells are still alive, why can't doctors revive someone who has been dead for an hour? Because once the cells have been without oxygen for more than five minutes, they die when their oxygen supply is resumed. It was that "astounding" discovery, Becker says, that led him to his post as the director of Penn's Center for Resuscitation Science, a newly created research institute operating on one of medicine's newest frontiers: treating the dead.

With this realization came another: that standard emergency-room procedure has it exactly backward. When someone collapses on the street in cardiac arrest, if he's lucky he will receive immediate CPR, maintaining circulation until he can be revived in the hospital. But the rest will have gone 10 or 15 minutes or more without a heartbeat by the time they reach the emergency department. And then what happens? "We give them oxygen," Becker says. "We jolt the heart with the paddles, we pump in epinephrine to force it to beat, so it's taking up more oxygen." Blood-starved heart muscle is suddenly flooded with oxygen, precisely the situation that leads to cell death. Instead, Becker says, we should aim to reduce oxygen uptake, slow metabolism and adjust the blood chemistry for gradual and safe reperfusion.

Mini DNA replicator

From the New Scientist, a pocket-sized device that runs on two AA batteries and copies DNA as accurately as expensive lab equipment has been developed by researchers in the US.


The DNA-copying device runs off two AA batteries (Image: Victor Ugaz/Angewandte Chemie)


The device has no moving parts and costs just $10 to make. It runs polymerase chain reactions (PCRs), to generate billions of identical copies of a DNA strand, in as little as 20 minutes. This is much faster than the machines currently in use, which take several hours.

Running a PCR requires treating DNA strands, along with the chemical materials needed to make new DNA strands, at three different temperatures. The highest temperature (95°C) causes the two strands of a DNA molecule to separate. The lowest temperature (60°C) makes DNA building blocks stick to the strands. Holding the temperature in the middle (72°C) then allows an enzyme to quickly assemble replica DNA strands.
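As a rough illustration of why a 20-minute run can yield billions of copies, here is a minimal sketch of that thermal protocol; the 30-cycle count is a typical PCR assumption of mine, not a figure from the article:

```python
# The three-temperature PCR cycle described above, plus the exponential copying.
PCR_CYCLE = [
    (95, "denature: the two DNA strands separate"),
    (60, "anneal: DNA building blocks stick to each strand"),
    (72, "extend: an enzyme assembles the replica strands"),
]
cycles = 30  # assumed; each cycle ideally doubles the amount of DNA
for temp_c, step in PCR_CYCLE:
    print(f"{temp_c} C  {step}")
print(f"after {cycles} cycles: ~{2 ** cycles:.2e} copies")  # ~1.07e+09, i.e. billions
```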
Facsimile machine

To cycle through these temperatures, a conventional PCR machine heats and cools a large metal block holding multiple tubes containing samples of DNA and the material needed to make copies.

In the new device, created by graduate student Nitin Agrawal, a centimetre-wide loop of tubing wraps in a vertical ring around a set of three metal rods. The rods, together the size of an AA battery, are kept at three different temperatures. With this set-up, the parts of the tube closest to each rod are held at different temperatures.

This keeps the liquid flowing through the millimetre-wide tube, and so the DNA and building blocks cycle automatically through the three temperatures needed for PCR. "It's similar to how a lava lamp works," says Ugaz.

The device shows promise for a variety of tests, Sia says, including monitoring levels of HIV virus in a person's body or diagnosing tuberculosis. "There's nothing like this in developing countries," he explains. "There's a great need everywhere in the world for doing DNA- and RNA-based tests."

For the full potential of the device to be realised, however, Sia says that cheap and simple methods of preparing samples, by isolating DNA from cells, will be needed along with miniaturised DNA analysis equipment.

IPCC draft recommends more nuclear power

From the BBC: an Intergovernmental Panel on Climate Change (IPCC) draft report suggests solutions to mitigate climate change, such as capturing and burying emissions from coal-fired power plants, shifting to renewable forms of energy, and more use of nuclear power.


The findings of the report will be used by governments and international organisations to map out their own plans for climate change mitigation.

"The IPCC plays an incredibly important role in the political negotiations so people can point and say 'Look, this is what is going to happen in 50 years, these are the options available for us to take actions'," said UN Environment Programme spokesman Michael Williams.

The report's conclusions will play a key role in negotiations on the Kyoto Protocol, which will take place in December on the Indonesian island of Bali.

It will also influence world leaders when they meet face-to-face over climate change at the summit of the group of eight most industrialised nations (G8) in June.


Hat tip to the we support lee blog

New lab-on-a-chip device to speed proteomics research

The genomics era is now making way for the era of proteomics – the study of the proteins that genes encode. Future proteomics research should see a substantial acceleration with the development of a new device that provides the first monolithic interface between mass spectrometry and silicon/silica-based microfluidic "lab-on-a-chip" technologies. This new device, called a multinozzle nanoelectrospray emitter array, was developed by scientists with the DoE’s Lawrence Berkeley National Laboratory.


This zoom-in Scanning Electron Microscope image shows a five-nozzle M3 emitter, where each nozzle measures 10x12 microns. Credit: Daojing Wang, Lawrence Berkeley National Laboratory

Each emitter consists of a parallel array of silica nozzles protruding out from a hollow silicon sliver with a conduit size of 100 x 10 microns. Multiple nozzles (100 nozzles per millimeter was a typical density) were used rather than single nozzles in order to reduce the pressure and clogging problems that arise as the microfluidic channels on a chip downsize to a nanometer scale. The emitters and their nozzles were produced from a silicon wafer, with the dimension and number of nozzles systematically and precisely controlled during the fabrication process. Fabrication required the use of only a single mask and involved photolithographic patterning and various etching processes.

Said Peidong Yang, "Once integrated with a mass spectrometer, our microfabricated monolithic multinozzle emitters achieved a sensitivity and stability in peptide and protein detection comparable to commercial silica-based capillary nanoelectrospray tips. This indicates that our emitters could serve as a critical component in a fully integrated silicon/silica-based micro total analysis system for proteomics."

Added Daojing Wang, "This is also the first report of a multinozzle emitter that can be fabricated through standard microfabrication processes. In addition to having lower back pressure and higher sensitivity, multinozzle emitters also provide a means to systematically study the electrospray ionization processes because the size of each nozzle and density of nozzles on the emitters can be adjusted."

According to Wang and Yang, the fabrication and application of the microfabricated monolithic multinozzle emitters, called "M3 emitters" for short, could be commercialized immediately and should be highly competitive with current silica capillary emitters in terms of cost and mass production.

90% pure quantum dots made for better solar cells

Rice University scientists today revealed a breakthrough method for producing molecular specks of semiconductors called quantum dots, a discovery that could clear the way for better, cheaper solar energy panels. One way towards cheaper solar cells is to make them out of quantum dots. Prior research by others has shown that four-legged quantum dots, which are called tetrapods, are many times more efficient at converting sunlight into electricity than regular quantum dots. The best previous method produced 30 percent of particles as tetrapods, while the new method makes 90% tetrapods.

Significantly, these tetrapods are made of cadmium selenide, which has been very difficult to work with, until now. The essence of the new recipe is to use cetyltrimethylammonium bromide instead of the standard alkylphosphonic acid compounds. Cetyltrimethylammonium bromide happens to be safer – it's used in some shampoos, for example – and it's much cheaper than alkylphosphonic acids. For producers looking to eventually ramp up tetrapod production, this means cheaper raw materials and fewer purification steps, Wong said.

"One of the major bottlenecks in developing tetrapod-based solar cell devices has been removed, namely the unavailability of high-quality tetrapods of the cadmium selenide kind," Wong said. "We might be able to make high-quality nanoshapes of other compositions also, using this new synthesis chemistry."

HP licensing nanoimprint lithography

HP today announced it is licensing nanoimprint lithography (NIL) technology that could enable the fabrication of semiconductor chips significantly more powerful than those available today. NIL allows the stamping out of patterns of wires less than 50 atoms (5 nanometers) wide on a substrate, versus current 45 nanometer lithography.

Discovery of first gene that specifically links calorie restriction to longevity

Loss of only one of the genes, a gene encoding the protein PHA-4, negated the lifespan-enhancing effect of calorie-restriction in worms. And, when researchers undertook the opposite experiment—by overexpressing pha-4 in worms—the longevity effect was enhanced. “PHA-4 acts completely independent of insulin/IGF-1 signaling and turns out to be essential for CR-mediated longevity,” says Panowski.

“We know three distinct pathways that affect longevity: insulin/IGF signaling, calorie restriction, and the mitochondrial electron transport chain pathway, yet it is still not clear where sir-2 fits in. It seems to meddle with more than one pathway,” says Dillin and adds that “PHA-4 is specific for calorie restriction as it does not affect the other pathways.”

Humans possess three genes highly similar to worm pha-4, all belonging to what is called the Foxa family. All three play an important role in development and then later on in the regulation of glucagon, a pancreatic hormone that unlike insulin increases the concentration of blood sugar and maintains the body’s energy balance, especially during fasting.

The potential payoff for cutting calorie intake to 60 percent of normal, while maintaining a healthy diet rich in vitamins, minerals, and other nutrients, is huge. Currently it is the only strategy apart from direct genetic manipulation that consistently prolongs life and reduces the risk of cancer, diabetes, and cardiovascular disease, while staving off age-related neurodegeneration in laboratory animals from mice to monkeys. Although some people are already imposing this strict regimen upon themselves, it is too early to tell whether calorie restriction will have the same effect in humans.



Other research shows that starting calorie restriction later in life still has health benefits

Claytronics: programmable grit, steps toward utility fog



Intel is working on claytronics, an emerging field of engineering concerning reconfigurable microscale robots. The researchers propose to make moving, physical, three-dimensional replicas of people or objects, so lifelike that human senses would accept them as real. When you finished using a replica for one purpose, you could transform it into another useful shape: a human replica could morph into a desk or a chair. This would be a step towards utility fog and systems for synthetic reality.

Five years from now, the DPR researchers expect to have working ensembles of catoms that are close to spherical in shape. These catoms still will be large enough that no one will confuse a replica with the real thing (for that, catoms will probably have to shrink to less than a millimeter in diameter). But the catoms will be sufficiently robust that researchers can experiment with a variety of shapes, test hypotheses about ensemble behavior, and begin to envision where the technology might lead within a decade or two.

Carnegie Mellon has been researching this field and is working with Intel research in Pittsburgh.

Intel refers to the field as Dynamic Physical Rendering





The basic unit is the catom. Researchers have already created a prototype catom that is 44 millimeters in diameter. The goal is to eventually produce catoms that are one or two millimeters in diameter, small enough to produce convincing replicas.

Cost is an issue, but dynamic physical rendering could become viable long before Moore's Law drives down the cost of a catom to a microcent. Even if catoms could only be produced for a dollar each, some visualization applications might be economically viable. Certain other applications, such as programmable antennas, could be attractive even if a catom sold for tens or hundreds of dollars.

The May 2007 issue of Business 2.0 has an article on Claytronics.

The catoms would move and configure themselves using electrostatic and electromagnetic forces.


Diagram of a 100 micrometer foglet. This would have 1,000 times less volume than a 1 millimeter spherical catom. Still, claytronics would be a significant step in the direction of utility fog.

May 01, 2007

Gene Therapy status



Gene therapy has made it through to the commercial mainstream.

According to figures published by the Journal of Gene Medicine in January 2007, there are 1,260 gene therapy clinical trials in progress. Of these, 27 have reached Phase III and 13 Phase II/III, showing that a number of products are edging close to market.

The gene therapy paradigm has evolved since the original concept of using introduced genes to replace ones that are defective or missing. Products in development are delivering a range of genes, with a range of therapeutic objectives.

This evolution of the technology means that treating inherited gene defects is no longer the leading objective. Of the 1,260 gene therapy trials currently in progress, 67 percent are in cancer, while inherited single gene defects account for 8.4 percent.

The trials have shown that gene therapy can provide significant clinical benefits with good safety profiles in hitherto difficult, or impossible, to treat diseases. TroVax, for example, has been in nine studies involving more than 180 patients. There have been no serious adverse events and more than 90 per cent of patients mounted an immune response. This has resulted in high levels of tumour shrinkage and indications of survival benefit, with a correlation between the level of clinical benefit and the strength of the immune response.


Another high-profile gene therapy trial aims to cure blindness

The work is at an early stage, aimed at establishing the safety and efficacy of gene transfer to the eye; gene therapy for many common conditions, including macular degeneration, which affects about 500,000 mainly elderly people in Britain, is many years away.

Professor Ali, the head of the division of molecular therapy at UCL's Institute of Ophthalmology said: "If we can establish the technique of delivering genes to the retina, it paves the way for applying it to other inherited disorders for which there is currently no treatment. Then in the longer term that could open the way to the treatment of common conditions such as macular degeneration, for which there are treatments but which aren't particularly good.

"The advantage of using gene therapy over drugs is that you can give it as a single treatment to the back of the eye, avoiding the need for repeated use of drugs. We anticipate the gene transfer is life long."


There is also a pill that activates a fat-burning gene

This therapy may not only keep people slim in the future. Other studies suggested it may also help raise good cholesterol, lower bad cholesterol, and help ward off type-2 diabetes.


This article looks at the inevitability of gene therapy for performance enhancement and odds of severe complications as currently being two or three patients out of 1,100.

Metagenomics project to sequence all the microbes of our bodies

The bacteria in the human body are very difficult to study, since only about 1 percent of them can be grown in the lab. Now a proposed new project to sequence all our microbial residents could change that. The human body has 10 times as many microbial cells as human cells. They're a vital part of our health, breaking down otherwise indigestible foods, making essential vitamins, and even shaping our immune system.

Thanks to ever-improving methods to sequence DNA, scientists can now analyze the genomes of entire microbial communities, a field known as metagenomics. By comparing microbial communities in people of different ages, origins, and health statuses, researchers hope to find out precisely how microorganisms prevent or increase risk for certain diseases and whether they can be manipulated to improve health.

Several metagenomics projects are under way or have been completed, including analysis of the microbes living in the human gut and on the skin. But a true snapshot of our microbial menagerie will require a massive effort, along the lines of the Human Genome Project. "Even though a microbial genome is one-thousandth the size of the human genome, the total number of microbial genes in [the human] body is much greater than human genes because you have so many different species," says Weinstock.

The National Institutes of Health (NIH) is now considering such a project. Metagenomics experts and government officials met last week to determine if the proposal, dubbed the human microbiome, will become an NIH "Roadmap" initiative. These NIH-wide programs identify major gaps in biomedical research and provide financial support on a much larger scale than typical grants. A final decision is expected this month.


Five topics have been selected to be developed for further consideration as Major NIH Roadmap Initiative Proposals:


* Microbiome – The Microbiome is the full collection of microbes (bacteria, fungi, viruses, etc.) that naturally exist within the human body. Initiatives in this area would focus on developing a deeper understanding of these communities of microbes in order to determine how they affect human health.
* Protein Capture/Proteome Tools – The Proteome is the complete set of proteins in the body. Efforts in this area would support developing and making available to the scientific community high quality probes specific to every protein in the human and in desired animal models. This would allow the ability to characterize protein function in health and disease and to monitor the markers of a disease in order to deploy early prevention efforts and to identify potential therapeutic targets.
* Phenotyping services and tools – A human Phenotype is the total physical appearance and constitution of a person, often determined by multiple genes and influenced by environmental interactions. Initiatives in this area would encourage the development of resources to systematically catalog human phenotypes in an effort to characterize complex diseases and disorders.
* Inflammation as a common mechanism of disease – While significant breakthroughs have occurred in our understanding of inflammation, research is needed to further understand inflammatory processes. Because inflammation is broadly implicated in many diseases and conditions, this initiative would be valuable in uncovering as-yet-unknown immune mechanisms and mediators of inflammation as well as genetic factors, environmental triggers, and the relationship of inflammation to disease.
* Epigenetics – Epigenetics is the study of stable genetic modifications that result in changes in gene expression and function without a corresponding alteration in DNA sequence. The epigenome is a catalog of the epigenetic modifications that occur in the genome. Epigenetic changes have been associated with disease, but further progress requires the development of better methods to detect the modifications and a clearer understanding of factors that drive these changes.


Regenerative medicine has been placed into a further information-gathering phase to determine next steps.

Heart deaths nearly halved from 1999 to 2005

The researchers said these marked improvements are probably a “direct consequence” of new practices that followed updated guidelines from key organizations of heart doctors in the United States and Europe.

Recommendations in those guidelines include quick use of aspirin or more potent blood thinners; beta blockers to reduce the damaged heart's oxygen needs; statins to lower cholesterol; ACE inhibitors to relax blood vessels; and angioplasty to open blocked vessels soon after hospital arrival. 85 percent of heart patients studied got cholesterol drugs in 2005, versus just 37 percent in 1999; 78 percent got potent blood thinners including Plavix, versus 30 percent in 1999; and 53 percent had quick angioplasty, compared to just 16 percent six years earlier.

This is related to an earlier article that bodily cell, organs and tissue rejuvenation combined with the elimination of 50% of deaths from disease would lead to lifespans of 151 years. 90% disease elimination and successful rejuvenation therapies would lead to lifespans of 512 years.

Clear progress is being made against diseases such as heart disease and cancer, and some progress is being made towards rejuvenation and regeneration.

Robert Freitas has finished his latest theoretical scaling study of a new diamondoid medical nanorobot called the "chromallocyte". This is the first full technical description of a cell repair nanorobot ever published.

Progress is being made on more crude yet possibly effective cellular repair systems

Nanofibers offer hope for repairing spinal cord damage, Parkinson's and Alzheimer's

Alzheimer’s, Parkinson’s, Type II Diabetes Are Similar at the Molecular Level.
Exercise and diet are very helpful in preventing Type II diabetes.
Current lifestyle recommendations for Type II diabetes

Assessing molecular electronics, graphene plasmon chips and future computing

EETimes recently discussed another candidate technology for molecular electronics. Monomeric phthalocyanines could be organized into one-dimensional wire-like ring-stacked or two-dimensional sheet-like ring-fused phthalocyanines to make a fully reversible quantum switch with multiple outputs.

CTT, the company that bought the rights to the technology, has only a $27 million market cap, although it has been as high as $150 million. CTT has about $10 million in cash and about $900K in profit per year, so its resources are comparable to a largish angel investor or a small VC. It was stronger back in 2004 and 2005, and probably better at other times in the past.

I looked through the patents. It is some kind of electro-optical switch based on molecular ring chains.


There have been many high-potential molecular electronics proposals and projects, but none has been successfully commercialized.

This technology promises low-energy petaflop performance and high memory density. That would be good, but not if it takes 15-20 years to get there, and it looks like the technology can only deliver on its promise with a full-blown transition to molecular computing, which appears distant (at least 10+ years, and likely longer, with results that may not be as good as graphene chips). I remember James Ellenbogen's talks about molecular electronics, in which he said the near-term way to start getting results would be to create hybrids compatible with today's electronics.

James Ellenbogen was quite public with his molecular electronics work at Mitre and had spoken at the Foresight Institute for Nanotechnology. (Here is a recent publication of his.)

James Tour and others had done work and formed a company, which I believe got folded into Nanosys Inc. I am unclear where Nanosys stands, since the company has gone very quiet news-wise after its IPO attempt was withdrawn.

Another molecular electronics candidate is described at Physorg.

I am also still waiting for what should be this year's release of NRAM by Nantero and LSI Logic.

So there are and were players that were a bit bigger and had more early progress and momentum. Many seem to have run into obstacles and are bogged down.

My money would be on plasmon chips and graphene as the follow-on to silicon. They would use similar infrastructure and would be an easier technology for the big current players to transition towards.

Current methods are still improving. We will have teraflops in PCs by 2009-2010 and petaflops in PCs by 2020-2023 with relatively ordinary materials.

Something that is going to displace current infrastructure would have to arrive with 100 times the performance in a given year and the ability to improve as fast as or faster than Moore's Law to stay ahead, or it would have to conquer a niche and then expand outward, as flash memory did.

So a technology that wants to be the next big thing after silicon computers would need to deliver petaflops at PC prices by 2014, double in performance every year, and integrate easily with existing information technology infrastructure and processes. Another way to be successful would be for the new technology to be part of a larger explosion in production capacity so that it could easily build replacement infrastructure. This would be full-blown molecular nanotechnology.
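To make that bar concrete, here is a minimal sketch of the crossover arithmetic, assuming silicon PCs hit a teraflop around 2010 and double every two years, while a hypothetical challenger delivers a petaflop in 2014 and doubles yearly. All of the dates and rates are illustrative assumptions drawn from the rough figures above, not measurements.

```python
# A minimal sketch of the crossover arithmetic above. All start years,
# starting performance levels, and doubling periods are illustrative
# assumptions based on the rough figures in this post.

def performance(start_year, start_flops, doubling_years, year):
    """Project performance under steady exponential doubling."""
    return start_flops * 2 ** ((year - start_year) / doubling_years)

for year in range(2014, 2025, 2):
    silicon = performance(2010, 1e12, 2.0, year)      # teraflop PCs ~2010, doubling every 2 years
    challenger = performance(2014, 1e15, 1.0, year)   # petaflop challenger in 2014, doubling yearly
    print(f"{year}: silicon {silicon:.1e} FLOPS, "
          f"challenger {challenger:.1e} FLOPS, lead {challenger / silicon:,.0f}x")
```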

State of the Air 2007 in the United States

The upside is that smog levels declined nationwide between 2003 and 2005, aided by the appearance of more pollution controls on smokestacks, according to the report, called "State of the Air: 2007." The bad news is that the number of places in the United States reporting unhealthy levels of soot grew over the same period, the report found. Soot describes the tiny particles of pollution generated by burning fossil fuels. Soot pollution can increase hospital visits for heart and asthma problems, the American Lung Association said.

H2CAR is unaffordable

Engineer-Poet analyzes the H2CAR proposal and shows why it is a misleading diversion from real solutions: H2CAR is completely unaffordable.

In the H2CAR paper, figures such as 239 billion kg/year of hydrogen from 58,000 km2 of solar PV panels are tossed off rather casually. These numbers bear deeper analysis than they receive. For instance, 58,000 km2 of panels could be made by assembling an array of about 46 billion BP SX 170B PV panels (at roughly 1.26 m² each). At a future cost of $2/Wpeak, this array would cost about $15.7 trillion; today's cost would be closer to $40 trillion. Clearly we're not going to do this.
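Here is a quick sanity check of those figures in Python. The panel area and rating follow the BP SX 170B cited above; the $5 per peak watt "today" price is my assumption, back-solved from the ~$40 trillion figure.

```python
# A quick sanity check of the panel count and cost figures above. The
# panel area and rating follow the BP SX 170B cited in the post; the
# $5/Wp "today" price is my assumption, implied by the ~$40 trillion figure.

panel_area_m2 = 1.26            # approximate area of one BP SX 170B panel
panel_watts = 170               # rated peak output per panel
total_area_m2 = 58_000 * 1e6    # 58,000 km^2 expressed in square meters

panels = total_area_m2 / panel_area_m2
peak_watts = panels * panel_watts

print(f"panels needed: {panels:.2e}")                            # ~4.6e10 (46 billion)
print(f"cost at $2/Wp: ${peak_watts * 2 / 1e12:.1f} trillion")   # ~$15.7 trillion
print(f"cost at $5/Wp: ${peak_watts * 5 / 1e12:.1f} trillion")   # ~$39 trillion
```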

Another example of the disconnect between the researchers and reality is their proposed quantity and method of hydrogen production. Their most optimistic (smallest) quantity of hydrogen required is 239 billion kg/year, which they propose to produce from renewable electricity via electrolysis. The quantity of electricity required (at 100% efficiency, no less) is a staggering 9,810 billion kWh/year; this is nearly 2.5 times current annual US electric production. (Worse than that, it's roughly 6-10 times what it would take to power all ground transport directly with electricity.) Even if produced from nuclear energy by a thermochemical process of 50% efficiency, this rate of hydrogen production would require nuclear plants equivalent to more than 8 times today's capacity.
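The electricity figure can be roughly reproduced assuming ~41 kWh of electricity per kilogram of hydrogen (approximately the higher heating value at 100% efficiency); both that factor and the US generation figure below are my assumptions, not numbers taken from the paper.

```python
# A rough reproduction of the electrolysis electricity figure above,
# assuming ~41 kWh of electricity per kg of hydrogen. The exact factor
# and the US generation figure are my assumptions.

h2_kg_per_year = 239e9
kwh_per_kg = 41.0              # assumed electricity required per kg of H2
us_generation_kwh = 4.0e12     # ~4,000 billion kWh/yr US generation (mid-2000s)

electricity = h2_kg_per_year * kwh_per_kg
print(f"electricity needed: {electricity / 1e9:,.0f} billion kWh/yr")        # ~9,800
print(f"multiple of US generation: {electricity / us_generation_kwh:.1f}x")  # ~2.4x
```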




Here is my original article on H2CAR.

130 attosecond light pulse blazes new paths for science, industry

Researchers in Italy have created an ultrashort light pulse—a single isolated burst of extreme-ultraviolet light that lasts for only 130 attoseconds (billionths of a billionth of a second). Their achievement currently represents the shortest artificial light pulse that has been reported in a refereed journal. Shining this ultrashort light pulse on atoms and molecules can reveal new details of their inner workings, providing benefits to fundamental science as well as potential industrial applications such as better controlling chemical reactions. This can lead to a better understanding of chemistry and molecules, which could help lead to molecular nanotechnology.

Follow up on rapid-fire Z-machine Fusion

Pessimistic timetables for commercializing the rapid-fire approach are 30 years or more. However, one of the skeptics is Ian Hutchinson, who is a strong supporter of the Tokamak-based ITER (International Thermonuclear Experimental Reactor) project.

Even if money were no object, it could take 30 years to build a system, says Keith Matzen, director of pulsed power projects at Sandia.

Others think the engineering challenges involved in harnessing rapid series of large explosions are likely to prove just too difficult. With the new device, says Ian Hutchinson, professor of nuclear science and engineering at MIT, the Sandia and Tomsk researchers have scaled a 500-foot hill. The work they've yet to do is the equivalent of a 25,000-foot mountain. Several other researchers concur, noting that the Sandia researchers must also demonstrate that the system can produce the levels of fusion that their models predict.

The Sandia device stores energy in a group of large capacitors and releases it very quickly, in just 100 nanoseconds. A new kind of physical arrangement of these capacitors prevents magnetic fields from forming and slowing electrical current, a major problem with previous devices. But while acknowledging that the technology is an important advance for delivering pulses of power, several experts say a power plant based on such technology faces significant hurdles, not the least of which is building the plant sturdy enough to withstand the strong explosions going off every 10 seconds.


One thing to note: building a structure sturdy enough to withstand the explosions is less of a problem when using the system for pulsed nuclear rockets.

Whether it is 10, 20, 30 or more years until fusion is scaled up, we should press on with scaling up nuclear fission (including thorium reactors) as well as solar and wind power.

New Scientist magazine online also has an article on this.

The LTDs were developed by researchers at the Institute of High Current Electronics in Tomsk, Russia, in collaboration with Sandia colleagues. Each "spark plug" is about the size of a shoebox and contains a switch coupled to several large capacitors. A circular ring of 20 such units, wired in parallel, can produce half a million amps and one hundred thousand volts. Linking several rings together increases the final voltage produced. Researchers estimate that about 60 rings should be enough to power a fusion reactor.
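For a sense of scale, here is some illustrative arithmetic on those rings. The per-ring current and voltage and the 60-ring estimate come from the article; the simple series-stacking model and the 100 ns pulse length (cited earlier in this post) are my simplifying assumptions.

```python
# Illustrative arithmetic for the LTD rings described above. Per-ring
# figures and the 60-ring estimate come from the article; the naive
# series-stacking model and 100 ns pulse length are my assumptions.

ring_current_amps = 500_000    # 20 "spark plug" units in parallel per ring
ring_voltage_volts = 100_000   # output voltage of one ring
rings = 60                     # estimated number of rings for a reactor
pulse_seconds = 100e-9         # 100 ns discharge

total_voltage = rings * ring_voltage_volts   # rings stacked in series
peak_power = ring_current_amps * total_voltage
pulse_energy = peak_power * pulse_seconds

print(f"total voltage: {total_voltage / 1e6:.0f} MV")           # 6 MV
print(f"peak power: {peak_power / 1e12:.0f} TW")                # 3 TW
print(f"energy per 100 ns pulse: {pulse_energy / 1e3:.0f} kJ")  # ~300 kJ
```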


Here is a link to my original article on the rapid-fire Z-pinch fusion system.

Samsung First to Mass Produce 16Gb NAND Flash Memory

Samsung Electronics announced today that it has become the first to begin mass producing 16 gigabit NAND flash, the highest capacity memory chip now available. The company said it will fabricate the devices in 51 nanometers, the finest process technology to be used in memory mass production to date. Samsung’s 51nm NAND flash chips can be produced 60 percent more efficiently than those produced with 60nm process technology. Samsung achieved this new migration milestone just eight months after announcing production of its 60nm 8Gb NAND flash last August.

The new 16Gb chip, which has a multi-level cell (MLC) structure, can facilitate capacity expansion by offering 16 gigabytes (GB) of memory in a single memory card. Furthermore, by applying the new process technology, Samsung has accelerated the chip’s read and write speeds by approximately 80 percent over current MLC data processing speeds.

Further reading:
Samsung has plans for terabit flash memory

Samsung continues to double the density of flash memory each year. This 12-month doubling rate is faster than Moore's Law.

Samsung has been doubling the density every 12 months since 2002.

If they keep doubling, then the memory cards would progress as follows (a quick projection script appears after the list):
2007 128 GB card
2008 256 GB card
2009 512 GB card
2010 1 TB card
2011 2 TB card
2012 4 TB card
2013 8 TB card
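Here is a minimal sketch of that projection, assuming the 12-month density doubling continues and a 128 GB card ships in 2007.

```python
# A minimal sketch of the capacity projection above, assuming the
# 12-month doubling continues from a 128 GB card in 2007.

capacity_gb = 128
for year in range(2007, 2014):
    label = f"{capacity_gb} GB" if capacity_gb < 1024 else f"{capacity_gb // 1024} TB"
    print(year, label)
    capacity_gb *= 2
```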

Cheap Brain-wave activity sensors

Inexpensive brain-wave activity sensors are being used for video game controllers.


NeuroSky worker Cynthia Lee wears one of the company's headsets at NeuroSky headquarters in San Jose, Calif., Tuesday, March 27, 2007. The startup aims to use brain-wave-reading technology to help game developers add more realistic elements to video games. (AP Photo/Paul Sakuma)

NeuroSky's prototype measures a person's baseline brain-wave activity, including signals that relate to concentration, relaxation and anxiety. The technology ranks performance in each category on a scale of 1 to 100, and the numbers change as a person thinks about relaxing images, focuses intently, or gets kicked, interrupted or otherwise distracted.


Researchers at NeuroSky and other startups are also building prototypes of toys that use electromyography (EMG), which records twitches and other muscular movements, and electrooculography (EOG), which measures eye movement via the standing electrical potential between the cornea and the retina.

While NeuroSky's headset has one electrode, Emotiv Systems Inc. has developed a gel-free headset with 18 sensors.


Besides monitoring basic changes in mood and focus, Emotiv's bulkier headset detects brain waves indicating smiles, blinks, laughter, even conscious thoughts and unconscious emotions. Players could kick or punch their video game opponent - without a joystick or mouse.

"It fulfills the fantasy of telekinesis," said Tan Le, co-founder and president of San Francisco-based Emotiv.


At Emotiv, they are creating "a robust system and methodology for detecting and classifying both human conscious thoughts and non-conscious emotions. This revolutionary patent pending neural processing technology makes it possible for computers to interact directly with the human brain. By the detection of thoughts and feelings, our technology now makes it possible for applications to be controlled and influenced by the user's mind."


CyberLearning is already selling the SmartBrain Technologies system for the original PlayStation, PS2 and original Xbox, and it will soon work with the PlayStation 3 and Xbox 360. The EEG- and EMG-based biofeedback system costs about $600, not including the game console or video games.

The Global Boom in coal

The world is building a lot of coal-fired power to go with all of the coal-fired power it already has. This is horrible, because air pollution from coal kills more than one million people every year. We need to use all other means to reduce the increase in coal usage: more nuclear, more renewables, and more conservation.

Nations will add enough coal-fired capacity in the next five years to create an extra 1.2 billion tons of CO2 per year. China accounted for two-thirds of the more than 560 coal-fired power units built in 26 nations between 2002 and 2006. That works out to more than two coal-fired power units coming online each week over that period. The Chinese plants boosted annual world CO2 emissions by 740 million tons.
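A quick check of that build rate, assuming five years of 52 weeks:

```python
# Quick check of the build rate above: more than 560 units over the
# five years from 2002 through 2006.

units = 560
weeks = 5 * 52
print(f"{units / weeks:.1f} units per week")  # ~2.2, i.e. more than two a week
```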

Germany, one of the renewable energy stars, subsidizing renewable energy at about 45 euro cents per kWh, is building 20 coal plants.

The United States is accelerating its buildup dramatically. In the past five years it built 2.7 gigawatts of new coal-fired generating capacity. But in the next five years, it is slated to add 37.7 gigawatts of capacity, enough to produce 247.8 million tons of CO2 per year, according to Platts. That would vault the US to second place, just ahead of India, in adding new capacity.
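As a sanity check, those Platts numbers imply an emissions intensity close to the typical one ton of CO2 per MWh for coal generation, if one assumes a ~75% capacity factor (my assumption):

```python
# Sanity check of the implied emissions intensity of the planned US coal
# capacity above. The ~75% capacity factor is my assumption; the result
# lands near the typical ~1 ton of CO2 per MWh for coal generation.

capacity_gw = 37.7
co2_tons_per_year = 247.8e6
capacity_factor = 0.75  # assumed

generation_mwh = capacity_gw * 1e3 * 8760 * capacity_factor
print(f"implied intensity: {co2_tons_per_year / generation_mwh:.2f} t CO2/MWh")  # ~1.00
```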

Even nations that have pledged to reduce global warming under the Kyoto treaty are slated to accelerate their buildup of coal-fired plants. For example, eight EU nations – Germany, Italy, Poland, Spain, Bulgaria, Hungary, Slovakia, and the Czech Republic – plan to add nearly 13 gigawatts of new coal-fired capacity by 2012. That's up from about 2.5 gigawatts over the past five years.

The world faces the prospect five years from now of having 7,474 coal-fired power plants in 79 countries pumping out 9 billion tons of CO2 emissions annually – out of 31 billion tons from all sources in 2012.

The US is planning to build more than 150 coal-fired power plants that don't sequester their emissions, according to the US Department of Energy.



[Chart source: Platts / Rich Clabaugh, staff]

The most recent issue of BusinessWeek discusses the US coal industry. Gregory H. Boyce, chief executive of Peabody Energy Corp. (BTU), the world's biggest coal company, is gambling that the threat of higher electric bills and brownouts will be enough to halt crippling federal regulation.

In a 2006 study, Sanford C. Bernstein & Co. (AB) analyst Hugh Wynne calculated that even with a relatively steep price for carbon emissions (say, $27 per ton, more than the price in Europe), coal-fired generation still beats gas by 30%. That suggests operators of new coal plants could buy all the carbon credits they need without resorting to costly CO2 capture technologies.