
December 12, 2008

Dense Plasma Fusion: Lawrenceville Plasma Physics Gets Funding



On Nov. 14, 2008, Lawrenceville Plasma Physics (LPP) initiated a two-year experimental project to test the scientific feasibility of Focus Fusion. The funding is sufficient to pay for experiments that could prove the viability of this approach.

The goals of the experiment are:
1. To confirm the achievement of the high X-ray energies first observed in previous experiments at Texas A&M;
2. To greatly increase the efficiency of energy transfer into the plasmoid;
3. To achieve the high magnetic fields needed for the quantum magnetic field effect;
4. To use pB11 fuel to demonstrate greater fusion energy production than energy fed into the plasma (positive net energy production).

After a 7-year hiatus in our experimental work, we will begin producing critical and exciting data in 2009.

The initiation of the project was made possible by LPP's receiving $620,000 in new investments. Additional investments are forthcoming which will assure continued operations.





Focus fusion in Discover Magazine June 2008 (item #2).

It may sound too good to be true, but the technology, called focus fusion, is based on real physics experiments. Focus fusion is initiated when a pulse of electricity is discharged through a hydrogen-boron gas across two nesting cylindrical electrodes, transforming the gas into a thin sheath of hot, electrically conducting plasma. This sheath travels to the end of the inner electrode, where the magnetic fields produced by the currents pinch and twist the plasma into a tiny, dense ball. As the magnetic fields start to decay, they cause a beam of electrons to flow in one direction and a beam of positive ions (atoms that have lost electrons) to flow in the opposite direction. The electron beam heats the plasma ball, igniting fusion reactions between the hydrogen and boron; these reactions pump more heat and charged particles into the plasma. The energy in the ion beam can be directly converted to electricity—no need for conventional turbines and generators. Part of this electricity powers the next pulse, and the rest is net output.

A focus fusion reactor could be built for just $300,000, says Lerner, president of Lawrenceville Plasma Physics in New Jersey. But huge technical hurdles remain. These include increasing the density of the plasma so the fusion reaction will be more intense. (Conventional fusion experiments do not come close to the temperatures and densities needed for efficient hydrogen-boron fusion.) Still, the payoff could be huge: While mainstream fusion research programs are still decades from fruition, Lerner claims he requires just $750,000 in funding and two years of work to prove his process generates more energy than it consumes. “The next experiment is aimed at achieving higher density, higher magnetic field, and higher efficiency,” he says. “We believe it will succeed.”


[emails from a reader who has been following Focus fusion closely]
The power would be at about 0.2¢/kWh, not 1/20¢ (0.05¢). The generators would be from 5-20 MW, depending on pulse rate (330-1320 per second). The energy "profit" is actually from harvesting as current (via thousands of foil layers in the containment shell) the ~40% of output which occurs as X-rays. The alpha-beam pulse goes back into the capacitor bank to fire the next "shot", and the electron beam reheats the plasma.
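A quick sanity check of the reader's figures (a sketch in Python; only the 5-20 MW power range and 330-1320 pulses/sec rates come from the email, the division is mine):

```python
# Sanity check: net energy per pulse implied by the reader's figures.
# 5 MW at 330 pulses/sec and 20 MW at 1320 pulses/sec should imply a
# roughly constant energy "profit" per shot if scaling is linear.

cases = [(5e6, 330), (20e6, 1320)]  # (net power in W, pulses per second)

for power_w, pulse_rate in cases:
    joules_per_pulse = power_w / pulse_rate
    print(f"{power_w/1e6:.0f} MW at {pulse_rate}/sec -> "
          f"{joules_per_pulse/1e3:.1f} kJ net per pulse")
# Both cases work out to ~15 kJ of net output per shot, so the quoted
# power range is consistent with simply varying the repetition rate.
```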

[The latest] LPP engineering analysis indicates that 5 MW focus fusion reactors could be produced for about $300,000 apiece. This is less than one-tenth of the cost of conventional electricity generation units of any style or fuel design. This means that once the prototype is successfully developed within five years, focus fusion generators would be the preferred technology for new distributed electrical generation.
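The cost claim is straightforward arithmetic (a sketch; the roughly $1,000/kW figure for conventional generation is my own ballpark assumption, not from the article):

```python
# Implied capital cost per kW for a 5 MW, $300,000 focus fusion unit,
# versus a rough ~$1,000/kW assumption for conventional generation.
unit_cost_usd = 300_000
unit_power_kw = 5_000  # 5 MW

cost_per_kw = unit_cost_usd / unit_power_kw
print(f"Focus fusion: ${cost_per_kw:.0f}/kW")          # $60/kW
print(f"Ratio vs ~$1,000/kW conventional: {cost_per_kw/1000:.2f}")
# $60/kW is well under one-tenth of conventional plant costs, which
# matches the "less than one-tenth" claim in the engineering analysis.
```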

More powerful units can be designed by increasing the pulse repetition rate, although there are limits on the amount of waste heat that can be removed from such a small device. It is likely that units larger than 20 MW would be formed by simply stacking smaller units together, with approximately the same cost per kW of generated power.


Current technical information

Eric Lerner, Lawrenceville Plasma Physics, Google Talk 64 minutes


FURTHER INFORMATION
Previous article on focus fusion funding which has now been corrected

Another follow up on Focus fusion

The Focus Fusion google tech talk

Energy Technology News Roundup; New US nuclear reactor on track for 2012, ORNL zero energy house

1. The process for getting Unit 2 of the Watts Bar nuclear plant up and running by 2012 is on track.

TVA is still holding to its August 2007 estimate of $2.49 billion for total capital costs to finish the project. Construction, including work on the high- and low-pressure turbines, already is under way and all major engineering should be finished in the next six to eight months, Bajestani said. The TVA time line calls for Unit 2 to be functional by August 2011, ready for its nuclear fuel load by April 2012, and in full operation by October 2012.






Pending nuclear plant license applications in the pipeline for the USA:




2. Zero energy housing project update

The savings are made possible by a combination of affordable technologies. Rooftop solar panels generate electricity that can be transferred back to TVA's electric grid, occasionally making the electric meter actually spin backwards. Other special energy features include a solar water heater, a foundation geothermal heat pump installed in the excavated space as the house is being built, highly efficient appliances with Energy Star ratings, compact fluorescent lights, windows facing south toward the sun and a variety of insulation technologies inside and outside walls to keep warm air in during winter and hot air out in summer.

Despite the simplicity of operation, the houses are different from conventional houses in several ways. House 5 has a utility wall that takes advantage of appliances that release heat—such as a refrigerator and freezer—by locating them next to those that use heat to raise the temperature of air or water, such as a dryer and dishwasher. House 5 has a well-insulated basement with concrete blocks that provide thermal mass to enhance occupant comfort because the heat-storing blocks are insulated on the outside by a fiberglass drainage board and exterior finish system. Above-grade walls are 6-in.-thick structural insulated panels, which are slightly thicker than the typical 2 by 4 in. wall system of a conventional house.

Thanks to an increase in funding, Christian is optimistic that five new prototype houses will be built near ORNL by the end of the year.


3. Oak Ridge National Laboratory (ORNL) has released Combined Heat and Power: Effective Energy Solutions for a Sustainable Future, a new report highlighting Combined Heat and Power (CHP) as a realistic solution to enhance national energy efficiency, ensure environmental quality, promote economic growth, and foster a robust energy infrastructure.

The report asks "What if 20% of generating capacity came from CHP?" (CHP is currently 9% of US capacity; using CHP today, the United States already avoids more than 1.9 quadrillion Btu of fuel consumption and annual CO2 emissions equivalent to removing more than 45 million cars from the road.) If the United States attained this goal by 2030, benefits would include (a quick scaling check follows the list):
-A 60% reduction of the projected increase in carbon dioxide (CO2) emissions by 2030—the equivalent of removing 154 million cars from the road
-Fuel savings of 5.3 quadrillion British thermal units (Btu) annually—the equivalent of nearly half the total energy currently consumed by US households
-Economically viable application throughout the nation in large and small industrial facilities, commercial buildings, multi-family and single-family housing, institutional facilities, and campuses
-The creation of 1 million new highly-skilled, competitive "green-collar" jobs through 2030 and $234 billion in new investments throughout the United States.
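A naive proportionality check of these numbers (a sketch; it assumes fuel savings scale linearly with CHP share and ignores growth in total US capacity by 2030):

```python
# If 9% CHP capacity avoids ~1.9 quads of fuel today, what would a
# naive linear scaling to a 20% share suggest?
current_share, current_savings_quads = 0.09, 1.9
target_share = 0.20

naive_savings = current_savings_quads * target_share / current_share
print(f"Naive 20% estimate: {naive_savings:.1f} quads/yr")  # ~4.2 quads
# The report projects 5.3 quads/yr, somewhat above this linear estimate,
# presumably because total US generating capacity also grows by 2030.
```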

December 11, 2008

Wearable Superworkstation with Head-Mounted Display for 2009



MNB Technologies has an FPGA mini-board that runs Linux called turboRTAG. This would fit the desired form factor but would need to be upgraded to a more powerful configuration.

MNB Technologies Inc. is in the process of developing a wearable superworkstation with a capacity of 12 GFLOPS (roughly six dual-core processors) in a gadget the size of a notebook. The fully functional prototype is scheduled for delivery in March 2009 and production quantities will be available in June 2009. Pricing will be announced in February 2009.

The company announced that the Department of Defense has awarded them an $85,000 grant to further its development of the wearable, paperback-book-sized supercomputer. The wireless computer is strapped to a soldier's belt and is viewed through a high-definition, head-mounted monitor that looks like a tennis visor.

“We have a head-mounted display, HMD, when wearing it gives you the same feeling as if you are sitting six feet away from a 54-inch monitor,” said Nick Granny of MNB Technologies.

The supercomputer device is not restricted to use in military operations. The device has the potential to come in handy in industries such as agriculture, air traffic control and disaster management. It can be used wherever portable high-performance computing has the potential to increase quality and efficiency.


The new computers also have WiFi (802.11g) and hardwired Gigabit Ethernet, 1 GB of main memory, and up to 140 GB of disk capacity supporting their 1.6 GHz x86 processors. An embedded FPGA-based accelerator pushes system performance into the 12 GFLOPS range, giving each of the wearable computers the raw performance of roughly six dual-core personal computers.



MNB Technologies has a lot of experience with FPGA systems.

FUTURE POSSIBILITIES
DARPA is also funding efforts to harvest energy from body heat and motion. Efficient harvesting devices could keep always-on wearables running at 9-20 W, and possibly more.



HP memristors can greatly reduce the power requirements and increase the component density and performance of FPGAs and memory, which could greatly enhance the capabilities of any FPGA-based device.

Near Term Limits of Mobile Computing Power
Extreme computing analysis of zettaflop supercomputers and corresponding limits for 5 Watt mobile computers



Wearable Displays
Contact lens and existing heads up displays


FURTHER READING
MNB works with pico Computing, which packs FPGAs into small form factors

Accord Solutions, another partner, makes Reconfigurable Architecture for Software Protection (RASP)

-RASP contains integrated authentication, validation and key management technology. This is tightly controlled by a local knowledgeable administrator.
-RASP can be delivered in tamper-proof packaging, and is readily integrated into ‘System on a Chip.’
-RASP includes tamper-detection technology which instantly scrubs keys and memory.

SRC Computers makes systems that reduce the number of processors, physical size and power consumption by orders of magnitude compared to microprocessor-based systems.

The SRC MAPstation™ puts the performance of traditional large multi-processor server racks into a single desktop unit.



Blacklight Power Signs First Commercial Deal with Estacado Energy

BlackLight Power (BLP) Inc. today announced its first commercial license agreement with Estacado Energy Services, Inc. in New Mexico, a wholly-owned subsidiary of Roosevelt County Electric Cooperative (Estacado). In a non-exclusive agreement, BLP has licensed Estacado to use the BlackLight Process and certain BLP energy technology for the production of thermal or electric power. Estacado may produce gross thermal power up to a maximum continuous capacity of 250 MW or convert this thermal power to corresponding electricity.

Background
- Blacklight Power has provided information and assistance to a blogger/chemistry professor looking to validate their process

- Venture Beat investigates Blacklight Power

- Rowan University study provides external confirmation of a substantial amount of extra heat from Blacklight Power materials.

- Blacklight Power Claims

The latest expected unit costs for the Blacklight power system compared to current energy technology:



The Blacklight hydrogen production plant diagram

Potential Applications for Blacklight Power Technology
- H2(1/p) enables lasers at wavelengths from visible to soft X-ray
- VUV photolithography (enables next-generation chips)
- Blue Lasers
- Line-of-sight telecom and medical devices
- High voltage metal hydride batteries
- Synthetic thin-film and single crystal diamonds
- Metal hydrides as anticorrosive coatings







Estacado is a wholly-owned subsidiary of Roosevelt County Electric Cooperative (RCEC) in New Mexico. With over 2,757 miles of energized lines in east central New Mexico, RCEC serves Dora, Elida, Floyd, Arch, Rogers, Milnesand, Causey and Portales.


FURTHER READING
Details of Blacklight Power's patent dispute in the UK.

In upholding both of the examiner's objections, the Hearing Officer identified the question which he had to address to be whether the underlying theory of GUTCQM was true. In doing so, he identified three criteria which he had to consider in determining whether a scientific theory was true, namely whether:

-the explanation of the theory is consistent with existing generally accepted theories; if it is not, it should provide a better explanation of physical phenomena than current theories and should be consistent with any accepted theories that it does not displace;
-the theory makes testable predictions, and the experimental evidence shows rival theories to be false and matches the predictions of the new theory; and
-the theory is accepted as a valid explanation of physical phenomena by the community of scientists who work in the relevant discipline.

Critically, the hearing officer went on to determine that he must satisfy himself that it was more probable than not that the theory was true. On this basis, the Hearing Officer found that he was not satisfied that the theory was true and therefore the claims in the applications which relied upon the theory were not patentable.

The appeal focused on whether the Hearing Officer had been right in considering the appropriate test to be whether the theory was true on the balance of probabilities. Blacklight contended that the test that should be applied is whether the theory is clearly contrary to well established physical laws. In considering this, the examiner should assess whether the applicant has a reasonable prospect of showing that his theory is a valid one should the patent be litigated in court. In making these arguments, Blacklight accepted that on the material before the Hearing Officer the theory was probably incorrect.


The Examiner has an article on Blacklight Power

Energy and transportation developments to watch in 2009 and a little beyond

The list of technological and other developments to watch is expanding to four parts:
1. Computers, robots, electronics and communication
2. Energy and transportation - this section
3. DNA/biotech/synthetic biology, nanotechnology
4. Medicine, life extension, space, manufacturing and anything else that was not covered

1. Candidates for nuclear fusion breakthroughs
Inertial Electrostatic Confinement (IEC) Fusion (EMC2 Fusion, Inc./Bussard): still awaiting results from the 2008 experiments and funding of the next experiments.

Update: Focus Fusion is funded for $620,000+ and is starting two years of experiments to prove viability, so significant news of some kind is likely over the next two years. Dense Plasma Focus fusion

Not expecting to hear anything major for three or more years on the next few:
Tri-alpha energy / colliding beam fusion
General Fusion

2. The National Ignition Facility should start up tests
This could lead to the funding and development of a fusion/fission hybrid proposal of Lawrence Livermore labs

3. More environmentally friendly oil recovery processes (THAI/CAPRI), an economically superior method for recovering 2-4 times more oil from existing oilsands (up to 80% of the oil in place), with integrated underground upgrading of the oil

Petrobank's most significant project is at May River, located in the heart of its Whitesands oil sands leases. May River will be built in phases, eventually culminating in 100,000 barrels per day (bbl/d) of partially upgraded bitumen. The first phase is planned with a production capacity of 10,000 to 15,000 bbl/d. Applications were approved Nov 27, 2008. Construction could begin in early 2009 with project start-up in late 2009.


The THAI oil process could replace the current SAGD (steam-assisted gravity drainage) and CSS (cyclic steam stimulation) systems used to extract bitumen that is too deep to mine. The process offers high recovery rates -- up to 80 per cent of the oil in place, compared with 20 to 50 per cent for SAGD. It also uses little natural gas and water. In THAI, air is pumped under pressure into the toe of the reservoir, creating natural combustion that heats the cold heavy oil, which then flows into horizontal pipes.
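The "2-4 times more oil" figure quoted above follows from these recovery rates (a sketch):

```python
# Recovery-rate ratio: THAI (up to 80% of oil in place) vs
# SAGD (20-50% of oil in place).
thai_recovery = 0.80
sagd_low, sagd_high = 0.20, 0.50

print(f"vs worst-case SAGD: {thai_recovery / sagd_low:.1f}x")   # 4.0x
print(f"vs best-case SAGD:  {thai_recovery / sagd_high:.1f}x")  # 1.6x
# The quoted "2-4x more oil" range corresponds to SAGD recoveries of
# roughly 20-40%, i.e. the lower end of the SAGD recovery range.
```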







4. Major energy legislation
Obama energy plan. The US will pass major energy and climate change bill(s).

Thorium bill could be one of the bills passed.

Financial crisis - China, India, USA, Russia, etc. Which energy projects continue?

5. China's High Temperature Reactor should start construction in 2009, and China's construction of regular reactors should continue to scale up
Russia's breeder reactor construction continues toward a 2012 completion.

6. Blacklight Power is scheduled to begin commercial sales of their revolutionary power generators

Blacklight Power success would mean clean power using catalyzed hydrogen from water for about $500/kW. It would be a huge game changer that could also impact the chemistry and physics textbooks.

7. Cars will get more efficient and more electric

-EcoMotors (up to 60% efficient diesel engine): demo engine in 2009 and in commercial vehicles in 2011

Highway-capable electric cars, such as the relatively inexpensive Indica electric car from Tata

Many analysts believe China could adopt electric cars faster than elsewhere, largely because of its size and comparative lack of reliance on petroleum for transport.


China's BYD has launched the first production plug-in hybrid electric car
China's BYD is scheduled to sell the plug-in hybrid in Israel in 2009 and Europe in 2010
An electric car overview is here

8. Does the EEStor ultracapacitor get launched, and does Zenn launch a car using it in 2009, or not?

There are many others working on revolutionary ultracapacitors

9. The electric wheel (Michelin's Active Wheel) has been targeted to be in thousands of cars in 2010 and in the Venturi Volage car by 2012.

Three versions will be available in 2010 with three different sizes of Lithium-Ion battery module configurations, offering ranges of 150, 300 and 400 km (93, 186 and 248 miles). Drivers will have the option of changing from one module size to another in the same vehicle depending on their needs. Just like hybrids, the Active Wheels recover energy during braking to extend vehicle range. The in-wheel motors are reported to be 90% efficient, compared to about 15% efficiency for a conventional vehicle in city driving.

Test versions of the WILL are on the road now, with production scheduled to start in 2010 and a first-year output target of several thousand vehicles. The target price of 20,000 to 25,000 euros (US$27,000-34,000) puts the WILL in the affordable electric vehicle class, along with the much anticipated Chevy Volt. If you are willing to wait a bit longer, and spend a bit more, look for Active Wheels on the Venturi Volage in 2012.


10. Thermoelectrics and low temperature waste heat recovery could have major announcements and impacts to go along with expected continued progress.

11. Next-gen biofuels (seaweed, algae, E. coli, jatropha) and which countries carry them forward: China, Japan, USA, Brazil, India, others?

12. Superconducting motors and superconducting wind turbines are slated for 2010 and increasing use of superconductors for parts of the grid.

Korea Electrotechnology Research Institute and Doosan Heavy Industries & Construction Co. are developing a 1 MW HTS motor for industrial application.

General Electric has successfully developed and tested a high speed, multimegawatt superconducting generator. The generator was built to demonstrate high temperature superconducting (HTS) generator technology for application in a high power density Multimegawatt Electric Power System (MEPS) for the Air Force. The demonstration tested the generator under load conditions up to 1.3 MW at over 10,000 rpm. The new MEPS generator achieved 97% efficiency including cryocooler losses.

The Multimegawatt Electric Power System (MEPS) would provide a substantial boost in power capability aboard aircraft while cutting generator weight by 1,000 lbs and reducing thermal load. It is based on cryogenic cooling and high-RPM generator technology, and has applications for directed energy weapons and naval vessel distributed power. It offers 4-8x the kW/lb of existing or developmental aircraft power systems. The weight savings and reduced thermal load of onboard power generation will allow for more energy efficient flight and enable more accurate, more powerful weaponry.

The US military is trying to end its own dependence on oil and wants to make planes and other vehicles more efficient or have them use other power sources entirely. The US military is also very concerned about upgrading the US power grid.

December 10, 2008

Steven Chu named Energy Secretary


Steven Chu, Nobel Prize-winning physicist, has been named Energy Secretary. Steven Chu is pro-nuclear and has a deep understanding of all the technical issues around energy. This is a great choice, and a definite example of real change from the previous Bush administration in selecting extreme competence. It is not in any way a guarantee of correct energy choices, because there is still political reality and the work of actually enacting legislation and policy based upon the facts that are known.

Steven Chu in Wikipedia

Steven Chu (Chinese: 朱棣文; pinyin: Zhū Dìwén) (b. February 28, 1948, St. Louis, Missouri) is an American experimental physicist and, according to MSNBC and other media outlets, President-elect Barack Obama's choice for Secretary of Energy. He is known for his research in laser cooling and trapping of atoms, which won him the Nobel Prize in Physics in 1997. His current research is concerned primarily with the study of biological systems at the single molecule level. He is currently Professor of Physics and Molecular and Cellular Biology at the University of California, Berkeley, and the director of the Lawrence Berkeley National Laboratory. As global warming warnings grow more dire, Chu is pushing his scientists at Lawrence Berkeley National Laboratory and industry to develop technologies to reduce the impact of climate change by reducing greenhouse gas emissions. Chief in Chu's campaign is an unprecedented research pact between UC Berkeley, oil industry giant BP, the Lawrence Berkeley Lab and the University of Illinois.

Nearly US$400 million in new lab space will expand energy-related molecular work centered at Lawrence Berkeley that involves partners around the world; a US$160 million Energy Biosciences Institute (scheduled to open in 2010) and funded by BP will include Chu's separate solar-energy program.


Steven Chu's Publicly Stated Energy Opinions
Steven Chu has made public pro-nuclear and anti-coal speeches.

"Nuclear has to be a necessary part of the portfolio," Chu, the director of the Lawrence Berkeley National Lab, said during the annual economic summit organized by Stanford University.

"The fear of radiation shouldn't even enter into this, he said. "Coal is very, very bad."


Steve Chu's talk on climate change in 2007 is here

Steven Chu signed (Aug 2008) a nuclear energy position document along with the other directors of the national labs. The position was: a coherent long-term nuclear power strategy is needed, and nuclear power is a major and essential part of solving our energy problems.

-maximize current reactors (plant life extensions, uprates)
-deploy advanced light water reactors
-license Yucca Mountain and research advanced fuel management
-aggressive R&D on advanced reactors

From another talk [reference to some kind of nuclear deep burn]:

Suppose instead that we can reduce the lifetime of the radioactive waste by a factor of 1,000. So it goes from a couple-hundred-thousand-year problem to a thousand-year problem. At a thousand years, even though that's still a long time, it's in the realm that we can monitor - we don't need Yucca Mountain.

And all of a sudden the risk-benefit equation looks pretty good for nuclear.


3 MB PowerPoint presentation from 2005 by Steven Chu

Slides 15 and 16 show peak oil: "world production predicted to peak in 10-40 years from 2004."

"energy conservation can lengthen time by a factor of about 2 but the fundamental problem remains"

In 2007 and 2008: the Helios project, aimed at replacing oil with solar energy and advanced biofuels.





Carol Browner will be energy "czar", reporting to the president.

Chu was also involved with biofuels: Researchers such as Caltech's Simon have been analyzing microbes extracted from the termite's digestive system, looking for the enzymes that enable the bugs to turn wood cellulose into sugars.

FURTHER READING
Helios, a project at Berkeley national labs targeting 10% light-to-fuel efficiency by about 2017

NY Times profile

Commenting on Mark Jacobson's Energy Source Rankings

Mark Jacobson, a professor of civil and environmental engineering at Stanford, has written a review of solutions to global warming, air pollution, and energy security which examines all energy sources. The Jacobson analysis is very biased in favor of wind and against nuclear and biofuels. Nuclear power is burdened with CO2 from "burning cities in the event of a nuclear exchange". Nuclear weapons exist now, and building more nuclear power will not increase those risks. Nuclear weapons materials were made in special reactors (not power-generating reactors) or in special enrichment facilities. There can be burning cities from conventional weapons - see the World War II bombings of Tokyo, Dresden, etc.

I would also propose that, if this analysis is valid, we add deaths and CO2 (both from explosions and from the resulting fires) from bomber aircraft to an analysis of passenger jets, add deaths from bullets to the lead industry, and add deaths from chemical explosives to chemical fuel cells. An analysis of how the Jacobson paper is dishonest and biased.

The US would need $2+ trillion over decades to upgrade the energy grid.

A presentation discussing the costs and reserve requirements for higher percentage wind

EWEA (the European Wind Energy Association, obviously pro-wind) presented a plan to get to 30% wind power by building a Europe/Africa-wide grid to connect wind farms. They think it will take decades to scale to that level.

Jacobson claims that an all battery-powered U.S. vehicle fleet could be charged by 73,000 to 144,000 5-megawatt wind turbines.


This is 365 to 720 nameplate gigawatts with a 20-40% capacity factor. That is about 500-800 TWh/year if it scaled like the current 94 GW of world wind power.



So supplying power to an all battery-powered US vehicle fleet could be done by 20-30% of existing world nuclear power instead of 550-850% of current wind power.
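A sketch of the arithmetic behind the last few paragraphs (the turbine counts, 5 MW size, capacity factors, and 94 GW of world wind come from the post; the ~2,600 TWh/yr of world nuclear generation is my own rough input):

```python
# Checking the wind-fleet arithmetic above.
HOURS = 8760

n_lo, n_hi = 73_000, 144_000            # Jacobson's turbine counts
gw_lo, gw_hi = n_lo * 5 / 1000, n_hi * 5 / 1000
print(f"Nameplate: {gw_lo:.0f}-{gw_hi:.0f} GW "
      f"({100*gw_lo/94:.0f}-{100*gw_hi/94:.0f}% of 2008 world wind)")

# Annual output across the stated 20-40% capacity-factor range:
twh_lo = gw_lo * 0.20 * HOURS / 1000    # fewest turbines, worst CF
twh_hi = gw_hi * 0.40 * HOURS / 1000    # most turbines, best CF
print(f"Output: {twh_lo:.0f}-{twh_hi:.0f} TWh/yr")

# The post's 500-800 TWh/yr figure as a share of ~2,600 TWh/yr of
# world nuclear generation (my rough input):
for twh in (500, 800):
    print(f"{twh} TWh/yr is {100*twh/2600:.0f}% of world nuclear output")
# 500-800 TWh/yr works out to roughly 19-31% of world nuclear output,
# consistent with the 20-30% claim above.
```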

Nuclear power plant emissions include those due to uranium mining, enrichment, and transport and waste disposal as well as those due to construction, operation, and decommissioning of the reactors. We estimate the lifecycle emissions of new nuclear power plants as 9-70 g CO2e/kWh, with the lower number from an industry estimate and the upper number slightly above the average of 66 g CO2e/kWh [50] from a review of 103 new and old lifecycle studies of nuclear energy. Three additional studies estimate mean lifecycle emissions of nuclear reactors as 59, 16-55, and 40 g CO2e/kWh, respectively; thus, the range appears within reason.


Wind has the lowest lifecycle CO2e among the technologies considered. For the analysis, we assume that the mean annual wind speed at hub height of future turbines ranges from 7-8.5 m/s. Wind speeds of 7 m/s or higher are needed for the direct cost of wind to be competitive over land with that of other new electric power sources [33]. About 13% of land outside of Antarctica has such wind speeds at 80 m (Table 2), and the average wind speed over land at 80 m worldwide in locations where the mean wind speed is 7 m/s or higher is 8.4 m/s [23]. The capacity factor of a 5 MW turbine with a 126 m diameter rotor in 7-8.5 m/s wind speeds is 0.294-0.425 (ESI), which encompasses the measured capacity factors, 0.33-0.35, of all wind farms installed in the US between 2004-2007 [26]. As such, this wind speed range is the relevant range for considering the large-scale deployment of wind. The energy required to manufacture, install, operate, and scrap a 600 kW wind turbine has been calculated to be 4.3 × 10^6 kWh per installed MW [37]. For a 5 MW turbine operating over a lifetime of 30 yr under the wind-speed conditions given, and assuming carbon emissions based on that of the average US electrical grid, the resulting emissions from the turbine are 2.8-7.4 g CO2e/kWh and the energy payback time is 1.6 months (at 8.5 m/s) to 4.3 months (at 7 m/s). Even under a 20 yr lifetime, the emissions are 4.2-11.1 g CO2e/kWh, lower than those of all other energy sources considered here. Given that many turbines from the 1970s still operate today, a 30 yr lifetime is more realistic.


So the nuclear industry CO2 figure is in the middle of the lifetime wind CO2 emissions figure.

New laser enrichment methods would reduce the energy used by 3-20 times, which could reduce the energy input and thus the CO2 emitted by nuclear power.

Jacobson adds a penalty for delays and for war/terrorism. This is total BS, as people can and do make nuclear weapons without leveraging the enrichment facilities or reactors used for nuclear power. This part of his analysis is junk.





For nuclear energy, we add, in the high case, the potential death rate due to a nuclear exchange, as described in Section 4d, which could kill up to 16.7 million people. Dividing this number by 30 yr and multiplying by the ratio of the US to world population today (302 million : 6.602 billion) gives an upper limit to deaths scaled to US population of 25,500 per year attributable to nuclear energy. We do not add deaths to the low estimate, since we assume the low probability of a nuclear exchange is zero.
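The quoted arithmetic is easy to reproduce (a sketch; all inputs are Jacobson's own figures):

```python
# Reproducing Jacobson's upper-bound arithmetic as quoted above.
deaths = 16.7e6        # hypothesized deaths from a nuclear exchange
years = 30             # amortization period used in the paper
us_pop, world_pop = 302e6, 6.602e9

deaths_per_year_us = deaths / years * (us_pop / world_pop)
print(f"{deaths_per_year_us:,.0f} deaths/yr")  # ~25,500, as in the paper
# The arithmetic checks out; the dispute is over the premise that a
# nuclear weapons exchange should be attributed to nuclear power at all.
```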


As noted, nuclear power generation is not connected with nuclear weapons. Countries have almost always gotten nuclear weapons first and then gotten nuclear power. Plus, the US has thousands of nuclear weapons now; whether nuclear power is increased ten times or one thousand times is independent of those nuclear weapons.

Proposed: Interconnecting geographically-dispersed intermittent energy sources
Interconnecting geographically-disperse wind, solar, tidal, or wave farms to a common transmission grid smoothes out electricity supply significantly, as demonstrated for wind in early work [105]. For wind, interconnection over regions as small as a few hundred kilometers apart can eliminate hours of zero power, accumulated over all wind farms, and can convert a Rayleigh wind speed frequency distribution into a narrower Gaussian distribution [106]. When 13-19 geographically-disperse wind sites in the Midwest, over a region 850 km × 850 km, were hypothetically interconnected, an average of 33% and a maximum of 47% of yearly-averaged wind power was calculated to be usable as baseload electric power at the same reliability as a coal-fired power plant [107]. That study also found that interconnecting 19 wind farms through the transmission grid allowed the long-distance portion of capacity to be reduced, for example, by 20% with only a 1.6% loss in energy. With one wind farm, on the other hand, a 20% reduction in long-distance transmission caused a 9.8% loss in electric power. The benefit of interconnecting wind farms can be seen further from real-time minute-by-minute combined output from 81% of Spain's wind farms. Such figures show that interconnection nearly eliminates intermittency on time scales of hours and less, smoothing out the electricity supply. In sum, to improve the efficiency of intermittent electric power sources, an organized and interconnected transmission system is needed. Ideally, fast wind sites would be identified in advance and the farms would be developed simultaneously with an updated interconnected transmission system. The same concept applies to other intermittent electric power sources, such as solar PV and CSP. Because improving the grid requires time and expense, planning for it should be done carefully.
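A toy Monte Carlo sketch of the smoothing effect described above. It assumes idealized, fully independent farm outputs (real farms are partially correlated, so actual smoothing is weaker), but it shows why aggregate output is steadier than any single farm:

```python
# Summing output from N independent wind farms narrows the relative
# spread of total output roughly as 1/sqrt(N).
import random
import statistics

def sim(n_farms, n_hours=10_000):
    totals = []
    for _ in range(n_hours):
        # Each farm's hourly output drawn independently in [0, 1]
        # (fraction of nameplate); real outputs follow wind statistics.
        totals.append(sum(random.random() for _ in range(n_farms)))
    return statistics.pstdev(totals) / statistics.mean(totals)

random.seed(42)
for n in (1, 4, 19):
    print(f"{n:>2} farms: relative std of total output = {sim(n):.2f}")
# 19 farms (the number interconnected in the Midwest study) show far
# less relative variability than a single farm.
```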


Rebuilding and upgrading the grid to allow for this is a long-term and expensive process. The Jacobson report does not count this against wind or solar. The various approaches to addressing intermittency would not work that well unless enough wind and renewables were already deployed to make them work; i.e., spend a few trillion to build it and it will work.

The land area calculations are also biased.

December 09, 2008

Proposed Laser Ignition Fusion/Fission Hybrid for Commercial Power by 2030













LIFE, an acronym for Laser Inertial Fusion-Fission Energy, is an advanced energy concept under development at Lawrence Livermore National Laboratory (LLNL).
The conceptual design for a LIFE engine and power plant is based on National Ignition Facility (NIF)-like fusion targets and a NIF-like laser operating at an energy of 1.4 megajoules (MJ) at a wavelength of 350 nanometers (ultraviolet), with a 2.5-meter-radius target chamber and with the final optics at a distance of 25 meters from the target. The National Ignition Campaign will begin during 2009, and ignition and fusion energy yields of 10 to 15 megajoules (MJ) are anticipated during fiscal years 2010 or 2011. Fusion yields of 20 to 35 MJ are expected soon thereafter. Ultimately, fusion yields of 100 MJ are expected on NIF. The LIFE system is designed to operate with fusion energy gains of about 25 to 30 and fusion yields of about 35 to 50 MJ to provide about 500 megawatts (MW) of fusion power, about 80 percent of which comes in the form of 14.1 million electron-volt (MeV) neutrons, with the rest of the energy in X-rays and ions.

This is an approach which would be as good as, and in some ways superior to, liquid fluoride thorium reactors. Improvements in lasers and cost reductions in laser components would meet the requirements of this project if current trends continue. A success with aneutronic nuclear fusion, such as might occur with Bussard inertial electrostatic fusion or dense plasma focus fusion, would likely be superior to this. It would be worthwhile to fund several of these vastly superior approaches to nuclear fission and fusion for a billion or a few billion dollars each in order to get many trillions of payoff with a home-run energy success. Even partial success with one of these approaches could deal with all of the current nuclear waste (unburned fuel), which would cost tens of billions to store in a place like Yucca Mountain.

This approach to fusion generates approximately 10^19 14.1-MeV neutrons per shot (about 10^20 neutrons every second). When used to drive a subcritical fission "blanket," the fusion neutrons generate an additional energy gain of four to ten depending upon the details of the fission blanket, providing overall LIFE system energy gains of 100 to 300 (an EROI of 100-300).
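A sketch of the operating point these figures imply (all inputs come from the paragraphs above; the repetition rate and blanket thermal power are simple derived quantities, not LLNL's published numbers):

```python
# Implied operating point of a LIFE engine from the figures above.
fusion_power_mw = 500
yields_mj = (35, 50)           # fusion yield per shot
blanket_gains = (4, 10)        # extra gain from the fission blanket

for y in yields_mj:
    print(f"{y} MJ/shot -> {fusion_power_mw / y:.0f} shots/sec")
# ~10-14 Hz repetition rate, consistent with ~10 shots/sec implied by
# 10^19 neutrons per shot and ~10^20 neutrons per second.

for g in blanket_gains:
    print(f"blanket gain {g} -> ~{fusion_power_mw * g / 1000:.0f} GW thermal")
# A gain-4 to gain-10 blanket turns 500 MW of fusion power into roughly
# 2-5 GW of thermal power before conversion to electricity.
```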

The fission blanket contains either 40 metric tons (MT) of depleted uranium; un-reprocessed spent nuclear fuel (SNF); natural uranium or natural thorium; or a few MT of the plutonium-239, the minor actinides such as neptunium and americium, and the fission products separated from reprocessed SNF.

With the appropriate research, development and engineering program, LIFE engines could begin to provide electricity to U.S. consumers within 20 years, and could provide a very significant fraction of U.S. and international electricity demand by 2100.



Nuclear Roundup: India's Plans, Japan Steel Will Triple Forging Capacity

1. Nuclear Power Corporation of India Ltd (NPCIL) will start site work next year for 12 indigenously-developed reactors, including eight pressurised heavy water reactors (PHWRs) of 700 MWe each, three 500 MWe fast breeder reactors (FBRs) and one 300 MWe advanced heavy water reactor (AHWR).

This week the head of NPCIL said that "India is now focusing on capacity addition through indigenisation" with progressively higher local content for imported designs, up to 80%.

Looking ahead, NPCIL's augmentation plan includes construction of 25-30 light water reactors of at least 1000 MWe by 2030, and NPCIL is currently identifying coastal sites for the first of these, both 1000 and 1650 MWe types.

Long term, the AEC envisages its fast reactor program being 30 to 40 times bigger than the present PHWR program, which has some 4.4 GWe operating or under construction and 5.6 GWe planned. This will be linked with up to 40 GWe of light water reactor capacity, the used fuel feeding ten times that fast breeder capacity, thus "deriving much larger benefit out of the external acquisition in terms of light water reactors and their associated fuel." This 40 GWe of imported LWR multiplied to 400 GWe via FBR synergy would complement 200-250 GWe based on the indigenous programme of PHWR-FBR-AHWR. Thus, AEC is "talking about 500 to 600 GWe over the next 50 years or so" of nuclear capacity in India, plus export opportunities.
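A sketch of the AEC arithmetic in the paragraph above (all inputs come from the text):

```python
# The AEC's long-term capacity arithmetic as described above.
imported_lwr_gwe = 40
fbr_multiplier = 10              # used LWR fuel feeding fast breeders
indigenous_gwe = (200, 250)      # indigenous PHWR-FBR-AHWR programme

fbr_gwe = imported_lwr_gwe * fbr_multiplier   # 400 GWe via FBR synergy
for indig in indigenous_gwe:
    print(f"{fbr_gwe} GWe (LWR + FBR) + {indig} GWe indigenous "
          f"= {fbr_gwe + indig} GWe")
# 600-650 GWe of eventual capacity, roughly in line with the AEC's
# "500 to 600 GWe over the next 50 years or so" once build-out timing
# is accounted for.
```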


How these reactors relate to Thorium and the nuclear fuel cycle

The long-term goal of India's nuclear program is to develop an advanced heavy-water thorium cycle. This first employs the PHWRs fuelled by natural uranium, and light water reactors, to produce plutonium.

Stage 2 uses fast neutron reactors burning the plutonium to breed U-233 from thorium. The blanket around the core will have uranium as well as thorium, so that further plutonium (ideally high-fissile Pu) is produced as well as the U-233.

Then in stage 3, Advanced Heavy Water Reactors (AHWRs) burn the U-233 and this plutonium with thorium, getting about two thirds of their power from the thorium.

In 2002 the regulatory authority issued approval to start construction of a 500 MW prototype fast breeder reactor at Kalpakkam and this is now under construction by BHAVINI. The unit is expected to be operating in 2010, fuelled with uranium-plutonium oxide (the reactor-grade Pu being from its existing PHWRs). It will have a blanket with thorium and uranium to breed fissile U-233 and plutonium respectively. This will take India's ambitious thorium program to stage 2, and set the scene for eventual full utilisation of the country's abundant thorium to fuel reactors. Four more such fast reactors have been announced for construction by 2020.


Initial FBRs will have mixed oxide fuel, but these will be followed by metallic-fuelled ones to enable shorter doubling times.

2. Japan Steel Works will triple its heavy forging capacity to 12 per year by 2012.






3. My favorite Idaho blogger's suggestions for nuclear power policy and plans in the United States

Dan Yurman, Idaho Samizdat, recommends:
The main idea is that the government should set up a revolving loan fund (at least $200 billion) and be the investor of first choice for a nuclear utility. By offering funding at 100% of the cost of the plant, to be repaid over 15 years at a rate equal to a treasury bond, e.g., 4.5%, the government would break even and provide exactly the same benefit to the utility as a loan guarantee. The difference is the government assumes all the risk, not just 80% of it.
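A sketch of the financing idea using the standard amortization formula (the 15-year term and 4.5% rate are from the proposal; the $7 billion plant cost is a hypothetical figure of mine):

```python
# Annual payment on a 100% government loan for a hypothetical plant,
# repaid over 15 years at a treasury-bond-like 4.5% rate.
plant_cost = 7e9      # hypothetical plant cost in dollars (assumption)
rate, years = 0.045, 15

# Standard amortization formula for a fixed annual payment.
payment = plant_cost * rate / (1 - (1 + rate) ** -years)
print(f"Annual payment: ${payment/1e9:.2f}B")   # ~$0.65B/yr
print(f"Total repaid:   ${payment*years/1e9:.1f}B")
# The government lends at roughly its own borrowing cost, so it breaks
# even, while the utility gets the same cheap financing that a loan
# guarantee would provide.
```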


Idaho National Labs plan for making the most of current light water nuclear.


Research the details of extending nuclear plants to safe 80-year operation, start building 4-8 per year in 2021, and build the 30 or so in the licensing pipeline. This would double nuclear power in the USA by 2030, triple it by 2050, and quadruple it by 2070.
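A rough sketch of whether those targets hang together (the ~100 GW current US fleet and ~1.4 GW per new reactor are my own approximations, not from the INL plan):

```python
# Rough projection of the INL build-out numbers.
us_gw_now = 100
gw_per_reactor = 1.4
pipeline = 30                      # reactors in the licensing pipeline

pipeline_gw = pipeline * gw_per_reactor
for rate in (4, 8):                # new reactors per year from 2021
    by_2030 = us_gw_now + pipeline_gw + rate * gw_per_reactor * (2030 - 2021)
    by_2070 = us_gw_now + pipeline_gw + rate * gw_per_reactor * (2070 - 2021)
    print(f"{rate}/yr: ~{by_2030:.0f} GW by 2030, ~{by_2070:.0f} GW by 2070")
# Even the 4/yr pace lands near 190 GW by 2030 and 420 GW by 2070,
# i.e. roughly the doubling and quadrupling targets, assuming 80-year
# life extensions keep the existing fleet online.
```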

December 08, 2008

Carnival of Space Week 82

Space Disco converted thirty entries into a video carnival



This includes this site's contribution of an article on Winterberg's concept for fusion propulsion and power.




Centauri Dreams wonders if the Earth could survive the sun turning into a red giant.

Check out the carnival at Space Disco for a lot more.

Limits of Statistics Applied to High Impact Technology and Existential Risk

Nassim Taleb popularized black swan theory and wrote "The Black Swan". He has written about the limits of statistics, and in particular the limits in the Fourth Quadrant.

Apply his advice (avoid optimization, love redundancy) to Lifeboat Foundation-class threats and to beneficial high impact technologies:

Do not try to optimize only one technology project - i.e., just the tokamak approach to nuclear fusion, or one path to Artificial General Intelligence (AGI) or molecular nanotechnology. Have multiple projects and approaches: more chances to get lucky.

Do not optimize one strategy for the defense of civilization, but make civilization itself redundant: more planets, space stations, hardened earth sites, etc.

Note that the casual observer's "black swan" can be, for someone else who investigates and researches an issue, a predicted and inevitable event, one they tried to warn the world or country about and actively tried to stop.

Like Warren Buffett saying 5 years ago that credit default swaps were weapons of financial mass destruction and getting all of his companies out of them.

Or state mortgage regulators and state attorneys general trying to counter the overly loose federal mortgage regulations and standards.

An alternative to statistics is to look closely at many of the data points and do due diligence on high impact technological innovation and on financial and existential risks.

The Map

The traps in misapplying statistics are:

First Quadrant: Simple binary decisions, in Mediocristan: Statistics does wonders. These situations are, unfortunately, more common in academia, laboratories, and games than real life—what I call the "ludic fallacy". In other words, these are the situations in casinos, games, dice, and we tend to study them because we are successful in modeling them.

Second Quadrant: Simple decisions, in Extremistan: some well known problem studied in the literature. Except of course that there are not many simple decisions in Extremistan.

Third Quadrant: Complex decisions in Mediocristan: Statistical methods work surprisingly well.

Fourth Quadrant: Complex decisions in Extremistan: Welcome to the Black Swan domain. Here is where your limits are. Do not base your decisions on statistically based claims. Or, alternatively, try to move your exposure type to make it third-quadrant style ("clipping tails").

Where there are heavy and/or unknown probability tails, no or unknown characteristic scale, and complex payoffs, you are in the fourth quadrant.

Complex payoff examples are societal consequence of pandemics and benefits of innovative technology.
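A small simulation makes the fourth-quadrant warning concrete (a sketch; Pareto with alpha = 1.5 is a stand-in for any Extremistan-style heavy tail):

```python
# Sample means of a heavy-tailed (Pareto, alpha=1.5, infinite variance)
# distribution stay unstable while Gaussian sample means converge.
import random
import statistics

random.seed(1)

def sample_means(draw, n=10_000, trials=5):
    """Average n draws, repeated `trials` times."""
    return [statistics.mean(draw() for _ in range(n)) for _ in range(trials)]

def gauss():
    return random.gauss(0, 1)          # thin-tailed "Mediocristan"

def pareto():
    return random.paretovariate(1.5)   # heavy-tailed "Extremistan"

print("Gaussian sample means:", [round(m, 3) for m in sample_means(gauss)])
print("Pareto sample means:  ", [round(m, 3) for m in sample_means(pareto)])
# The Gaussian means cluster tightly near 0; the Pareto means keep
# jumping because rare huge draws dominate the average. This is the
# regime where statistically based claims break down.
```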






Phronetic Rules: What Is Wise To Do (Or Not Do) In The Fourth Quadrant

Avoid Optimization, Learn to Love Redundancy

Psychologists tell us that getting rich does not bring happiness—if you spend it. But if you hide it under the mattress, you are less vulnerable to a black swan. Only fools (such as Banks) optimize, not realizing that a simple model error can blow through their capital (as it just did). In one day in August 2007, Goldman Sachs experienced 24 x the average daily transaction volume—would 29 times have blown up the system? The only weak point I know of financial markets is their ability to drive people & companies to "efficiency" (to please a stock analyst's earnings target) against risks of extreme events.

Indeed some systems tend to optimize—therefore become more fragile. Electricity grids for example optimize to the point of not coping with unexpected surges—Albert-László Barabási warned us of the possibility of a NYC blackout like the one we had in August 2003. Quite prophetic, the fellow. Yet energy supply kept getting more and more efficient since. Commodity prices can double on a short burst in demand (oil, copper, wheat)—we no longer have any slack. Almost everyone who talks about "flat earth" does not realize that it is overoptimized to the point of maximal vulnerability.

Biological systems—those that survived millions of years—include huge redundancies. Just consider why we like sexual encounters (so redundant to do it so often!). Historically populations tended to produce around 4-12 children to get to the historical average of ~2 survivors to adulthood.

Option-theoretic analysis: redundancy is like being long an option. You certainly pay for it, but it may be necessary for survival.

December 07, 2008

Intel has 340 GHz Silicon-based Avalanche Photodetector


Intel has a record-breaking silicon-based "Avalanche Photodetector" with a gain-bandwidth product of 340 GHz. This is the first time a silicon photonics device has beaten its equivalent made from traditional optoelectronic materials. [From an Intel teleconference on Dec 4, 2008, given by Mario Paniccia.]

Avalanche photodetectors detect light and additionally amplify the signal by multiplying electrons. The amplification provided by APDs makes them more sensitive, and this sensitivity can be used to reduce power requirements or extend operating distance.
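A gain-bandwidth product is a fixed tradeoff between amplification and speed, which a few lines make concrete (a sketch; the gain values are arbitrary illustrative choices):

```python
# Gain x bandwidth = 340 GHz for Intel's silicon APD; picking a gain
# fixes the usable bandwidth, and vice versa.
gbp_ghz = 340
for gain in (10, 20, 34):
    print(f"gain {gain:>2} -> usable bandwidth ~{gbp_ghz / gain:.0f} GHz")
# e.g. at a gain of 10 the detector can still support ~34 GHz of
# bandwidth, far beyond the ~10 Gbit/s optical links common in 2008.
```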

Avalanche photodetectors are found inside bulky optical networking equipment and currently cost $200 to $300 apiece. If these detectors could be made out of silicon, Paniccia says they could cost less than $10: higher bandwidth at lower cost.

The new detector is not yet ready to appear in products; there is still work to do in reducing dark current, the stray current that leaks from the device even when it is not absorbing photons. Intel expects commercial silicon photodetectors in the next couple of years. All of the silicon photonic devices can be made in regular Intel fabs.









Intel researchers initially see integrating Intel's processors with an optical networking chip in a hybrid package of the two chips. Perhaps 5 years or so later, the two would be integrated onto a single chip.