February 29, 2008

Multi-stage fracturing of horizontal wells in the Bakken

Packers Plus Energy Services Inc., a Calgary-based company, says its unique technology can deliver precisely controlled fracs along a horizontal wellbore at an affordable price.
"Without our StackFrac system, the Bakken formation in southeastern Saskatchewan would still be uneconomic for the most part," says Dan Themig, president of the seven-year-old private firm.

Producers had already attempted straightforward stimulation of open hole horizontal well bores, sometimes dubbed "Hail Mary" fracs. That tactic sometimes worked but more often the formation would crack mainly at its weakest point rather than along the entire wellbore. Overstimulation at one point not only limits the increase in oil or gas production but sometimes triggers water incursion into the well.

Horizontal well with controlled fractures along the length

An alternative approach was attempted on horizontal wells which had been cased with a production liner and then cemented. The wellbore would be segmented with bridge plugs, then stimulated through perforations in the steel liner. This technique required multiple coiled tubing trips. Fracing each segment of the wellbore involved rig up and rig down of the stimulation equipment. This type of operation may take weeks and generates expenses that frequently prove uneconomic. Although horizontal drilling was well established by the turn of the century, the oilpatch still hungered for a satisfactory horizontal stimulation method - quick, consistently cost-effective and repeatable on a large scale.

A StackFrac operation begins with the insertion of a steel liner into the well. The liner is segmented with tire-shaped rubber seals called packers, capable of sustaining differential pressures of 10,000 psi at 400 degrees Fahrenheit. (See the drawing above.) Between each pair of packers are one or more ports. Each port has two key features: an aperture and a specific internal diameter. The aperture, when opened, permits frac fluid to flow into the annulus, the space between the liner and the rock formation. The internal diameter of each port is also smaller than its neighbour's, with the smallest diameter at the end of the liner (the importance of this feature will become clear in a moment).

Once the liner is fully in place, frac fluid is pumped through it into the well. After full circulation is achieved, the rubber packers are expanded. They can increase in size by 40% and will conform to hole irregularities like ovalities and washouts. A small ball is then inserted into the frac fluid and is pumped along the liner until it seats itself within the last port. (For example, the ball may be two
inches in diameter, the diameter of the final port a half inch smaller.) As pressure rises against the seated ball, the adjacent port aperture opens and frac fluid flows into that "stage" or segment of the wellbore. The fracturing or rock cracking process is sometimes tracked with microseismic gear to ensure that it's effective.

When the bottom segment of the well has been fraced, the crew will inject a slightly larger ball into the well, which will seal and open the next port. The process will be repeated until the entire wellbore has been stimulated. Acidizing can also be handled through the ports. Well sections not worth stimulating can be passed by. Each stage can be production-tested individually if desired. Frac design can be
tailored to avoid overstressing any section of the wellbore, greatly reducing the risk of water incursion. If water does invade a portion of the wellbore, that stage can be sealed off. The liner is left in place and, if appropriate, can be designed for use in further downhole operations.
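The graduated-port logic described above can be sketched in a few lines of code. This is only an illustrative model: the port and ball diameters below are made-up example values, not Packers Plus specifications.

```python
# Sketch of the StackFrac graduated-port logic described above.
# Port and ball diameters are illustrative examples only, not
# actual Packers Plus specifications.

def seat_ball(ball_diameter, port_diameters):
    """Return the index of the port where a pumped ball seats.

    port_diameters runs from the heel (entry) to the toe of the liner,
    each port narrower than the one before it.  The ball travels down
    the liner and seats in the first port it cannot pass through.
    """
    for i, port in enumerate(port_diameters):
        if ball_diameter > port:
            return i  # ball is too big to pass through: it seats here
    return None  # ball passed every port (too small to seat anywhere)

# Ports graduated from 3.0" at the heel down to 1.5" at the toe.
ports = [3.0, 2.5, 2.0, 1.5]

# Frac sequence: smallest ball first (opens the toe stage), then
# progressively larger balls to open each stage back up the wellbore.
assert seat_ball(1.75, ports) == 3   # seats in toe port: stage 1
assert seat_ball(2.25, ports) == 2   # next larger ball: stage 2
assert seat_ball(2.75, ports) == 1   # stage 3
assert seat_ball(3.25, ports) == 0   # largest ball, heel: stage 4
```

Because each ball passes freely through every port wider than itself, the crew can open stages one at a time from the toe back to the heel without any intervention downhole.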

To date, the record TVD (true vertical depth) is more than 15,000 feet and the deepest well has been more than 25,000 feet MD (measured depth). Themig is confident that 30,000 feet will be manageable within six months.

Despite mounting successes across the Western Sedimentary Basin as well as the United States, the novel technology didn't make much of a splash until two years ago. That's when Petrobank began applying it in the Saskatchewan Bakken. Unstimulated, a Bakken horizontal well typically makes 10 to 30 barrels per day, hardly an economic return for an expenditure of $1.2 million. When stimulated using earlier
technologies, however, water cuts routinely jumped from near nothing to 70% of total production. StackFrac enabled Petrobank to stimulate oil flow with minimal additional water, which transformed the Bakken into Canada's hottest oil play.

Powerpoint presentation on the Stackfrac drilling technique

Cased hole StackFrac

Large sheets of carbon nanotubes produced

Nanocomp Technologies of Concord is producing sheets of carbon nanotubes that measure three feet by six feet, and promises slabs 100 square feet in area as soon as this summer. The first applications will probably be as electrical conductors in planes and satellites, replacing copper wire to save weight. Saving weight saves fuel.

UPDATE: As of January 2009, the size of their carbon nanotube sheets had only increased to 4 feet by 8 feet.

Nanocomp’s materials possess a unique combination of high strength-to-weight ratio, electrical and thermal conductivity, and flame resistance that exceeds those of many other advanced materials by orders of magnitude. The resulting material could be a valuable addition to applications such as electromagnetic interference (EMI) shielding, electrical conductors, thermal dissipation solutions, lightning protection and advanced structural composites. Full-scale production is expected in 2012.

One of many applications of interest to futurists is superior solar sails. A carbon nanotube sail could reach 4% of the speed of light just by making a close flyby slingshot around the sun. Such a perfected carbon nanotube sail would require fibers meters in length, not millimeters. However, the progress with large sheets of carbon nanotubes, combined with the ability to scale up from excellent millimeter gauge-length fibers, could get us very close to that kind of performance. A solar sail could not carry much cargo (think small robotic probes), but it would enable a radical improvement over the capabilities of chemical and ion rockets.

18 square foot sheets of carbon nanotubes have been made, with sheets hundreds of square feet in size promised by the summer. Solar sails made from carbon nanotubes weighing 0.1 grams per square meter (including all cargo, gear and structure, with the nanotubes doped for better conductance) would enable a swing by the sun within 4 solar radii to drive the sail up to 4% of the speed of light, enabling high-speed probes. Obviously a lot of work lies ahead for that goal, but the progress is encouraging nonetheless.

The baseline solar sail design for an interstellar probe (ISP) mission to the near-interstellar medium assumes an areal density of 1 g/m² (including film and structure) and a diameter of ~410 m. Missions to the stars will require very large sails with areal densities approaching 0.1 g/m².
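To get a feel for what those areal densities mean, here is the mass of the baseline 410 m sail worked out from the figures above (the function name is mine; the numbers are from the text):

```python
import math

def sail_mass_kg(diameter_m, areal_density_g_per_m2):
    """Total sail mass (film plus structure) for a circular sail."""
    area_m2 = math.pi * (diameter_m / 2) ** 2
    return area_m2 * areal_density_g_per_m2 / 1000.0  # grams -> kg

# The ISP baseline above: 410 m diameter at 1 g/m^2 -> roughly 130 kg.
baseline = sail_mass_kg(410, 1.0)

# The interstellar target at 0.1 g/m^2 -> roughly 13 kg for the
# entire 410 m sail, which shows why cargo must be tiny.
target = sail_mass_kg(410, 0.1)
```

A 410 m disc is about 132,000 m², so even at 1 g/m² the whole structure masses only around 130 kg; at the 0.1 g/m² target it drops to around 13 kg.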

Current solar sail material

There has been some theoretical speculation about using molecular manufacturing techniques to create advanced, strong, hyper-light sail material based on nanotube mesh weaves, where the weave "spaces" are less than half the wavelength of the light impinging on the sail. While such materials have as yet only been produced in laboratory conditions, and the means for manufacturing them on an industrial scale are not yet available, they could weigh less than 0.1 g/m², making them lighter than any current sail material by a factor of at least 30. For comparison, 5 micrometre thick Mylar sail material weighs 7 g/m², aluminized Kapton films weigh up to 12 g/m², and Energy Science Laboratories' new carbon fiber material weighs in at 3 g/m².

The tensile strength of the mat ranges from 200 to 500 megapascals—a measure of how tough it is to break. A sheet of aluminum of equivalent thickness, for comparison, has a strength of 500 megapascals. If Nanocomp takes further steps to align the nanotubes, the strength jumps to 1,200 megapascals. The sheets, which the company can produce on its single machine at a rate of one per day, are composed of a series of nanotubes each about a millimeter long, overlapping each other randomly to form a thin mat.

CNT fiber has reached 6 gigapascals, and short 1-2 millimeter gauge-length tubes have shown up to 9 gigapascals of strength.

Team Deltax is among many trying to make high-strength macro-scale fiber. Some are trying to make carbon fibers from forests of nanotubes grown on silicon wafers.

There were several announcements from the end of 2007 about breakthroughs in carbon nanotube strength, but so far they have been limited to gauge-length fibers (1-2 millimeters).

Single-walled carbon nanotubes are among the strongest materials known and exhibit remarkably high stiffness, with an elastic modulus of about 1 terapascal. (For comparison, high-carbon steel has a tensile strength of about 1.2 gigapascals.) Theoretically, carbon nanotubes can have tensile strengths beyond 120 GPa; in practice, the highest tensile strength ever observed in a single-walled tube is 52 GPa, and such tubes averaged breaking between 30 and 50 GPa. The trouble has been maintaining this strength in macro-length fibers, ropes and sheets.

Kevlar has a tensile strength of 2.6-4.1 GPa
Quartz fiber can reach upwards of 20 GPa
PBO 5.2-5.8 GPa
Spectra 1000 2.57 GPa
Carbon fiber 3.5 GPa (tens of thousands of tons of carbon fiber are produced each year)

M5 fiber has been reported at 3-6 GPa of tensile strength, but was supposed to reach a conservative 8.5 GPa and a target of 9.5 GPa. There have not been many reports since 2004-2005 on the progress of the M5 fiber; it seems the developers are still working at laboratory quantity and scale.

Los Alamos had talked about making tubes several centimeters in length and spinning them into fibers with 50 GPa of strength, a product called Superthread. However, there has been no news about this development since 2006.

Antoinette says Nanocomp’s technical achievement was to figure out a way to maintain the catalyst particle at the desired size and hold it stable long enough for the nanotube to grow to millimeter length. A computer controlling about 30 different parameters in the process—including temperature, temperature gradient, gas flow rates, and the chemistry of the mix—allows the builders to control the properties of the tubes. One setting gives them single-walled tubes, and another gives multi-walled versions, with one cylinder inside another, which provide different properties.

Adding conductive cables made of his nanotubes to the bodies of airplanes would channel the energy from lightning strikes around sensitive electronic equipment without adding much weight. And running electricity through them on the ground could heat them up and de-ice the aircraft.

It’s the light weight of carbon nanotube wires—only about 20 percent of the weight of the same volume of copper wire—that could make them especially attractive for the aerospace industry. “1850s copper wire is still the conductor of all our satellites, all our aircraft,” Antoinette says. If using nanotubes could cut the weight of two tons of copper wire in a 747 in half, he says, “you’re talking literally millions of dollars of savings in fuel costs” over the life of an airplane.

Nanocomp has already been qualified as a vendor by Boeing, Lockheed Martin, and Northrop Grumman. The company is shipping evaluation quantities of its material to them and others for testing in various uses. Once Nanocomp gets to 100-square-foot sheets, the company will decide whether it wants to continue to scale up the size or to build more machines to ramp up production. Antoinette expects to have a pilot plant running by 2010, with full-scale production by 2012.

February 28, 2008

Carnival of Space Week 43

Carnival of Space 43 is up at Starts with a Bang

My contribution was on the newly revealed prototypes from NASA-funded projects for a new lunar truck and a new robotic mining rover.

Music of the Spheres discusses the work at Stone Aerospace on robotic explorers that could lead to a robotic explorer for Europa's icy ocean. There is a video in the article from the TED conference, where Bill Stone of Stone Aerospace proposes that space-based fuel depots will be the key to opening up space for private enterprise. He plans to lead a robotically supported and privately funded mission by 2015 to access possible water ice on the moon. I agree that this would be a fantastic plan that, if successful, would open up and lower the cost of near-Earth space exploration.

The Endurance is a robotic vehicle which is exploring a Wisconsin lake and should move on to Antarctic testing.

Amanda Bauer discusses past and future space science. She notes that it would have been better not to have fixed the Hubble space telescope but to have launched several telescopes instead.

Bad Astronomy discusses the panspermia theory

The Panspermia theory is that asteroid impacts could spread single cell life around the planets.

A new paper that just came out in the peer-reviewed journal Astrobiology says that some bacteria could, in fact, survive the initial launch event. Amazingly, the enormous pressure generated in an asteroid impact on the surface of Mars may be survivable, if you’re really really tiny.

The researcher made models of the Martian ground seeded with bacteria, then subjected these samples to the pressures expected in an impact event. Amazingly, many of the bacteria survived. Lichens and bacterial spores did the best, surviving pressures from 5 to 40 billion pascals, about 50,000 to 400,000 atmospheres. That’s a lot. Cyanobacteria were the wussies of the lot, only surviving up to 100,000 atmospheres.
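The pascal-to-atmosphere conversion behind those figures is easy to check (one standard atmosphere is 101,325 Pa):

```python
# Converting the impact pressures quoted above from pascals to atmospheres.
ATM_PA = 101325.0  # one standard atmosphere, in pascals

def pa_to_atm(pascals):
    """Convert a pressure in pascals to standard atmospheres."""
    return pascals / ATM_PA

low = pa_to_atm(5e9)    # 5 GPa  -> roughly 49,000 atm
high = pa_to_atm(40e9)  # 40 GPa -> roughly 395,000 atm
```

So the article's "about 50,000 to 400,000" is the rounded form of roughly 49,000 to 395,000 atmospheres.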

February 27, 2008

Nanoemulsion vaccines could be effective against smallpox and HIV

A novel technique for vaccinating against a variety of infectious diseases – using an oil-based emulsion placed in the nose, rather than needles – has proved able to produce a strong immune response against smallpox and HIV in two new studies.

The results build on previous success in animal studies with a nasal nanoemulsion vaccine for influenza, reported by University of Michigan researchers in 2003.

Nanoemulsion vaccines developed at the Michigan Nanotechnology Institute for Medicine and the Biological Sciences at U-M are based on a mixture of soybean oil, alcohol, water and detergents emulsified into ultra-small particles smaller than 400 nanometers wide, or 1/200th the width of a human hair. These are combined with part or all of the disease-causing microbe to trigger the body’s immune response.

The surface tension of the nanoparticles disrupts membranes and destroys microbes but does not harm most human cells due to their location within body tissues. Nanoemulsion vaccines are highly effective at penetrating the mucous membranes in the nose and initiating strong and protective types of immune response, Baker says. U-M researchers are also exploring nasal nanoemulsion vaccines to protect against bioterrorism agents and hepatitis B.

The smallpox results, which appear in the February issue of Clinical Vaccine Immunology, could lead to an effective human vaccine against smallpox that is safer than the present live-vaccinia virus vaccine because it would use nanoemulsion-killed vaccinia virus, says Baker.

Anna U. Bielinska, Ph.D., a research assistant professor in internal medicine at the U-M Medical School, and others on Baker’s research team developed a killed-vaccinia virus nanoemulsion vaccine which they placed in the noses of mice to trigger an immune response. They found the vaccine produced both mucosal and antibody immunity, as well as Th1 cellular immunity, an important measure of protective immunity.

When the mice were exposed to live vaccinia virus to test the vaccine’s protective effect, all of them survived, while none of the unvaccinated control mice did. The researchers conclude that the nanoemulsion vaccinia vaccine offers protection equal to that of the existing vaccine, without the risk of using a live virus or the need for an inflammatory adjuvant such as alum hydroxide.

In antibody immunity, antibodies bind invading microbes as they circulate through the body. In cellular immunity, the immune system attacks invaders inside infected cells. There is growing interest in vaccines that induce mucosal immunity, in which the immune system stops and kills the invader in mucous membranes before it enters body systems.

A National Institutes of Health program, the Great Lakes Regional Centers of Excellence for Biodefense and Emerging Infectious Diseases, funded the research. If the federal government conducts further studies and finds the nanoemulsion smallpox vaccine effective in people, it could be a safer way to protect citizens and health care workers in the event of a bioterrorism attack involving smallpox, Baker says.

That would allay concerns about the current vaccine’s safety which arose in 2002. On the eve of the Iraq War, the Bush administration proposed a voluntary program to vaccinate military personnel and 500,000 health care workers with the existing vaccine to prepare for the possible use of smallpox virus as a biological weapon.

New Lunar Truck and mining rover

Image at left: This robot shares some features with the lunar truck [below after the jump], but is equipped with a drill designed to find water and oxygen-rich soil on the moon. Credit: Carnegie Mellon University

The moon has one-sixth the gravity of Earth, so a lightweight rover will have a difficult job resisting drilling forces and remaining stable. Lunar soil, known as regolith, is abrasive and compact, so if a drill strikes ice, it likely will have the consistency of concrete. Meeting these challenges in one system requires ingenuity and teamwork. Engineers used this lunar rover to demonstrate a drill capable of digging samples of regolith. The demonstration used a laser light camera to select a site for drilling then commanded the four-wheeled rover to lower the drill and collect three-foot samples of soil and rock.

In 2008, the team plans to equip the rover with ISRU's Regolith and Environment Science and Oxygen and Lunar Volatile Extraction experiment, known as RESOLVE. Led by engineers at NASA's Kennedy Space Center, Fla., the RESOLVE experiment package will add the ability to crush a regolith sample into small, uniform pieces and heat them.

The process will release gases deposited on the moon's surface during billions of years of exposure to the solar wind and bombardment by asteroids and comets. Hydrogen is used to draw oxygen out of iron oxides in the regolith to form water. The water then can be electrolyzed to split it back into pure hydrogen and oxygen, a process tested earlier this year by engineers at NASA's Johnson Space Center in Houston.
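A back-of-envelope sketch of the oxygen yield from that hydrogen-reduction step, treating the regolith's iron oxide as simple FeO. This is my simplification: actual lunar mineralogy (ilmenite, FeTiO3, and other phases) is more complex, and the function below is illustrative only.

```python
# Rough stoichiometry for the hydrogen-reduction process described above,
# treating the iron oxide as simple FeO (an assumption; real lunar
# minerals such as ilmenite are more complex):
#
#   FeO + H2  -> Fe + H2O     (hydrogen pulls oxygen out of the oxide)
#   2 H2O     -> 2 H2 + O2    (electrolysis recovers O2, recycles the H2)

M_FE = 55.85  # molar mass of iron, g/mol
M_O = 16.00   # molar mass of oxygen, g/mol

def oxygen_yield_kg(feo_kg):
    """kg of O2 ultimately recoverable per kg of FeO fully reduced."""
    # The oxygen fraction of FeO by mass is O / (Fe + O).
    return feo_kg * M_O / (M_FE + M_O)

# Each kg of FeO carries about 0.22 kg of extractable oxygen.
```

Note that the hydrogen is a working fluid, not a consumable: electrolysis returns it for reuse, so only the oxygen locked in the oxide is actually harvested.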

"We're taking hardware from two different technology programs within NASA and combining them to demonstrate a capability that might be used on the moon," said Gerald Sanders, manager of the ISRU project. "And even if the exact technologies are not used on the moon, the lessons learned and the relationships formed will influence the next generation of hardware."

NASA is testing many technologies needed for research on the moon. Two examples are a lunar truck for astronauts and a rover equipped with a drill designed to dig into the moon's soil.

Some, all or none of these features may be selected to be in the design of a rover that eventually goes to the moon. NASA's lunar architects currently envision pressurized rovers that would travel in pairs, two astronauts in each rover. The new prototype vehicle is meant to provide ideas as those future designs are developed.

Nasa selects 19 studies for the next space observatories

Physics professor Jacqueline Hewitt, director of MIT's Kavli Institute for Astrophysics and Space Science, stands behind a prototype of a radio telescope array.

NASA has selected a proposal by an MIT-led team to develop plans for an array of radio telescopes on the far side of the moon that would probe the earliest formation of the basic structures of the universe. The agency announced the selection and 18 others related to future observatories on Friday, Feb. 15. The present plan is for a one-year study to develop a detailed plan for the telescope array, whose construction would probably not begin until sometime after 2025 and which is expected to cost more than $1 billion.

Proposals by the 19 teams selected by NASA Feb. 15 will help guide decisions made during the Astronomy and Astrophysics Decadal Survey in 2010, led by the National Academy of Sciences to identify the most promising space observatory proposals. The 2008 NASA awards for the next-generation of astronomy missions ranged from $250,000 to $1 million each.

The New Worlds Observer got $1 million for its study. The estimated cost to design and build the New Worlds Observer mission would be roughly $3.3 billion. It is estimated that within a 2-year period the starshade could help astronomers to get a better look at upwards of 75 different planetary systems.

The Lunar Array for Radio Cosmology (LARC) project is headed by Jacqueline Hewitt, a professor of physics and director of MIT's Kavli Institute for Astrophysics and Space Science. LARC includes nine other MIT scientists as well as several from other institutions. It is planned as a huge array of hundreds of telescope modules designed to pick up very-low-frequency radio emissions. The array will cover an area of up to two square kilometers; the modules would be moved into place on the lunar surface by automated vehicles.

Observations of the cosmic Dark Ages are impossible to make from Earth, Hewitt explains, because of two major sources of interference that obscure these faint low-frequency radio emissions. One is the Earth's ionosphere, a high-altitude layer of electrically charged gas. The other is all of Earth's radio and television transmissions, which produce background interference everywhere on the Earth's surface.

The only place that is totally shielded from both kinds of interference is the far side of the moon, which always faces away from the Earth and therefore is never exposed to terrestrial radio transmissions.

Besides being the top priority scientifically for a telescope on the moon, this low-frequency radio telescope array will also be one of the easiest to build, Hewitt says. That's because the long wavelengths of the radio waves it will detect don't require particularly accurate placement and alignment of the individual components. In addition, it doesn't matter if a few of the hundreds of antennas fail, and their performance would not be affected by the ever-present lunar dust.

The new lunar telescopes would add greatly to the capabilities of a low-frequency radio telescope array now under construction in Western Australia, one of the most radio-quiet areas on Earth. This array, which also involves MIT researchers, will be limited to the upper reaches of the low-frequency radio spectrum, and thus will only be able to penetrate into a portion of the cosmic Dark Ages.

February 26, 2008

The moving target for energy dominance

Ray Kurzweil is part of a distinguished panel of engineers that says solar power will scale up to produce all the energy needed by Earth's people in 20 years.

Members of the [NAE Engineering Grand Challenges] panel are "confident that we are not that far away from a tipping point where energy from solar will be [economically] competitive with fossil fuels," Kurzweil said, adding that it could happen within five years.

"We also see an exponential progression in the use of solar energy," he said. "It is doubling now every two years. Doubling every two years means multiplying by 1,000 in 20 years. At that rate we'll meet 100 percent of our energy needs in 20 years."

I reviewed the 14 21st-century engineering grand challenges and MIT's ten emerging technologies for 2008.

The National Academy of Engineering has a page that discusses the challenges for economical solar power.

The US DOE has an analysis of projected energy costs until 2030. The chart shown does not adjust for operating load factors: it takes about three megawatts of wind capacity to generate the same energy as one megawatt of nuclear capacity.

The total fuel costs of a nuclear power plant in the OECD are typically about a third of those for a coal-fired plant and between a quarter and a fifth of those for a gas combined-cycle plant.

In January 2007, the approximate US dollar cost to get 1 kg of uranium as UO2 reactor fuel, at likely contract prices (about one third of the then-current spot price):

Uranium: 8.9 kg U3O8 × $53 = $472
Conversion: 7.5 kg U × $12 = $90
Enrichment: 7.3 SWU × $135 = $985
Fuel fabrication: per kg = $240
Total, approx: US$1,787

At 45,000 MWd/t burn-up this gives 360,000 kWh electrical per kg, hence fuel cost: 0.50 c/kWh.

If assuming a higher uranium price, say two thirds of the current spot price: 8.9 kg × $108 = $961, giving a total of $2,286, or 0.635 c/kWh.
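The fuel-cost arithmetic above can be reproduced directly (the function name is mine; the figures are from the text):

```python
# Reproducing the nuclear fuel-cost arithmetic above (Jan 2007 figures).
def fuel_cost_cents_per_kwh(total_usd_per_kg, kwh_per_kg=360_000):
    """Fuel cost in US cents per kWh of electricity.

    The default kwh_per_kg follows the article: 45,000 MWd/t burn-up
    means 45 MWd/kg * 24 h/d = 1,080 MWh(thermal)/kg, which at roughly
    one-third thermal efficiency gives ~360 MWh(e), i.e. 360,000 kWh,
    of electricity per kg of fuel.
    """
    return total_usd_per_kg * 100 / kwh_per_kg

low_uranium = fuel_cost_cents_per_kwh(1787)   # ~0.50 c/kWh
high_uranium = fuel_cost_cents_per_kwh(2286)  # ~0.635 c/kWh
```

Even with uranium at two thirds of the then-current spot price, fuel stays well under a cent per kilowatt-hour, which is why fuel cost is a minor lever for nuclear economics compared with construction cost.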

Fuel costs are one area of steadily increasing efficiency and cost reduction. For instance, in Spain nuclear electricity cost was reduced by 29% over 1995-2001. This involved boosting enrichment levels and burn-up to achieve 40% fuel cost reduction. Prospectively, a further 8% increase in burn-up will give another 5% reduction in fuel cost.

Standard burn-up of 50 GWd/t could go up to 65 GWd/t while staying at 5% enrichment. Up to 100 GWd/t burn-up could be reached with existing reactors but would need 8-10% enrichment.

Accelerator-enhanced continuous reprocessing would enable ultra-high burn-up of 700 GWd/t. [pg 96-102 discusses Possible Transmutation Strategies Based on Pebble Bed ADS (accelerator driven systems) Reactors for a Nuclear Fuel Cycle without Pu Recycling in Critical Reactors.]

There are many advanced fission reactor designs in development. There are several possibilities for cutting the DOE-estimated overnight construction cost in half and for reducing fueling and operating costs fourfold by 2015-2020. It will take several completed new power plants and a few years of operation before the cost reductions are recognized. China has ordered four AP1000 plants for $5.3 billion, but until several are completed the new cost savings will not be recognized. Utilities are also continuing to order other plants, which may be more expensive, because Westinghouse can only build at a certain maximum rate.

South Africa's Pebble Bed Modular Reactor (PBMR) aims for a step change in safety, economics and proliferation resistance. Production units will be 165 MWe. They will have a direct-cycle gas turbine generator and thermal efficiency of about 42%. Up to 450,000 fuel pebbles recycle through the reactor continuously (about six times each) until they are expended, giving an average enrichment in the fuel load of 4-5% and average burn-up of 90 GWd/t U (eventual target burn-ups are 200 GWd/t) [starting at twice the fuel efficiency of conventional reactors, eventually four times]. This means on-line refuelling as expended pebbles are replaced, giving a high capacity factor.

Overnight construction cost (when in clusters of eight units) is expected to be US$1000/kW and generating cost below 3 US cents/kWh. A demonstration plant was due to start construction in 2007 for commercial operation in 2010. A design certification application to the US Nuclear Regulatory Commission is expected in 2008, with approval expected in 2012, opening up world markets.

UPDATE: More recent estimates suggest that production costs could be US$2500-3500/kW for pebble bed reactors. Inflation in the cost of steel, cement and other materials is increasing the cost of all energy production.

According to Business Report, it could cost between $9.9 billion (R67 billion) and $13.8 billion to build 24 reactor installations, which together could generate 3,960 megawatts. That's expensive power, coming in at about $3,500/kW at the upper end of the cost estimate.

A larger US design, the Modular Helium Reactor (MHR , formerly the GT-MHR), will be built as modules of up to 600 MWt. In its electrical application each would directly drive a gas turbine at 47% thermal efficiency, giving 280 MWe. It can also be used for hydrogen production (100,000 t/yr claimed) and other high temperature process heat applications. Half the core is replaced every 18 months. Burn-up is up to 220 GWd/t, and coolant outlet temperature is 850°C with a target of 1000°C.

The Westinghouse AP-1000 has received several design certifications. Overnight capital costs are projected at $1200 per kilowatt, and modular design will reduce construction time to 36 months. The 1100 MWe AP-1000's generating costs are expected to be below US 3.5 cents/kWh, and it has a 60-year operating life.

Another US-origin but international project which is a few years behind the AP-1000 is the International Reactor Innovative & Secure (IRIS). IRIS is a modular 335 MWe pressurised water reactor with integral steam generators and primary coolant system all within the pressure vessel. It is nominally 335 MWe but can be less, e.g. 100 MWe. Fuel is initially similar to present LWRs with 5% enrichment and burn-up of 60,000 MWd/t with a fuelling interval of 3 to 3.5 years, but it is designed ultimately for 10% enrichment and 80 GWd/t burn-up with an 8-year cycle, or an equivalent MOX core. The core has low power density. IRIS could be deployed in the next decade (2015), and US design certification is at the pre-application stage. Multiple modules are expected to cost US$1000-1200 per kW for power generation. The designers expect that construction of the first IRIS unit will be completed in three years, with subsequent reduction to only two years.

The Remote-Site Modular Helium Reactor (RS-MHR) of 10-25 MWe has been proposed by General Atomics. The fuel would be 20% enriched and refuelling interval would be 6-8 years.

Another full-size HTR design is Areva's Very High Temperature Reactor (VHTR), being put forward by Areva NP. It is based on the MHR and has also involved Fuji. The reference design is 600 MW (thermal) with prismatic block fuel like the MHR. HTRs can potentially use thorium-based fuels, such as HEU or LEU with Th, U-233 with Th, and Pu with Th. Most of the experience with thorium fuels has been in HTRs. General Atomics says the MHR's neutron spectrum is such, and the TRISO fuel so stable, that the reactor can be powered fully with separated transuranic wastes (neptunium, plutonium, americium and curium) from light water reactor used fuel. The fertile actinides enable reactivity control, and very high burn-up can be achieved - over 500 GWd/t - hence the Deep Burn concept and the DB-MHR design. Over 95% of the Pu-239 and 60% of the other actinides are destroyed in a single pass.

Nuclear fusion success offers the possibility of $500/kW down to $20/kW of installed power. However, there is still great uncertainty about any success with nuclear fusion.

Thermoelectrics could boost the efficiency and total power generated from high-heat central power sources such as nuclear, coal and natural gas power plants. Increasing the efficiency of power plant heat conversion to 150-200% of current levels would greatly reduce the costs of existing plants and new plants of these types. Thermoelectrics have many commonalities with advanced solar power. Broad success with solar power should also mean broad success with thermoelectrics for other power plants. Thermoelectrics could provide an across-the-board boost of 30-50% in cost efficiency for nuclear, coal and natural gas by 2020.

Kitegen offers the possibility of greatly reducing the cost and increasing the total power generated by wind, while reducing the materials used in construction per MW.

The uranium hydride [nuclear battery] could be mass-produced in factories, starting at $1400/kW prices in 2012.

So there are several possibilities for getting into the range of $1000/kW overnight costs for new nuclear reactors. Advanced thermoelectrics and further advances in nuclear fuel and reactor design could bring $500/kW prices in 2020-2030, with far lower variable and operating costs. Nuclear fusion could push off the day of solar power price supremacy indefinitely. This will not matter if we are building nuclear fission with far less waste and no air pollution, clean aneutronic nuclear fusion, or efficient wind power. Any future with clean and abundant power would be a pretty good future.

February 25, 2008

Tensilica configurable processors could make affordable petaflop and exaflop supercomputers

Lawrence Berkeley National Lab researchers are looking at configurable processor technology developed by Tensilica Inc. The company offers a set of tools that system developers can use to design both the SoC and the processor cores themselves. As a real-world benchmark for this technology, LBNL estimates that a 10 petaflop peak system built with Tensilica technology would draw only 3 megawatts and cost just $75 million. It's not a general-purpose system, but neither is it a one-off machine for a single application (like Japan's MD-GRAPE machine, for example). By comparison, a 10 petaflop Opteron-based system was estimated to cost $1.8 billion and require 179 megawatts to operate; the corresponding Blue Gene/L system would cost $2.6 billion and draw 27 megawatts. Extrapolating the half-petaflop Barcelona-based Ranger supercomputer to 10 petaflops, it would require about 50 megawatts and cost $600 million (although it's widely assumed that Sun discounted the Ranger price significantly). A 10 petaflop Blue Gene/P system would draw 20 megawatts, at perhaps a similar cost to the Blue Gene/L.

- AMD Opteron: Commodity approach - lower efficiency for scientific applications offset by the cost efficiencies of the mass market
• Popular building block for HPC, from commodity clusters to the tightly-coupled XT3
• AMD pricing is based on servers only, without interconnect
- BlueGene/L: Use a generic embedded processor core and customize the System on Chip (SoC) services around it to improve power efficiency for scientific applications
• Power-efficient approach, with a high-concurrency implementation
• The BG/L SoC includes logic for the interconnect network
- Tensilica: In addition to customizing the SoC, also customizes the CPU core for further power-efficiency benefits while maintaining programmability
• Design includes custom chip, fabrication, raw hardware, and interconnect

10 petaflops of sustained performance would cost 10-20 times more, but with Moore's Law that level should be available at today's peak-system price within about 5 years.

So by 2012-2013, a 100-200 petaflop peak-performance supercomputer based on configurable processors should cost about $75 million, and an exaflop supercomputer would be in the $375-750 million range.
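The 2012-2013 numbers follow from simple scaling under the Moore's-Law assumption in the text (a sketch; the $75M-per-10-PF figure is the LBNL estimate, the 10-20x multipliers are the ones quoted above):

```python
# Extrapolating the LBNL Tensilica estimate forward 5 years.

cost_10pf_peak = 75e6            # $75M for 10 PF peak today (LBNL estimate)
sustained_multiplier = (10, 20)  # sustained PF costs 10-20x peak, per the text

# Today: 10 PF sustained would cost 10-20x more than 10 PF peak.
today_sustained = [m * cost_10pf_peak for m in sustained_multiplier]

# In 5 years (2012-2013), 10-20x more flops for the same price means 100-200 PF
# peak at $75M, so an exaflop (1000 PF) peak costs 5-10x that.
exaflop_cost = [1000 / pf * cost_10pf_peak for pf in (200, 100)]
print([f"${c / 1e6:.0f}M" for c in exaflop_cost])  # $375M to $750M
```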

The development of this much affordable petaflop power in supercomputers would help fulfill a couple of my computing predictions from 2006:

10 petaflop supercomputer by 2012-2013
Petaflop personal computers and wearable computing 2016-2018

Personal petaflop machines seem likely to come about from better GPGPUs, FPGAs and the mainstreaming of configurable components.
Another breakthrough allows four times as much memory in cheaper servers. More memory is needed for high-performance applications.

New memory controller allows four times as much memory to be placed into existing servers

MetaSDRAM is a drop-in solution that closes the gap between processor computing power, which doubles every 18 months, and DRAM capacity, which doubles only every 36 months. Until now, the industry addressed this gap by adding higher-capacity - but not readily available, and exponentially more expensive - DRAM to each dual in-line memory module (DIMM) on the motherboard.

The MetaSDRAM chipset, which sits between the memory controller and the DRAM, solves the memory capacity problem cost effectively by enabling up to four times more mainstream DRAMs to be integrated into existing DIMMs without the need for any hardware or software changes. The chipset makes multiple DRAMs look like a larger capacity DRAM to the memory controller. The result is "stealth" high-capacity memory that circumvents the normal limitations set by the memory controller. This new technology has accelerated memory technology development by 2-4 years.
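The capacity gap MetaSDRAM targets compounds over time; here is a minimal sketch of the 18-month versus 36-month doubling rates (illustrative timescale, my choice of 6 years):

```python
# Compute vs. DRAM growth: since compute doubles every 18 months and DRAM
# density every 36, the ratio between them itself doubles every 36 months.

def growth(years, doubling_months):
    return 2 ** (years * 12 / doubling_months)

years = 6
cpu = growth(years, 18)   # 2^4 = 16x compute over 6 years
dram = growth(years, 36)  # 2^2 = 4x DRAM capacity over 6 years
print(cpu / dram)         # the gap after 6 years
```

After 6 years the gap is 4x, which is exactly the factor the MetaSDRAM chipset claws back by making four DRAMs look like one larger device to the memory controller.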

Powerpoint describing the Berkeley National Lab plan for customized chips for more efficient and powerful supercomputers

Research paper on the IBM Kittyhawk project to build a global-scale computer. IBM wants to use supercomputers to handle many kinds of large-scale applications more efficiently than with clusters of boxes.

A glimpse of how this might take shape was revealed in a recent IBM Research paper that described using the Blue Gene/P supercomputer as a hardware platform for the Internet. The authors of the paper point to Blue Gene's exceptional compute density, highly efficient use of power, and superior performance per dollar. Regarding the drawbacks of the current infrastructure of the Internet, the authors write:

At present, almost all of the companies operating at web-scale are using clusters of commodity computers, an approach that we postulate is akin to building a power plant from a collection of portable generators. That is, commodity computers were never designed to be efficient at scale, so while each server seems like a low-price part in isolation, the cluster in aggregate is expensive to purchase, power and cool in addition to being failure-prone.

The IBM'ers are certainly talking about a more general-purpose petascale application than the Berkeley researchers, but one aspect is the same: ditch the loosely coupled, commodity-based systems in favor of a tightly coupled, customized architecture that focuses on low power and high throughput. If this is truly the model that emerges for ultra-scale computing, then the whole industry is in for a wild ride.

Exaflop computer studies

$7.4 million in funding has been provided for researchers at Sandia and Oak Ridge National Laboratories to look at issues for exaflop computers. They are preparing for the challenges of developing an exascale computer at the new Institute for Advanced Architectures.

One such challenge is power consumption. "An exaflop supercomputer might need 100 megawatts of power, which is a significant portion of a power plant," said Dosanjh. "We need to do some research to get that down. Otherwise no one will be able to power one."

Then there's the issue of reliability, which tends to decline as the parts count increases. Given that an exascale computer might have a million hundred-core processors, Dosanjh speculated that such a machine might run for 10 minutes before suffering a failure. To manage a machine with so many parts, new fault-tolerance schemes need to be developed.
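The 10-minute figure is consistent with simple 1/N reliability scaling (a sketch; the 20-year per-processor MTBF is my assumption, as Dosanjh's actual inputs were not given):

```python
# System MTBF shrinks as 1/N when N parts fail independently: the machine is
# down whenever any one part fails.

def system_mtbf_hours(part_mtbf_hours, n_parts):
    """Assuming independent exponential failures, system MTBF = part MTBF / N."""
    return part_mtbf_hours / n_parts

# Assume each of a million processors has a 20-year (~175,000 hour) MTBF.
mtbf = system_mtbf_hours(20 * 8760, 1_000_000)
print(mtbf * 60)  # minutes between failures for the whole machine: ~10.5
```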

Data movement is also a critical concern, said Dosanjh. "The rate of memory access has not kept up with the ability of these processors to do floating point operations," he said.

And in addition to the hardware engineering challenges, programmers have to be educated to write code for such massively parallel systems. "As far as the industry is concerned, there needs to be an education effort as well to get people trained to write software at this scale," said Dosanjh.

Teraflop to Petaflop GPGPUs and artificial neurons for artificial vision and smell

Nvidia introduced the Cuda development tool in June 2007 to allow scientists to tap into the GPU's power. More than 50,000 users have downloaded the software.

By 2012, three of the top five supercomputers in the world will have graphics processors using parallel computing applications to crunch numbers at a clip that's not possible on standard CPU-only set-ups, predicts Nvidia chief scientist David Kirk.

Since other supercomputers are projected to be at several petaflops in speed, the GPGPU-enhanced machines must be targeting 5-10 petaflops in performance by 2012.

Evolved Machines is using GPGPUs to enhance neural network and neuron simulation for visual systems and for an artificial sense of smell.

ATI's Radeon HD 3870 X2, released last month, can hit around 1TFLOPS (one trillion floating-point operations per second). Its 320 stream processors combine for massive parallel computation.

Nvidia could have tighter co-processor coupling between its GPGPUs and Apple computers.

Fusion and expected power generation trends

Vincent Page, technology officer at GE, wrote a good paper in 2005 about the economics of and timeline for moving to fusion power.

He had three fusion concepts in a chart, which I will extend:

(Format: concept - time to small-scale net energy, cost to achieve net energy, chance of large-scale success after a small-scale success, funded?)

Bussard IEC Fusion - 3-5 years, $200 million, 90%, funded (Y, $2m)
My intro to Bussard fusion and update on prototype work

Tri-Alpha Energy, aka Colliding Beam fusion, aka Field Reversed Configuration - 8 years, $75 million, 60%, funded (Y, $50m)
My review of the academic research before the funded stealth project

General Fusion, aka magnetized target fusion (steam-generated shock wave into spinning liquid metal) - 3-6 years, $10-30 million, 60%, funded (Y, $2m)

Plasma Focus - 6 years, $18 million, 80%, funded (Y, $1.7m)
Focus fusion website
Focus fusion US patent application
Working on a funded experiment with Chile, 2006-2010

Multi-pole ion beam version of Bussard IEC (FP Generation MIX IEC fusion) - 3-5 years, $200 million, 90%, not funded

Koloc Spherical Plasma (attempt to create stable ball-lightning plasma spheres; in 2004, trying to generate 30-40cm plasma spheres) - 10 years, $25 million, 80%, not funded (self)

IEC Bussard fusion can be estimated to cost $200 million for the first 100 MW system.
IEC Bussard fusion should scale well to 1-10 GW sizes.
General Fusion has discussed $50 million devices for generating 100 MW.
Focus fusion has talked about 20 MW reactors for $500,000 and 1/20th of a cent per kWh.
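Dividing the quoted costs by capacity gives the implied overnight cost per kW (a restatement of the figures above; these are first-unit estimates, not mature production costs):

```python
# Implied $/kW for the fusion concepts, from the cost and capacity
# figures quoted in the text.

concepts = {
    # name: (cost in $, capacity in kW)
    "Bussard IEC (first 100 MW)": (200e6, 100_000),
    "General Fusion (100 MW)": (50e6, 100_000),
    "Focus fusion (20 MW)": (500e3, 20_000),
}

for name, (cost, kw) in concepts.items():
    print(f"{name}: ${cost / kw:,.0f}/kW")
```

These land at $2000/kW, $500/kW and $25/kW respectively, which is how the fusion concepts connect to the $1000/kW and $500/kW fission targets discussed earlier.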

I expect that over the next couple of months there will be more positive test results from the WB-7 IEC prototype reactor. I think that the multi-pole IEC Bussard fusion variant (from a researcher who previously worked with Bussard on his reactor) will then also get funded. I think some of the projects with minimal funding will get more support as other countries and companies step into the alternative nuclear fusion power generation race.

The tweaks to the IEC fusion system are to increase ion and electron densities (from the multi-pole (MIX) site):

With higher densities, electrons and ions can arrange themselves in alternating layers of positive and negative charge, forming "virtual electrodes" that can result in yet higher densities of ions at the center of the machine, and a trapped ion population that never intersects any material structure. Evidence for this effect has previously been observed in operating IEC machines.

The addition of a small radio frequency modulation of the cathode voltage will drive trapped ions to converge simultaneously at megahertz rates in the very center of the machine at high energies, provided a harmonic electric potential can be maintained inside the cathode, an effect called POPS (Periodically Oscillating Plasma Sphere) that has been documented in previous IEC experiments.

Pulsed operation will potentially raise the fusion rate still further.

We have plans to extract ions which have developed non-ideal orbits at low energy, thus substantially increasing the energy confinement time and further raising efficiency.

Older power generation systems and projects will not be abandoned. It would still take a long time to replace the old power even if nuclear fusion has a breakthrough. Plus, the new fusion power systems will need to gather several years of operating record so that people know exactly what their cost and safety record is. Fission systems such as the Hyperion uranium hydride system (nuclear battery) would still have a niche helping generate power for enhanced oil shale and oil sand recovery during a transition phase, which could last two or three decades.

Increased funding could accelerate these projects by 1-3 years. For the IEC fusion approaches, the net energy production systems are basically full-scale commercial systems.

DOE central power analysis

February 24, 2008

Barrons covers Bakken oil, says not fully valued in oil stocks

Barrons (one of the leading financial newspapers) has an article about the Bakken oil in North Dakota.

I have had five articles on Bakken oil, starting with this one. This blog gave its readers a one-month head start on Bakken oil.

Barrons' Kopin Tan says:

The Bakken is no longer an undiscovered gem: Exploration companies with local perches - including EOG Resources (EOG), Continental Resources (CLR), Whiting Petroleum (WLL) and Brigham Exploration (BEXP), spanning the market cap spectrum from big to small - have seen their shares rise 44% to 85% over the past six months.

Those companies do not include the oil companies that are working the Saskatchewan, Canada side of the Bakken oil play.

Despite the stock spurt, some see further upside in the longer term. "The newness of the [Bakken] play has analysts giving credence only to acreage that has been drilled successfully," says David Morehead, a senior portfolio manager at the small hedge fund Highview Capital. "We do not believe the Street has fully valued the Bakken drilling that has already been permitted, let alone the acreage held in portfolios that has yet to be permitted and drilled."

Whiting has three successful wells at Lake Robinson, with its third well initially producing 2,530 barrels a day, 53% more than the second well. It plans to drill at least 30 wells there in 2008.

Continental is the most leveraged to the Bakken (36% of net assets), followed by Brigham at 16%, and EOG and Whiting at 11% each.