
January 10, 2009

Oil Supply, Demand and Price


Business Week has published oil supply, demand and price figures which, if accurate, would indicate a continuation of relatively moderate oil prices through 2010.



Beyond 2010, this site has noted a large supply of $35-65 oil and oil substitutes.

1. Petrobank's THAI/CAPRI and other oilsand and heavy oil processes could make over a trillion barrels of oil affordable at $20,000 per producing barrel. The oil industry's spending of over $200 billion/year on exploration and development could shift over to this process and access known oilsand and heavy oil supplies. At that per-barrel price, redirecting the entire exploration and development budget would add ten million barrels per day of new producing capacity each year to offset declining oil fields (see the sketch below). No exploration risk would be involved because we know where the oilsands are; it is then just a matter of developing them as cheaply and quickly as possible.
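As a sanity check on that capital-intensity arithmetic, here is a minimal sketch in Python; the $200 billion/year and $20,000-per-producing-barrel figures are taken from this post and are estimates, not audited industry data:

```python
# Back-of-envelope check: how much producing capacity the global exploration
# and development budget would buy at the THAI/CAPRI capital-cost estimate.
annual_budget = 200e9                # global E&D spend, $/year (post's figure)
cost_per_flowing_barrel = 20_000     # $ per barrel/day of capacity (THAI estimate)

new_capacity_bpd = annual_budget / cost_per_flowing_barrel
print(f"New producing capacity per year: {new_capacity_bpd/1e6:.0f} million bbl/day")
# -> 10 million barrels/day of new capacity for each year of spending
```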




2A. Multi-stage horizontal fracturing can be used to access oil in the Bakken Formation and in older wells. Old wells in the USA have 360 billion barrels left in them which have not been affordable to access. Multi-stage horizontal fracturing lowers costs by a factor of 2-3.

2B. Multi-stage horizontal fracturing on a large scale would need a large supply of injectant material, and large-scale carbon capture and storage appears to be affordable according to MIT.

3. Coal gasification and liquefaction are coming on stream and could add millions of barrels per day, growing from 2016 onward.

4. Third and fourth generation biofuel processes also appear to be on track to significant scale and affordability.

Nuclear power could provide the energy for coal liquefaction, oilsand recovery and advanced biofuel processes.

Petrobank Capri / Thai Processes for Upgrading and Recovering Oil Getting Closer

Haywood Senior Oil and Gas Analyst Alan Knowles is positive on Petrobank and its new Thai/Capri oil recovery processes:
The Petrobank oil recovery technologies (Thai/Capri) are actually cheaper; for instance, the estimates are that this will cost $20,000 per producing barrel to put a project together, and it likely will be less, whereas your average SAGD (steam assisted gravity drainage) project is $60,000 per producing barrel. So, it's a third, and if you add an upgrader, you can get into the $80,000–$100,000 per producing barrel range. And so you're comparing $20,000 versus those higher numbers, depending on the project.

Petrobank's deal has been to retain a 50% working interest in, and a 10% override on, the other company's project. The benefit of that, of course—it adds to the company's growth. The other benefit is that it also recovers 70% of the oil in place in the reservoir, and a competing or just a standard expectation is somewhere between 25% and 40%, depending on the project. So they get almost twice as much oil out of the ground from their well than a competing project would get from their well.

The oil is upgraded in situ in this process, so instead of producing 9- or 10-degree oil, it produces 20- to 21-degree oil—a significant advantage; and capital costs are a third to a quarter of competing projects. This is a new technology and it is getting a lot of attention now, especially in the lower oil price environment.
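To make the per-flowing-barrel numbers concrete, here is an illustrative comparison for a hypothetical 100,000 bbl/day project using the unit costs quoted above; the project size and the $90,000 midpoint for SAGD-plus-upgrader are assumptions for the example:

```python
# Capital cost of a hypothetical 100,000 bbl/day heavy oil project under the
# per-flowing-barrel estimates quoted in the interview above.
project_size_bpd = 100_000
cost_per_flowing_bbl = {
    "THAI/CAPRI": 20_000,
    "SAGD": 60_000,
    "SAGD + upgrader": 90_000,   # midpoint of the quoted $80,000-$100,000
}
for tech, unit_cost in cost_per_flowing_bbl.items():
    print(f"{tech:16s}: ${project_size_bpd * unit_cost / 1e9:.0f} billion")
# THAI/CAPRI: $2B, SAGD: $6B, SAGD + upgrader: $9B
```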







Bakken still works at these prices. Crescent Point, Petrobank, TriStar, and Petrominerales can really do something with $65 oil because of their low royalties and low operating costs. Crescent Point and TriStar are involved in the same play here in Canada, called the Bakken play, and that's light oil, so it receives quality pricing. The first year's royalty on those wells is only 2.5%, and the operating costs are in the $8 to $9 range, generally. Even in a $45 oil environment they have a decent netback, but the rate of return on a well at $65 versus $45 is significantly better; so, they wouldn't stop drilling but rather slow it down in a $40-45 oil environment.
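A rough netback sketch using the figures quoted above (2.5% first-year royalty, $8-9/bbl operating costs; transportation and other deductions are ignored, so treat these as illustrative only):

```python
# Per-barrel operating netback for a Bakken light-oil well: price less
# royalty and operating costs. The 2.5% royalty and $8.50 midpoint opex
# come from the interview above; everything else is simplified away.
def netback(price, royalty=0.025, opex=8.5):
    return price * (1 - royalty) - opex

for price in (45, 65):
    print(f"${price}/bbl oil -> ~${netback(price):.2f}/bbl netback")
# $45 -> ~$35.38/bbl, $65 -> ~$54.88/bbl: both positive, but returns are
# much better at $65, consistent with slowing rather than stopping drilling.
```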




THAI Process Benefits
• Minimal natural gas and water use
• Higher recovery rates - 70-80% of oil in place
• Improved economics
• Lower capital cost – 1 horizontal well, no steam & water handling facilities
• Lower operating cost – negligible natural gas & minimal water handling
• Higher netbacks for partially upgraded product
• Faster project execution time
• Lower environmental impact
• 50% less greenhouse gas emissions
• Net useable water production
• Partial upgraded oil requires less refining
• Smaller surface footprint
• THAI/CAPRI - step change heavy oil technologies
• Up to 804 mmbbls recoverable (based on SAGD) in Petrobank's Whitesand block


Petrobank is also big in Saskatchewan's part of the Bakken Oil Formation



January 09, 2009

Cellulosic Ethanol and Then Bio-Syntrolysis Fuel

Second generation cellulosic ethanol pioneer Verenium has started production of ethanol from non-food sources such as wood chips, grass straw, and trash at its Jennings, Louisiana demonstration plant (PDF). This is the first such plant to begin operation in the US.

The plant will produce 1.4 million gallons of ethanol a year, although it is not yet at commercial scale (60+ million gallons per year). According to the Renewable Fuels Association, around a dozen second generation cellulosic ethanol demonstration plants will be opening in the USA between now and 2012. Range Fuels plans on opening the first commercial scale cellulosic plant in Georgia by the end of 2009.

Green Car Congress has a report on Idaho National Lab work to bring about bio-syntrolysis, which would be two to three times more efficient than cellulosic ethanol.

Idaho National Lab (INL) is researching bio-Syntrolysis which would convert about 90% of the carbon in biomass to liquid synthetic fuel. Conventional biomass or coal gasification to liquid fuels converts only ~35% of the carbon to liquid fuel.

In Bio-Syntrolysis, process heat from the biomass gasifier produces the steam to improve the hydrogen production efficiency of the HTSE process, while the biomass itself is the source of the carbon. Hydrogen from HTSE allows a high utilization of the biomass carbon for syngas production, while the oxygen resulting from water splitting is used to control the gasification process. The new process is an evolution of INL’s earlier work on co-electrolysis (Syntrolysis).

Syntrolysis used high-temperature electrolysis with a solid-oxide electrolysis cell designed to take advantage of electricity from nuclear or renewable energy sources and industrial process heat to simultaneously convert water and carbon dioxide into syngas.

INL is proposing locating Bio-Syntrolysis plants regionally, close to where the biomass is grown. A 25,000 barrel (1.05 million gallon US, 3.974 million liter) per day plant for full biomass to liquid fuels would entail a capital cost of around $2 billion and an annual operating cost of $1 billion per year.

A preliminary, yet thorough economic analysis shows synthetic, liquid, no-sulphur diesel at $2.50 per gallon.
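The throughput numbers can be checked directly; a minimal sketch (the 42 gallons/barrel conversion is standard, the other figures are from the INL summary above, and the financing assumptions behind the $2.50/gal estimate are not given, so only the operating cost is computed here):

```python
# Unit conversions and raw operating cost for the proposed 25,000 bbl/day
# Bio-Syntrolysis plant ($2B capital, $1B/year operating, per INL above).
GAL_PER_BBL = 42
bpd = 25_000
gal_per_day = bpd * GAL_PER_BBL        # 1.05 million gallons/day
gal_per_year = gal_per_day * 365       # ~383 million gallons/year
opex_per_gal = 1e9 / gal_per_year      # operating cost alone, $/gallon

print(f"Daily output : {gal_per_day/1e6:.2f} million gallons")
print(f"Annual output: {gal_per_year/1e6:.0f} million gallons")
print(f"Operating cost alone: ${opex_per_gal:.2f}/gallon")
```

Operating cost alone works out to about $2.61/gallon, so the $2.50/gallon figure presumably nets out byproduct credits or uses different operating assumptions than the headline numbers.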










Technology Blogger Roland Piquepaille has Died


He ran his own blog, Roland Piquepaille's Technology Trends; his articles were often contributed to ZDNet as well and were frequently featured on Slashdot.

From his zdnet bio:

Roland Piquepaille lives in Paris, France, and he spent most of his career in software, mainly for high performance computing and visualization companies, working for example for Cray Research and Silicon Graphics. He left the corporate world in 2001 after 33 years immersed into it. In 2002, he started a blog about technology trends and how they will affect our lives. This blog is now hosted by ZDNet, part of CNET Networks, under the name Emerging Technology Trends, and continues to explore the frontiers of science and technology. In 2005, Roland started another blog focused on why it makes sense for a company to use blogs, Blogs for Companies, which is temporarily on hold.





Roland was a thorough researcher and was a much appreciated voice in technology and futurist blogging.

ZDNet confirmed his passing with his wife.

Roland passed away Monday in Paris. He was hit with a digestive virus that led to a high fever and further health complications. His wife Suzanne said that the doctors are still trying to determine how Roland got the virus and the exact details.

Choice of High or Low End Roving Teleconference Robots

$30,000 will buy a high end life-sized video conferencing system on wheels, giving users a feeling of telepresence. It sports two 5-megapixel camera eyes, full voice telephony to speak and hear, and a laser pointer as a finger.



Users can control the sleek five-foot, 35-pound robot remotely from any web browser using nothing more than a mouse. The Wi-Fi-based system aims to let remote users amble around offices that could be across town or across the globe, interacting with fellow employees.


Rovio, by WowWee, is a WiFi-enabled mobile webcam that can see, speak and hear from anywhere in the world. This $300 system (available at Walmart and other places) was covered in the interview this site had with Faysal Sohail, managing partner of the funding venture capital firm CMEA. A separate roving robot with a monitor would not be that much more expensive. It seems it should be possible to upgrade the Rovio to about $1000 and get most of the features of the Anybots QA.





FURTHER READING
Anybots website



Rovio website


User hacked his Rovio to Add Better Lighting


Rovio Tech Specs
Rechargeable NiMH battery pack included
1 x charging dock with built-in TrueTrack beacon
3 x wheel motors
3 x omni-directional wheels
1 x camera motor
1 x VGA CMOS sensor
LED illumination
1 x speaker
1 x built-in microphone
USB connectivity
WiFi connectivity (802.11b and 802.11g)

January 08, 2009

Penn West Using Horizontal Multi-Stage Frac Drilling Outside of Bakken Oil Formation

A presentation by Penn West, a senior Canadian oil company, on its oil drilling plans [34 pages]. They are making a significant allocation to enhanced oil recovery.



Horizontal multi-stage fracturing can be 44% of the cost of conventional oil drilling approaches for certain oil formations. Horizontal multi-stage fracturing should draw billions of barrels of oil from older oil formations.

"What we're doing in the Pembina field will be transferable to a large percentage of conventional reservoirs in the Western Canadian Sedimentary Basin." Many older oilfields have unswept and high-pressure areas that appear to be strong candidates for horizontal injection.

Jeff Saponja, president and CEO of TriAxon Resources Ltd., points out that an agile junior producer can also participate in this technology-driven game. "When multi-stage fracs clearly became viable in the Bakken [tight oil] and Montney [tight gas] formations, we jumped early into evaluating the potential of this technology for more conventional reservoirs," the professional engineer says.

Saponja cites two primary operating risks in horizontal multi-stage hydraulic fracs. First, the frac equipment must be placed in position at the end of the wellbore without developing leaks, getting stuck, or settling short of the bottom. Once placed, 8 to 11 frac stages must be performed sequentially (typically at 200 m apart) without sanding off or screening out at any stage.

"The goal is to place 10 to 20 tonnes in a nice, uniform pack, enough proppant to sustain a good [oil] flow through the reservoir," Saponja says. In wells that typically cost $2 million to $3 million apiece, halting pumping operations so a coiled tubing unit can clean out problems can quickly render the operation uneconomic. Strengthened by success in the Bakken, TriAxon had the confidence to evaluate more than 100 conventional reservoirs across Alberta and Saskatchewan as candidates for the new frac technology.

"We looked for prospects with large oil in place, low primary recovery rates, and multiple stratified layers within the reservoir," the TriAxon founder says. Historically, many conventional stratified reservoirs have responded poorly to horizontal drilling because their natural porosity and permeability tend to be horizontal plane rather than vertical.

Reservoirs with underlying water drives should be avoided; water will flow upward into a horizontal wellbore if frac stimulation opens vertical paths in the rock.


Horizontal multi-stage fracturing has been essential for the recent boom in the development of the Bakken Oil Formation. Now companies like Penn West are applying it to older oil fields to enhance the oil recovery there at a lower cost than prior oil drilling methods.












FURTHER READING
Broader overview of enhanced oil recovery

The USA has 374 billion barrels of stranded oil.

Petrobank's THAI/CAPRI process for getting oil from oilsands and upgrading it underground.

Coal Ash Spill: 5+ Million Cubic Yards of Arsenic-Laced Sludge Spilled


About half a square mile is covered with sludge.

Coal ash is recycled into products such as cement or placed in secure landfills, but much of it ends up in gravel pits, abandoned mines and unlined landfills — or in ponds like the one that burst in Kingston, Tenn., on Dec. 22. In the Tennessee incident, 5.4 million cubic yards of sludge laced with arsenic and other toxic materials poured over 300 acres — making it one of the nation's worst environmental spills.

The EPA in 2000 decided that coal ash wasn't hazardous waste and left regulation up to the states. The Kingston Fossil Plant was the largest coal-burning power plant in the world when it began operating in 1955. The plant normally consumes about 14,000 tons of coal a day. There are about 600 coal ash disposal sites — about 45 percent of them surface ponds, and the rest landfills. There are about 300 surface ponds at electric power plants like the one in Tennessee.









Coal combustion waste is estimated at more than 129 million tons a year. The problem, the report notes, is that because of a lack of federal oversight, "we don't know where it goes."


FURTHER READING
There is coal ash and sludge from coal plants, and there is sludge from mountaintop removal mining. A mining sludge dam broke and killed 125 people at Buffalo Creek in 1972. Pictures and article through this link.

Wall Street Journal 2008 Technology Innovation Awards

The Wall Street Journal 2008 Technology Innovation Awards

Some of the winners have been covered on this site.

Nanocomp Technologies, maker of large sheets of carbon nanotubes, won in the materials category.





India's Tata, which makes the Tata Nano car, won in the transportation category.

Anti-Peak Oil: Large Scale Liquefied and Gasified Coal Competitive with Oilsand and Deepwater Oil

At large scale, the cost of getting a lot more oil is currently $200-400 billion for each additional 100 billion bbl. 100 billion barrels is about a 14-year supply of oil for the United States and about a 3.5-year supply for the world. The oil industry's pre-crash 2008 global exploration and development spending was $260 billion/year.
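A quick check of those supply-duration figures; the consumption rates are approximate 2008 values and are my assumptions, not from the cited article:

```python
# Cost per barrel and years of supply for each incremental 100 billion bbl.
reserves_added = 100e9                 # barrels per $200-400B spent
us_use, world_use = 7.3e9, 30e9        # bbl/year, approximate 2008 rates

print(f"Cost per barrel: ${200e9/reserves_added:.0f} to ${400e9/reserves_added:.0f}")
print(f"US supply:    {reserves_added/us_use:.1f} years")
print(f"World supply: {reserves_added/world_use:.1f} years")
# -> $2-4/bbl, ~13.7 years for the US, ~3.3 years for the world
```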

Recently there have been several major announcements in regard to coal liquefaction and coal gasification. Coal is getting cost competitive with oil from the oilsands and as a way to produce more natural gas. Natural gas is already generated from coal.

Pro: the scenario of civilization collapsing because of peak oil would be avoided for some number of decades, allowing more time for a transition to nuclear fission, nuclear fusion and renewable power.
Con: it is still fossil fuel, and while less polluting than solid coal, the products are still polluting (therefore deadly) and still a problem for climate change. Coal gas would potentially be four times or more better in terms of CO2/climate change.
Come up with something more profitable and better: there is big money (multi-billions) going down these paths. Alternatives have to be more profitable and better for the climate, with better-for-climate options made more profitable by new energy policy like carbon taxes and cap-and-trade.

The energy winner, and market share, is about money, profit and the insatiable need for liquid energy to power the machines of civilization. Coal is providing affordable answers at the needed scale. Hopefully other technology can step up for a better environment, but these coal options appear to be reaching the right cost and scale to be the stopgap, depending upon the timing of peak oil and the readiness of nuclear power and renewables at suitable cost and scale.

This site does not like that 3 million people die each year from fossil fuel pollution. This site would dislike the collapse of modern technological civilization even more as it would result in even more deaths.

1. South African company Sasol has a $10 billion project in Indonesia for an 80,000 barrel per day first stage in 2014-2015, escalating to 1.1 million barrels per day of oil replacement from coal at an as yet unknown point and cost. This price would be about 25% cheaper than prior, smaller scale coal-to-liquid efforts.
- Kentucky is considering a substantial coal to liquid effort with nuclear power providing the energy. It would take longer (2025) and would be smaller but probably cost more. This cost difference is in the technology and because projects cost more in the USA versus countries like Indonesia and China.

$200-400 billion for CTL at the Indonesia/Sasol project price would yield 2-4 million barrels per day (see the sketch below). The duration and number of barrels depend upon the coal reserves at the location. The amount of global and per-country coal reserves is disputed. Indonesia could probably produce 60-120 billion bbl of oil replacement from its likely coal reserves. The process is competitive with $35/barrel oil.
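A scaling sketch at the Sasol first-stage price (the ~$125,000 per daily barrel implied by $10B for 80,000 bpd; later stages would likely be cheaper per barrel, so this is conservative and illustrative):

```python
# Extrapolating CTL capacity from the Sasol/Indonesia first-stage cost.
unit_cost = 10e9 / 80_000              # ~$125,000 per bbl/day of capacity
for budget in (200e9, 400e9):
    capacity_bpd = budget / unit_cost
    print(f"${budget/1e9:.0f}B -> ~{capacity_bpd/1e6:.1f} million bbl/day")
# $200B -> ~1.6M bbl/day, $400B -> ~3.2M bbl/day, roughly the 2-4M bbl/day
# range above once later-stage cost reductions are assumed.
```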





Sasol will work with several local raw materials firms, including coal miner PT Bumi Resources Tbk., Jakarta, and state-owned oil firm PT Pertamina, to produce oil products from coal.

Initial output is projected at 80,000 b/d, eventually rising to 1 million b/d, according to the Department of Energy. Officials said the liquefaction plants will use low-priced lignite. "There will be no profitability problems so long as crude oil prices remain above $35/bbl." Indonesia has an estimated 36 billion tons of lignite, about 60% of the country's coal deposits.


Details on the new coal liquefaction processes

Prior projections (from 2006) of worldwide liquid coal from Newsweek are:

150,000 barrels a day today to 600,000 in 2020, and 1.8 million barrels a day in 2030

Will new Sasol deals and higher oil prices drive faster growth?

Coal prices charted through 2008

2. Several companies are turning coal into gas underground by understanding and tuning the natural microbes that already do this. This appears on track to be scaled up to massive industrial scale. Gas in coal formations is already 10% of US natural gas supply and is expected to increase to 30-50% of annual US supply. Gas from coal is potentially as environmentally good as regular natural gas. The current US rate of natural gas usage could be sustained for 10,000 years. Understanding how natural gas is formed in coal formations also allows the extraction process to be adjusted so as not to kill the valuable microbes that turn coal into gas, extending the life of gas wells in coal formations.

Enhanced Oil Recovery
The economics favor Enhanced Oil Recovery (EOR) today, especially at a time when oil prices are at stratospheric levels -- as are oil and gas companies' operating costs.

Sandrea estimates that EOR could add reserves at a capital expenditure of $2-4/bbl, compared with about $4-6/bbl for deepwater development, almost $13/bbl for acquisitions, and more than $14/bbl for overall global finding and development costs.

He estimated that industry would need to spend $200-400 billion to improve the world's average recovery rate by a single percentage point to recover that incremental 100 billion bbl. That compares with industry's current global E&D spending of $260 billion/year.

While deepwater and ultradeepwater exploration and development has garnered headlines with spectacular successes, Sandrea's study pointed to geological evidence that, to date, suggests the deep water is a play with limited prospectivity within a global offshore context.

Multi-stage fracturing is being used in oil wells outside the Bakken Formation, where it has been used to multiply reserves by orders of magnitude. Multi-stage fracturing could be used to get at a lot of the oil in abandoned wells in the US and other places. These processes and prospects will be reviewed in more detail in a separate article that will be available very soon.

New Ways to Leverage Nuclear Power and Coal

1. Kentucky is Considering Using Nuclear Power For Coal to Liquid Process

The Kentucky plan proposes a goal of 50 million tons of coal used per year to produce 4 billion gallons of liquid fuel per year by 2025 [roughly 95 million barrels per year, or about 260,000 barrels per day; see the conversion below].
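The bracketed conversion, worked explicitly (42 gallons per barrel is the standard conversion):

```python
# Converting the Kentucky plan's 4 billion gallons/year into barrels.
GAL_PER_BBL = 42
gal_per_year = 4e9
bbl_per_year = gal_per_year / GAL_PER_BBL      # ~95 million bbl/year
bbl_per_day = bbl_per_year / 365               # ~260,000 bbl/day
print(f"{bbl_per_year/1e6:.0f} million bbl/year, ~{bbl_per_day:,.0f} bbl/day")
```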

The plan proposes that Kentucky evaluate and deploy technologies for carbon management for use in 50 percent of coal-based energy applications.

The Kentucky plan is smaller and slower than the 1.1 million barrel per day Sasol coal-to-liquid project in Indonesia planned from 2015.

Idaho Samizdat indicates that a suggestion put forward by the Kentucky plan is that "a moderate investment" in nuclear power (eight plants at four sites) could be considered as part of a strategy to diversify Kentucky's future electrical energy portfolio and reduce emissions. Taking a hypothetical case of building eight 1,000 MW plants by 2025, that would require an investment of $28-35 billion over the next 17 years.

2. Turning More Underground Coal into Methane/Natural Gas Using Microbes
From MIT Technology Review: Luca Technologies, a startup based in Golden, CO, has raised $76 million to scale up a process that uses coal-digesting microorganisms to convert coal into methane. The process is designed to operate underground, inside coal mines. Methane, the key component of natural gas, can then be pumped out and used to generate electricity or power vehicles. The company has tested its methods in coal beds where wells had been drilled to collect natural gas (about 10 percent of the natural gas mined in the United States comes from coal beds). Many of these wells had stopped producing natural gas, or produced too little to be profitable. After treatment, production increased, and the wells became profitable again

If the process proves economical, it could help reduce carbon-dioxide emissions, since burning natural gas releases half as much carbon dioxide as does burning coal. It could also reduce or eliminate the anticipated need to import natural gas in the future, says Gary Stiegel, the technology manager for gasification at the National Energy Technology Laboratory, in Philadelphia. As little as one-hundredth of 1 percent of the coal in the United States converted into methane by microbes would supply the country's current annual natural-gas demand. US coal reserves would thus provide a 10,000-year supply of natural gas at today's usage rate.
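The 10,000-year figure follows directly from the one-hundredth-of-one-percent claim; a minimal sketch (ignoring demand growth and practical recovery limits):

```python
# If 0.01% of the US coal resource covers one year of natural gas demand,
# the whole resource covers 1/0.0001 years of demand.
fraction_per_year = 0.01 / 100          # one-hundredth of one percent
years_of_supply = 1 / fraction_per_year
print(f"{years_of_supply:,.0f} years of gas at current usage")   # 10,000
```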




3. Russia's Gazprom cut back gas supplies through Ukraine; the resulting shortage in Europe has prompted calls to re-open shut-down nuclear power plants.

Bulgarian President Georgi Parvanov said work would start immediately to prepare to restart the third reactor at the Kozloduy nuclear power plant.

Slovakia's prime minister Robert Fico said he "could imagine" the re-opening of a shut-down reactor at Jaslovske Bohunice.


4. 2008: Three reactors shut, ten more begin construction

One of the reactors proposed for restart in item 3 above is among the reactors shut down in December 2008.

The construction of ten other nuclear power reactors commenced during 2008, mainly in Asia, but also in Russia. In China, construction began on six new units: Hongyanhe 1, Fuqing 1, Ningde 1 and 2, Yangjiang 1, and Fangjiashan 1. Construction also started on Shin Wolsong 2 and Shin Kori 1 in South Korea. In Russia, building of two new units began: Leningrad II-1 and Novovoronezh II-1. Together, these units have boosted the total number of new reactors under construction worldwide to 43 (with a total capacity of 37.6 GWe), up from 33 (26.6 GWe) a year earlier.

Construction resumed on Slovakia's long-stalled Mochovce 3 and 4 reactors. The reactors will add 440 MWe each to Slovakian generating capacity and restore its status as an electricity exporter when complete in 2012 and 2013.

Slovakia has maintained that the reactors were safe and could have continued operating for at least another 10-15 years. The 408 MWe Bohunice 2 reactor - the second unit of the V1 plant - in Slovakia was shut down on 31 December as a condition of the country's accession to the European Union (EU).

5. Construction of two CPR-1000 reactors was started in China, providing another 2,160 MWe. The dates scheduled for the start of their commercial operation are December 2013 and October 2014.

January 07, 2009

BC Business Has Extensive D-Wave Systems Quantum Computer Coverage

BC Business has extensive coverage of D-Wave Systems [the quantum computer maker].

Thirty generations of quantum computing machines have been made by D-Wave Systems Inc., the maker of the world's first commercial quantum computers.

DFJ backed such projects as Skype and Hotmail, and started investing in D-Wave in 2003. Jurvetson, who now sits on D-Wave’s board, believes its machines will leave conventional supercomputers in the dust within five years. “Almost all the big winners in the high-tech field seem crazy at first, so the fact that this is an unusual technology right now is a big draw for us. Especially a commercial one like this that has the capability of being more powerful, more flexible and have much more longevity than any computer we’ve seen before.”

The company is currently fabricating 128-qubit chips, which they claim will be about 100 times faster than an off-the-shelf $5,000 conventional computer for solving certain tricky computing problems. But in order for these kinds of quantum machines to become exponentially faster than today’s conventional computers, they will need to scale their technology up to thousands or even millions of qubits. D-Wave plans to have a 1,000-qubit system operating by the end of 2009 that would bump the technology out of the R&D phase and into the real world – appealing to a variety of corporations, including Internet search engines, banks, investment firms and insurance brokers, as well as logistics, travel and pharmaceutical companies. A few dozen academics in robotics and bioscience, and a handful of corporations, including industry goliath Google Inc., are already using D-Wave’s quantum machines.
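The article does not give a concrete example of the "tricky computing problems" involved, but D-Wave's adiabatic machines target optimization problems that can be cast in QUBO (quadratic unconstrained binary optimization) form. Here is a toy illustration of that problem class, solved by classical brute force; the instance is arbitrary and this is not D-Wave's API:

```python
# Toy QUBO instance: minimize x^T Q x over binary vectors x. Quantum
# annealers aim to find low-energy x for much larger Q than brute force
# can handle; this tiny example just shows the problem shape.
from itertools import product

Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1, (0, 1): 2, (1, 2): 2}  # arbitrary demo

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))    # (1, 0, 1) with energy -2
```

An exhaustive search like this doubles in cost with every added variable, which is why hardware that natively settles into low-energy states of large Q matrices is interesting.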




The four most promising [types of quantum computers] systems developed so far have used trapped ions, electrons in semiconductors, photons or superconductors. D-Wave chose to go the superconducting route: cooling superconducting metal – in D-Wave’s case, loops of mostly niobium – to nearly absolute zero to cause the quantum behaviour.

Meanwhile, D-Wave is facing its own qubit-related constraints issues, as the company’s objective to “go up to millions or tens of hundreds of millions of qubits” butts up against the physical restriction presented by chip size. According to Rose, even these thumbnail-sized qubits are quite large, and shrinking them makes it more difficult to couple them to other qubits and other necessary devices. D-Wave has scaled up through 30 generations of processors to get from 16 qubits to 128 with its newest chip, which the company will soon begin manufacturing in-house. “We can fit roughly 2,000 qubits on our current processor, which is about the limit of where we can go with the current design,” admits Rose. “After that’s achieved, we need to have some other method of going to larger numbers. So the next step in the redesign – or the evolution of the technology – is getting to millions of qubits.”

Unless, that is, D-Wave runs out of investor money first. “I've lived through many economic crises, so the next year could be tough for D-Wave,” admits Farris. “Corporate America and the IBMs of the world have said, ‘Everybody will be knocking on your door when you get to 500 qubits.’ But that might change in this economy.”

Blacklight Power Response to Eli Rabett

Eli Rabett claims that a pure standard chemical reaction can explain Blacklight Power's energy generation.

UPDATE: Dr. Mills has responded to an analysis by Rabett:

It is not a question of more accurate heats of formation. They are accurate to at least three significant figures. It is a question of using the correct ones. That was not done, as evidenced by the difference between Rabett's erroneous calculated nickel hydride decomposition energy of "Net heat of reaction per mole of H2 generated = 2*240 kJ/mol - 436 kJ/mol - 204 kJ/mol = -160 kJ/mol (an exothermic reaction)" and the experimental result of +8.8 kJ/mole H2 [B. Baranowski, S. M. Filipek, "45 years of nickel hydride - history and perspectives", Journal of Alloys and Compounds, 404-406, (2005), pp. 2-6.]. This post should be redacted at each site that it is posted. Presuming Rabett is really a professor, I'm sure such an obvious and fundamental mistake would not be tolerated.


Blacklight Power continues to sign commercial contracts with small energy utilities.

Mills says BlackLight has operated the reactor continuously for two hours and that it’s investigating a new type of fuel that yields 10 times as much energy per weight as the sodium hydroxide–doped Raney nickel. He insists the company has disclosed the experiment in detail in a paper available on its Web site, only retaining “some know-how in order to maintain our technical lead.” He says BlackLight is “open to host validators” and is “willing to supply the fuel under an academic license or commercial license.” Eventually, he contends, others will be able to make the fuel themselves.

Pilot plants projected for mid- to late 2009.


So the regular chemistry theory would have an even tougher time explaining ten times as much energy by weight as the sodium hydroxide–doped Raney nickel work.






FURTHER READING
What if Blacklight Power works in 2009?

MIT Writes Positively About Scaling Carbon Capture and Storage and Nuclear Power to Address CO2 Problem


Between 11,000 and 23,000 miles of dedicated CO2 pipeline would need to be laid in the United States before 2050, according to PNNL's estimates, in addition to the 3,900 miles already in place (which carry mostly naturally occurring CO2 used to stimulate production from aging oil wells).

The MIT Future of Coal 2007 report estimated that capturing all of the roughly 1.5 billion tons per year of CO2 generated by coal-burning power plants in the United States would generate a CO2 flow with just one-third of the volume of the natural gas flowing in the U.S. gas pipeline system.



There is identified capacity to store 3.9 trillion tons of CO2, mostly in saline formations in the USA. There are about 1,700 large sources of CO2 generation, which produce 2.9 billion tons of CO2 per year. In comparison, worldwide human CO2 generation is estimated at 26 billion tons per year. Currently only a few million tons per year of CO2 are being captured and stored. The current world nuclear power fleet of 372 GW (producing 2,608 billion kWh) is preventing about 2 billion tons of CO2 per year relative to the same power being provided by coal plants.
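A sanity check on the avoided-CO2 figure; the 0.9-1.0 kg CO2 per kWh coal emission factor is a typical range and is my assumption, not from the cited study:

```python
# CO2 avoided if 2,608 billion kWh/year of nuclear generation displaced coal.
nuclear_kwh = 2608e9
for kg_per_kwh in (0.9, 1.0):
    avoided_gt = nuclear_kwh * kg_per_kwh / 1e12   # 1e12 kg = 1 billion tons
    print(f"{kg_per_kwh} kg CO2/kWh -> {avoided_gt:.1f} billion tons/year")
# -> 2.3 to 2.6 billion tons/year, consistent with the ~2 billion ton figure.
```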







A recent MIT article and MIT's Joint Program on the Science and Policy of Global Change indicate, based on an assessment of currently available technology and pricing (2007 prices and expected cap-and-trade or carbon taxes), that the two top methods out to 2050 for reducing human-generated CO2 to 80% of 1990 levels are nuclear power and carbon sequestering. So the 2100 disaster would be avoided by 2050 even using a combination of existing technology and policy. The pricing policies mainly hit coal prices.

The MIT/PNNL studies are saying: make a massive carbon sequestering effort on the order of 100-500 billion tons over 40 years. With perhaps 0.1% annual leakage, that would be a 500 million ton per year leak (in the 500 billion ton case), while you would be putting 15 billion tons per year [worldwide] into the ground or someplace else - a 3% penalty to overcome at that point, for the leak versus the annual amount stored (the arithmetic is worked below). It would be a constant effort while the time is taken to shift to a de-carbonized energy infrastructure. In 2100, there would be about 1.5 trillion tons in the ground with a 1.5 billion ton per year leak. You could store 15 trillion tons before 15 billion tons per year of injection would only be adequate to offset the leak. In the meantime, 500-1000 years would have gone by. I don't see how any society that had gone that far would not have taken care of total energy source de-carbonization by 2100 at the latest.
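The leak-versus-injection arithmetic, made explicit (the 0.1%/year leak rate and 15 billion tons/year injection are the assumptions stated above):

```python
# Annual leakage as a share of annual injection at various stored totals.
leak_rate = 0.001            # fraction of stored CO2 leaking per year (assumed)
injection = 15e9             # tons stored per year (worldwide, from the text)

for stored in (500e9, 1.5e12, 15e12):
    leak = stored * leak_rate
    print(f"{stored/1e9:>6,.0f} Gt stored -> {leak/1e9:4.1f} Gt/yr leak "
          f"({100*leak/injection:.0f}% of annual injection)")
# 500 Gt -> 0.5 Gt/yr (3%); 1,500 Gt -> 1.5 Gt/yr (10%); at 15,000 Gt the
# leak equals the 15 Gt/yr injection rate and storage only breaks even.
```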

I think it is something that works now but which is a stopgap until the better stuff is spun up; even so, what we have now will at least prevent the worst-case scenario.

Building analogy
So a worst case scenario is that we can't get better stuff working and have to scale up what we have, patching our problem with an expensive and relatively boneheaded solution, but a solution that would work. The worst case scenario is not that we fail completely and all die. The analogy: you cannot leave your multi-story apartment building for 90 years, and a layer of dirt is constantly falling on the roof; if nothing is done, the roof will collapse in 90 years and you and your neighbors will all die. Carbon sequestering is the tenants shoveling off the dirt for a few hundred or thousand years. The CO2 that is in the air stays there for 3000 years.

So CCS is not the best hope. It is a hope, and if we don't get something better, then we just sweep the CO2 under the ground. It is just better than renewables like solar and wind in their current condition, projecting that condition forward unchanged for forty years.

Carbon sequestering is more a solution that results from economics and carbon taxes than from best science. Economics and policy are real things in our civilization though. It is also more of a peak oil delaying thing.

A better plan, which is very different:

80% of money [energy infrastructure spending] and effort on technology that is ready now and deploying it.
20% of money and effort on developing better technology. Spreading it around on as many bets as efficiently as possible. A DARPA of energy looking to prove out home run technology.

The 100% of money is the $2-4 trillion per year that will be going into energy infrastructure worldwide, with $200-600 billion per year from and for the USA.

A significant portion of the 20% should go to factory mass-produced deep burn fission. $20-40 billion seems certainly enough to develop it and to initiate build-out: design and prove out a factory mass-producible variant of the liquid fluoride thorium reactor (like the Fuji MSR). That is not even one year of what would be the US portion of 20% of energy infrastructure spending, leaving plenty to fund many different approaches to nuclear fusion and other technology bets.

Repulsive Casimir Force: Casimir-Lifshitz force experimentally verified


A repulsive quantum force, the opposite of the attractive Casimir force, has been verified and measured. This is the cover story [Measured long-range repulsive Casimir-Lifshitz forces] of the Jan 8, 2009 issue of the journal Nature, from Harvard researchers J. N. Munday, Federico Capasso and V. Adrian Parsegian.

Controlling Casimir forces is potentially a huge capability, and DARPA has been funding work on it.

Last year, a small-scale comb structure allowed the Casimir force to be reduced by 30 to 40%.

Sufficient control of the Casimir force could enable breakthroughs in space propulsion, energy extraction from the vacuum and highly efficient energy conversion. It could also make nanoscale machines work better, with less or more friction as needed. Ultra-low friction bearings are another high-potential application.

"When two surfaces of the same material, such as gold, are separated by vacuum, air, or a fluid, the resulting force is always attractive," explained Capasso.

The scientists replaced one of the two metallic surfaces immersed in a fluid with one made of silica, the force between them switched from attractive to repulsive.

Note: it is theorized that metamaterials can reverse and control the amount of Casimir force.

The experimental verification that a bizarre quantum effect — the Casimir force — can manifest itself in its repulsive form is pivotal not only for fundamental physics but also for nanotechnology.

In 1948, Hendrik Casimir predicted that two uncharged, perfectly conducting plates in a vacuum would be attracted to each other because of quantum fluctuations in the vacuum's electromagnetic field between the plates. Generalized for real materials by Evgeny Lifshitz in 1956, Casimir's prediction has been verified many times and is now known as the Casimir-Lifshitz (C-L) force.


The Nature editors have summarized this quantum levitation effect:

Space is not completely empty; the vacuum teems with quantum mechanical energy fluctuations able to generate an attractive force between objects that are very close to each other. This 'Casimir–Lifshitz' force can cause static friction or 'stiction' in nanomachines, which must be strongly reduced. Until now only attractive interactions have been reported but in theory, if vacuum is replaced by certain media, Casimir–Lifshitz forces should become repulsive. This has now been confirmed experimentally. Repulsion, weaker than the attractive force, was measured in a carefully chosen system of interacting materials immersed in fluid. The magnitude of both forces increases as separation decreases. The repulsive forces could conceivably allow quantum levitation of objects in a fluid and lead to new types of switchable nanoscale devices with ultra-low static friction. Levitation depends only on the dielectric properties of the various materials.




Six pages of supplemental information on the experiments are here.

Interstellartech Corp: Trying to use Casimir force to extract power

Fabrizio Pinto published in the Journal of Physics A: Mathematical and Theoretical on Membrane actuation by Casimir force manipulation.

Fabrizio Pinto, part of Interstellar Technologies Corporation, has been looking into creating an engine by making use of the Casimir force. No Casimir force-based engine cycle could be devised if one assumed a constant Casimir force.

Areas of emphasis are:

1. Casimir force modulation; [now demonstrated by the University of Florida]
2. Repulsive Casimir force; [Prof Ulf Leonhardt and Dr Thomas Philbin 2007 report on theory and now this experimental work]
3. Lateral Casimir force;
4. Casimir force amplification
5. Energy issues in relation to the quantum vacuum.

One can implement a Casimir system engine cycle to transform thermal or optical energy into mechanical or electrical energy.

The Interstellar Tech Corp proposal for the Transvacer device describes a Casimir force-based engine where zero-point energy is transformed into mechanical energy.

NASA study from 2004 on Casimir force space propulsion
A 57-page study, "Study of Vacuum Energy Physics for Breakthrough Propulsion"

G. Jordan Maclay, Quantum Fields LLC, Wisconsin

Jay Hammer and Rod Clark, MEMS Optical, Inc. Alabama

Michael George, Yeong Kim, and Asit Kir, University of Alabama

4. Gedanken Vacuum Powered Spacecraft (on page 30)

A Gedanken spacecraft is described that is propelled by means of the dynamic Casimir effect, which describes the emission of real photons when a conducting surface is moved in the vacuum with a high acceleration. The maintenance of the required boundary conditions at the moving surface requires the emission of real photons, sometimes described as the excitation of the vacuum. The recoil momentum from the photon exerts a force on the surface, causing an acceleration. If one imagines the moving surface is attached to a spacecraft, then the spacecraft will experience a net acceleration. Thus we have a propellantless spacecraft. However, we do have to provide the energy to operate the vibrating mirror. In principle, it is possible to obtain this power from the quantum vacuum, and this possibility is explored. Unfortunately with the current understanding and materials, the acceleration due to the dynamic Casimir effect is very small, on the edge of measurability. One of the objectives in this paper is to demonstrate that some of the unique properties of the quantum vacuum may be utilized in a gedanken spacecraft. We have demonstrated that it is possible, in principle, to cause a spacecraft to accelerate due to the dissipative force an accelerated mirror experiences when photons are generated from the quantum vacuum.

Further we have shown that one could in principle utilize energy from the vacuum fluctuations to operate such a vibrating mirror assembly. The application of the dynamic Casimir effect and the static Casimir effect may be regarded as a proof of principle, with the hope that the proven feasibility will stimulate more practical approaches exploiting known or as yet unknown features of the quantum vacuum. A model gedanken spacecraft with a single vibrating mirror was proposed which showed a very unimpressive acceleration due to the dynamic Casimir effect of about 3×10^-20 m/s^2, with a very inefficient conversion of total energy expended into spacecraft kinetic energy. Employing a set of vibrating mirrors to form a parallel plate cavity increases the output by a factor of the finesse of the cavity, 10^10, yielding an acceleration per square meter of plate area of about 3×10^-10 m/s^2 and a conversion efficiency of about 10^-16. After 10 years at this acceleration, a one square meter spacecraft would be traveling at 0.1 m/s. Although these results are rather unimpressive, it is important to remember this is a proof of principle, and not to take our conclusions regarding the final velocity in our simplified models too seriously. The choice of numerical parameters is a best guess based on current knowledge and can easily affect the final result by 5 orders of magnitude.
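The quoted numbers check out dimensionally; a minimal verification sketch (seconds-per-year rounded, all other figures from the study text above):

```python
# Dynamic-Casimir gedanken spacecraft: cavity finesse boosts the single-mirror
# acceleration, and ten years of thrust yields roughly 0.1 m/s.
SECONDS_PER_YEAR = 3.156e7
a_single = 3e-20                     # m/s^2, single vibrating mirror
finesse = 1e10                       # quoted cavity finesse
a_cavity = a_single * finesse        # 3e-10 m/s^2 per m^2 of plate area
v_10yr = a_cavity * 10 * SECONDS_PER_YEAR
print(f"a = {a_cavity:.0e} m/s^2, v after 10 years = {v_10yr:.2f} m/s")  # ~0.09
```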


January 06, 2009

Blacklight Power has signed a Second Commercial Deal. This deal is with Farmers' Electric

BlackLight Power (BLP) Inc. today announced its second commercial license agreement with Farmers’ Electric Cooperative, Inc. of New Mexico, (Farmers’ Electric). In a non-exclusive agreement, BLP has licensed Farmers’ Electric to use the BlackLight Process and certain BLP energy technology for the production of thermal or electric power. Farmers’ Electric may produce gross thermal power up to a maximum continuous capacity of 250 MW or convert this thermal power to corresponding electricity.

About Farmers’ Electric Cooperative, Inc. of New Mexico
Formed in 1937, Farmers’ Electric serves rural consumers surrounding Texico, Clovis, and Tucumcari; and the communities of Melrose, Fort Sumner, Santa Rosa, Conchas Dam, House, Grady, San Jon and Logan with over 4,200 miles of energized lines.


Blacklight Power critics charge that the company and its staff are scammers and frauds and their science is bogus.

UPDATE:
IEEE Spectrum has three pages of online review of Blacklight Power.

“I would say without reservation that if Mills were proved right, it would revolutionize physics and solve the world’s energy problems overnight, and he would easily win a Nobel Prize and become a multibillionaire,” says John Connett, a mathematician at the University of Minnesota, in Minneapolis, who’s tracked Mills’s ideas for several years. “But extraordinary claims require extraordinary proof, and at this point it appears to me that the proof side of the equation is very sadly lacking.”

Mills says BlackLight has operated the reactor continuously for two hours and that it’s investigating a new type of fuel that yields 10 times as much energy per weight as the sodium hydroxide–doped Raney nickel. He insists the company has disclosed the experiment in detail in a paper available on its Web site, only retaining “some know-how in order to maintain our technical lead.” He says BlackLight is “open to host validators” and is “willing to supply the fuel under an academic license or commercial license.” Eventually, he contends, others will be able to make the fuel themselves.

Pilot plants projected for mid- to late 2009.


Mills/Blacklight Power: frauds, or multi-billionaire Nobel prize-winning world energy solvers? 2009 and 2010 will tell the tale. 2011 should have a compelling movie based on either outcome.

This site notes that they have $50 million or more in private money backing them. 2009 seems to be when they will be proving whether they can generate power on a commercial basis at a revolutionary cost with revolutionary technology. If this is not proved to be true and the critics are correct, then only the private funders and any potential new funders will be losing their money.

Previous Deal and Information
BlackLight Power (BLP) Inc. today announced its first commercial license agreement with Estacado Energy Services, Inc. in New Mexico, a wholly-owned subsidiary of Roosevelt County Electric Cooperative, (Estacado). In a non-exclusive agreement, BLP has licensed Estacado to use the BlackLight Process and certain BLP energy technology for the production of thermal or electric power. Estacado may produce gross thermal power up to a maximum continuous capacity of 250 MW or convert this thermal power to corresponding electricity.

Background
- Blacklight Power has provided information and assistance to a blogger/chemistry professor looking to validate their process

- Venture Beat investigates Blacklight Power

- Rowan University study provides external confirmation of a substantial amount of extra heat from Blacklight Power materials.

- Blacklight Power Claims

The latest expected unit costs for the Blacklight power system compared to current energy technology:



The Blacklight hydrogen production plant diagram

Potential Applications for Blacklight Power Technology
- H2(1/p) Enables laser at wavelengths from visible to soft X-ray
- VUV photolithography (Enables next generation chip)
- Blue Lasers
- Line-of-sight telecom and medical devices
- High voltage metal hydride batteries
- Synthetic thin-film and single crystal diamonds
- Metal hydrides as anticorrosive coatings







Estacado is a wholly-owned subsidiary of Roosevelt County Electric Cooperative, (RCEC) in New Mexico. With over 2,757 miles of energized lines in east central New Mexico, RCEC serves Dora, Elida, Floyd, Arch, Rogers, Milnesand, Causey and Portales.


FURTHER READING
Details of Blacklight Power's patent dispute in the UK.

In upholding both of the examiner's objections, the Hearing Officer identified the question which he had to address to be whether the underlying theory of GUTCQM was true. In doing so, he identified three criteria which he had to consider in determining whether a scientific theory was true, namely whether:

- the explanation of the theory is consistent with existing generally accepted theories; if it is not, it should provide a better explanation of physical phenomena than current theories and should be consistent with any accepted theories that it does not displace;

- the theory makes testable predictions, and the experimental evidence shows rival theories to be false and matches the predictions of the new theory; and

- the theory is accepted as a valid explanation of physical phenomena by the community of scientists who work in the relevant discipline.

Critically, the hearing officer went on to determine that he must satisfy himself that it was more probable than not that the theory was true. On this basis, the Hearing Officer found that he was not satisfied that the theory was true and therefore the claims in the applications which relied upon the theory were not patentable.

The appeal focused on whether the Hearing Officer had been right in considering the appropriate test to be whether the theory was true on the balance of probabilities. Blacklight contended that the test that should be applied is whether the theory is clearly contrary to well established physical laws. In considering this, the examiner should assess whether the applicant has a reasonable prospect of showing that his theory is a valid one should the patent be litigated in court. In making these arguments, Blacklight accepted that on the material before the Hearing Officer the theory was probably incorrect.


Examiner has an article on Blacklight Power

January 05, 2009

Focus Fusion: $1.2 Million Two-Year Nuclear Fusion Project


Lawrenceville Plasma Physics, the group looking to develop dense plasma focus fusion (focus fusion), has provided details of $1.2 million in funding and the project plan.

This approach to nuclear fusion has been covered before on this site

Lawrenceville Plasma Physics Inc., a small research and development company based in West Orange, NJ, has announced the initiation of a two-year-long experimental project to test the scientific feasibility of Focus Fusion: controlled nuclear fusion using the dense plasma focus (DPF) device and hydrogen-boron fuel. Hydrogen-boron fuel produces almost no neutrons and allows the direct conversion of energy into electricity. The goals of the experiment are: first, to confirm the achievement of the high temperatures first observed in previous experiments at Texas A&M University; second, to greatly increase the efficiency of energy transfer into the tiny plasmoid where the fusion reactions take place; third, to achieve the high magnetic fields needed for the quantum magnetic field effect, which will reduce cooling of the plasma by X-ray emission; and finally, to use hydrogen-boron fuel to demonstrate greater fusion energy production than energy fed into the plasma (positive net energy production).

The experiment will be carried out in an experimental facility in New Jersey using a newly-built dense plasma focus device capable of reaching peak currents of more than 2 MA. This will be the most powerful DPF in North America and the second most powerful in the world. For the millionth of a second that the DPF will be operating during each pulse, its capacitor bank will be supplying about one third as much power as all electric generators in the United States.
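The pulse-power comparison sounds dramatic but implies only a modest stored energy; a hedged sketch (the ~450 GW US average generation figure is my assumption based on roughly 4,000 TWh/year of 2008 generation, not from the press release):

```python
# Power is energy per time: a huge pulse power over a microsecond is a
# small energy, well within reach of a laboratory capacitor bank.
us_avg_power = 450e9                  # W, approximate US average generation
pulse_power = us_avg_power / 3        # "one third as much as all US generators"
pulse_time = 1e-6                     # seconds
energy_j = pulse_power * pulse_time
print(f"~{pulse_power/1e9:.0f} GW for 1 microsecond = ~{energy_j/1e3:.0f} kJ")
```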







A small team of three plasma physicists will perform the experiments: Eric Lerner, President of LPP; Dr. XinPei Lu; and Dr. Krupakar Murali Subramanian. Mr. Lerner has been involved in the development of Focus Fusion for over 20 years. Dr. Lu is currently Professor of Physics at HuaZhong Univ. of Sci. & Tech., Wuhan, China, where he received his PhD in 2001. He has been working in the field of pulsed plasmas for over 14 years and is the inventor of an atmospheric-pressure cold plasma jet. Dr. Subramanian is currently Senior Research Scientist, AtmoPla Dept., BTU International Inc., in N. Billerica, Massachusetts. He worked for five years on the advanced-fuel Inertial Electrostatic Confinement device at the University of Wisconsin, Madison, where he received his PhD in 2004 and where he invented new plasma diagnostic instruments.

To help in the design of the capacitor bank, LPP has hired a leading expert in DPF design and experiment, Dr. John Thompson. Dr. Thompson has worked for over twenty years with Maxwell Laboratories and Alameda Applied Sciences Corporation to develop pulsed power devices, including DPFs and diamond switches.

The $1.2 million for the project has been provided by a $500,000 investment from The Abell Foundation, Inc, of Baltimore, Maryland, and by additional investments from a small number of individuals.

The basic technology of LPP’s approach is covered by a patent application, which was allowed in full by the US Patent Office in November. LPP expects the patent to be issued shortly.


FURTHER READING
Technical details

The dense plasma focus device consists of two cylindrical copper or beryllium electrodes nested inside each other. The outer electrode is generally no more than 6-7 inches in diameter and a foot long. The electrodes are enclosed in a vacuum chamber with a low pressure gas filling the space between them.

A pulse of electricity from a capacitor bank (an energy storage device) is discharged across the electrodes. For a few millionths of a second, an intense current flows from the outer to the inner electrode through the gas. This current starts to heat the gas and creates an intense magnetic field. Guided by its own magnetic field, the current forms itself into a thin sheath of tiny filaments; little whirlwinds of hot, electrically-conducting gas called plasma. A picture of these plasma filaments is shown below along with a schematic drawing.

This sheath travels to the end of the inner electrode where the magnetic fields produced by the currents pinch and twist the plasma into a tiny, dense ball only a few thousandths of an inch across called a plasmoid. All of this happens without being guided by external magnets.

The magnetic fields very quickly collapse, and these changing magnetic fields induce an electric field which causes a beam of electrons to flow in one direction and a beam of ions (atoms that have lost electrons) in the other. The electron beam heats the plasmoid to extremely high temperatures, the equivalent of billions of degrees C (particle energies of 100 keV or more).
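The temperature-to-energy equivalence is standard plasma physics: 1 eV of particle energy corresponds to about 11,605 K, so 100 keV is indeed billions of degrees:

```python
# Converting particle energy to an equivalent temperature (1 eV ~ 11,605 K).
K_PER_EV = 11_605
for energy_kev in (100, 200):
    temp_k = energy_kev * 1_000 * K_PER_EV
    print(f"{energy_kev} keV ~ {temp_k:.2e} K")
# 100 keV ~ 1.16e9 K, 200 keV ~ 2.32e9 K
```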

The collisions of the electrons with the ions generate a short pulse of highly-intense X-rays. If the device is being used to generate X-rays for our X-ray source project, conditions such as electrode sizes and shapes and gas fill pressure can be used to maximize X-ray output.

If the device is being used to produce fusion energy, other conditions can minimize X-ray production, which cools the plasma. Instead, energy can be transferred from the electrons to the ions using the magnetic field effect. Collisions of the ions with each other cause fusion reactions, which add more energy to the plasmoid. So in the end, the ion beam contains more energy than was input by the original electric current. (The energy of the electron beam is dissipated inside the plasmoid to heat it.) This happens even though the plasmoid only lasts 10 ns (billionths of a second) or so, because the very high density in the plasmoid, which is close to solid density, makes collisions very likely and extremely rapid.

The ion beam of charged particles is directed into a decelerator, which acts like a particle accelerator in reverse: instead of using electricity to accelerate charged particles, it decelerates them and generates electricity. Some of this electricity is recycled to power the next fusion pulse, while the excess (net) energy is the electricity produced by the fusion power plant. Some of the X-ray energy produced by the plasmoid can also be directly converted to electricity through the photoelectric effect (like solar panels).

The DPF has been in existence since 1964, and many experimental groups around the world have worked with it. LPP’s unique theoretical approach, however, is the only one that has been able to fully explain how the DPF works, and thus exploit its full capabilities.


Micronutrient Deficiency Problems for Developed and Developing Countries

Developing countries have a problem of insufficient iodine which degrades intelligence by 10-15 percent. This is mostly solved by adding iodine to salt in the developed world.

Most people are aware of the importance of getting enough calcium, which remains a widespread problem. Common micronutrient deficiencies are zinc, magnesium, iron, folic acid, and iodine. A Swiss study also indicates the problem.

If everyone had optimal levels of micronutrients, then the IQ of over half of the world's population would be increased by up to 20 IQ points (with enough iodine and zinc). Energy levels, productivity and health would also be improved. Preventing brain damage from pollution such as lead would also help. Increased IQ provides economic benefits and reduced crime levels. Other kinds of drugs and materials are also likely to be found to have intelligence, health and productivity enhancing effects. There are drugs and substances in food that help with concentration and memory. Significant human cognitive enhancement is not a far-out transhumanist concept. Even more is possible with cybernetic approaches, brain-computer interfaces, stem cells and gene therapy.

People in the developed world still have widespread deficiency of magnesium.

Magnesium is a must. The diets of all Americans are likely to be deficient... Even a mild deficiency causes sensitiveness to noise, nervousness, irritability, mental depression, confusion, twitching, trembling, apprehension, insomnia, muscle weakness and cramps in the toes, feet, legs, or fingers.

Magnesium (Mg) is a trace mineral that is known to be required for several hundred different functions in the body. A significant portion of the symptoms of many chronic disorders are identical to symptoms of magnesium deficiency. Studies show many people in the U.S. today do not consume the daily recommended amounts of Mg. A lack of this important nutrient may be a major factor in many common health problems in industrialized countries. Common conditions such as mitral valve prolapse, migraines, attention deficit disorder, fibromyalgia, asthma and allergies have all been linked to a Mg deficiency.


Diet can be modified to get more magnesium. However, studies have shown that food alone may not be enough to achieve optimal micronutrient levels.

The first step, of course, is to basically just eat more magnesium rich foods, especially beans, nuts and vegetables. Vegetables are especially good if you are watching your weight because you can ingest a lot of magnesium for a relatively small number of calories. Calcium is a magnesium antagonist. As such, drinking too much milk or eating too many other calcium rich foods in relation to Mg containing foods may lower magnesium levels.

Supplementation problem: magnesium is an alkaline mineral and a common ingredient in antacids. We've noticed in my family that taking magnesium supplements for more than a day or two can sometimes cause cramping and diarrhea.


Food alone, in all 20 subjects, did not meet the minimal Recommended Daily Allowance (RDA) micronutrient requirements for preventing nutrient-deficiency diseases. The more active the person, the greater the need to employ a variety of balanced micronutrient-enriched foods, including micronutrient supplementation, as a preventative protocol against these observed deficiencies.


Bruce Ames has developed a low-calorie bar to top up micronutrient levels more effectively than vitamin pills and other supplements, making it easy for people to achieve proper micronutrient levels. It should have some limited commercial availability in 2009.



Folic acid deficiency can lead to neural tube closure defects (NTDs) and anemia.

Zinc deficiency affects immune function, contributing to as many as 800,000 child deaths per year.

In spite of the proven benefits of adequate zinc nutrition, approximately 2 billion people still remain at risk of zinc deficiency.

The most vulnerable population groups are infants, young children, and pregnant and lactating women because of their elevated requirements for this essential nutrient.


The World Health Organization (WHO) has categorized iron deficiency as one of the top ten most serious health problems in the modern world.

Iron deficiency anemia (IDA):

-Impairs the mental development of over 40% of the developing world's infants and reduces their chances of attending or finishing primary school

-Decreases the health and energy of approximately 500 million women and leads to approximately 50,000 deaths in childbirth each year

-Is complex because it requires increased iron intake at critical stages of the life- cycle - before and during pregnancy and throughout early childhood

-Various tests of cognitive and psychomotor skills associate lack of iron during infancy and early childhood with significant levels of disadvantage, affecting IQ scores by as much as 5 to 7 points.


Iodine deficiency is the leading preventable cause of brain damage and it can significantly lower the IQ of whole populations.

Cognitive enhancement
This site has looked at cognitive enhancement before, often alongside performance enhancement.

FURTHER READING
There was a completed phase 2 clinical trial of multiple micronutrient biscuits in Vietnam




BMW Gina car: a real-time, shape-shifting fabric and wire frame fully drivable concept car





In the video, it can be seen that the front headlights can wink open and closed.
The rear lights can shine through the material. The front body can also be seen to reshape by shifting the wires in the frame. The aerodynamics could be modified in real time as needed.

The advantages of such a car would be far less material used for construction, and thus better fuel efficiency. It would also have a reduced supply chain.





FURTHER READING
Other lightweight car concepts which might or might not be developed are inflatable all-electric cars.