
March 14, 2008

Deaths per TWh for all energy sources: Rooftop solar power is actually more dangerous than Chernobyl







Comparing deaths/TWh for all energy sources



Energy Source                        Death Rate (deaths per TWh)

Coal – world average                 161    (26% of world energy, 50% of electricity)
Coal – China                         278
Coal – USA                            15
Oil                                   36    (36% of world energy)
Natural Gas                            4    (21% of world energy)
Biofuel/Biomass                       12
Peat                                  12
Solar (rooftop)                        0.44 (less than 0.1% of world energy)
Wind                                   0.15 (less than 1% of world energy)
Hydro – Europe                         0.10 (2.2% of world energy)
Hydro – world, including Banqiao       1.4  (about 2500 TWh/yr and 171,000 Banqiao dead)
Nuclear                                0.04 (5.9% of world energy)


Update: I have written a fairly comprehensive article about steps to lower deaths per terawatt hour, with a primary focus on pollution mitigation. Air pollution causes 3.1 million deaths per year.


A superior form of solar power would be the Coolearth concentrated solar power system, which would be installed on the ground or on wires over a ground installation.



Rooftop solar is several times more dangerous than nuclear power and wind power. It is still much safer than coal and oil, because those sources cause a large number of air pollution deaths.

Rooftop solar can be at the safer end of its range (0.44 rather than 0.83 deaths per TWh). If the rooftop solar is part of the shingle, so the roof only goes up once and maintenance does not increase, that also keeps the risk down, as would a robotic installation system.

World average for coal is about 161 deaths per TWh.
In the USA, about 30,000 deaths/year from coal pollution from 2000 TWh: 15 deaths per TWh.
In China, about 500,000 deaths/year from coal pollution from 1800 TWh: 278 deaths per TWh.
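
To make the arithmetic explicit, here is a minimal sketch in Python of the deaths-per-TWh division used throughout this post. The inputs are the rough estimates quoted in the text (the world figure uses the ~1 million air pollution deaths and ~6200 TWh discussed below), not official statistics.

```python
# Minimal sketch of the deaths-per-TWh arithmetic used in this post.
# All inputs are the post's own rough estimates, not official statistics.

def deaths_per_twh(annual_deaths, annual_twh):
    """Deaths attributed to an energy source divided by its output."""
    return annual_deaths / annual_twh

estimates = {
    "Coal - world": (1_000_000, 6200),  # WHO air pollution deaths; TWh/year
    "Coal - USA":   (30_000,    2000),
    "Coal - China": (500_000,   1800),
}

for name, (deaths, twh) in estimates.items():
    print(f"{name:13s} {deaths_per_twh(deaths, twh):6.0f} deaths/TWh")
# Coal - world     161 deaths/TWh
# Coal - USA        15 deaths/TWh
# Coal - China     278 deaths/TWh
```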



Wind power proponent and author Paul Gipe estimated in Wind Energy Comes of Age that the mortality rate for wind power from 1980–1994 was 0.4 deaths per terawatt-hour. Paul Gipe's estimate as of end 2000 was 0.15 deaths per TWh, a decline attributed to greater total cumulative generation.

Hydroelectric power was found to have a fatality rate of 0.10 per TWh (883 fatalities for every TW·yr) in the period 1969–1996.

Nuclear power is about 0.04 deaths/TWh.

The ExternE calculation of deaths/TWh from different energy sources (not including global warming effects; the figures are averages for European nations).











This draws on data from 4290 energy-related accidents, 1943 of them classified as severe, and compares different energy sources. It considers over 15,000 fatalities related to oil, over 8000 related to coal and 5000 from hydro.


Death statistics from the fuel chain for coal and nuclear

The higher level of public health deaths from coal is related to the increased deaths from particulates. The occupational death totals for coal come mostly from mining.

The World Health Organization and other sources attribute about 1 million deaths/year to coal air pollution. Coal generates about 6200 TWh out of the world total of 15500 TWh of electricity. This would be 161 deaths per TWh.
In the USA about 30,000 deaths/year from coal pollution from 2000 TWh. 15 deaths per TWh.
In China about 500,000 deaths/year from coal pollution from 1800 TWh. 278 deaths per TWh.

The construction of existing 1970-vintage U.S. nuclear power plants required 40 metric tons (MT) of steel and 190 cubic meters (m3) of concrete per average megawatt of electricity (MW(e)) generating capacity. For comparison, a typical wind energy system operating with a 6.5 meters-per-second average wind speed requires construction inputs of 460 MT of steel and 870 m3 of concrete per average MW(e). Coal uses 98 MT of steel and 160 m3 of concrete per average MW(e); natural-gas combined cycle plants use 3.3 MT of steel and 27 m3 of concrete.

Wind power capacity was 95 GW at the end of 2007.
1 MW produces 3,066 MWh per year at a 35% capacity factor.
20 GW in Germany generated 30 TWh in 2006.
95 GW would be generating about 150 TWh.
95 GW (95,000 MW) would have taken 43.7 million tons of steel and 82.7 million tons of concrete: 3% of one year of global steel production, 4% of one year of the world's concrete production, and half of one year's US steel production. If the associated mining deaths corresponded to half of one year's US metal/nonmetal mining fatalities, that would be about 15 deaths, or 0.1 deaths per TWh. If the metal and concrete had come from China, with about 2,700 metal/nonmetal mining deaths per year for 5 times the amount of steel, it would be about 270 deaths to get the metal for the wind turbines, or 1.9 deaths per TWh. These construction-related deaths are amortized over the 30-year life of the wind turbines. Other wind power deaths need to factor in the dangers of working with very tall structures (50 stories tall) and the deep-water work associated with building and anchoring offshore turbines.
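
Here is a hedged sketch of the wind construction arithmetic above. The inputs are the post's own figures; the 15 and 270 mining-death numbers are the "half of one year's US fatalities" and China scenarios described in the text.

```python
# Wind construction materials and mining deaths, using the post's figures.
capacity_mw = 95_000              # 95 GW installed worldwide, end of 2007
steel_tons  = capacity_mw * 460   # 460 t steel per MW -> 43.7 million tons
concrete_m3 = capacity_mw * 870   # 870 m3 concrete per MW -> 82.65 million m3
annual_twh  = 150                 # estimated yearly generation from 95 GW

print(steel_tons / 1e6, concrete_m3 / 1e6)  # 43.7, 82.65

us_mining_deaths    = 15    # half of one year's US metal/nonmetal fatalities
china_mining_deaths = 270   # the post's China sourcing scenario
print(us_mining_deaths / annual_twh)     # 0.1 deaths/TWh on a one-year basis
print(china_mining_deaths / annual_twh)  # 1.8, close to the 1.9 quoted above
# Amortized over a 30-year turbine life, these one-time construction deaths
# become a much smaller per-TWh contribution.
```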

Wind power proponent and author Paul Gipe estimated in Wind Energy Comes of Age that the mortality rate for wind power from 1980–1994 was 0.4 deaths per terawatt-hour. Paul Gipe's estimate as of end 2000 was 0.15 deaths per TWh, a decline attributed to greater total cumulative generation. By comparison, hydroelectric power was found to have a fatality rate of 0.10 per TWh (883 fatalities for every TW·yr) in the period 1969–1996. This includes the Banqiao Dam collapse in 1975 that killed thousands.



Metal/Nonmetal fatalities in the USA (iron and concrete components mainly)

(3.1 GWp generated 2 TWh in Germany for solar)

Coal and fossil fuel deaths usually do not include deaths caused during transportation. The more trucking and rail transport is used, the more deaths there are. The transportation deaths are a larger component of the deaths in the USA than direct industry deaths. Moving 1.2 billion tons of coal takes up 40% of the freight rail traffic and a few percent of the trucking in the USA.

Uranium mining is a lot safer because in-situ leaching (the main method of uranium mining) involves flushing acid down pipes; no workers are digging underground anymore. Only about 60,000 tons of uranium are needed each year, which is orders of magnitude less material being moved than for coal plants.

But what about Chernobyl?
The World Health Organization study in 2005 indicated that 50 people had died up to that point as a direct result of Chernobyl. 4000 people may eventually die earlier as a result of Chernobyl, but those deaths would come more than 20 years after the fact, and the cause and effect becomes more tenuous.

The WHO report explains that there have been 4000 cases of thyroid cancer, mainly in children, and that except for nine deaths, all of the patients have recovered. "Otherwise, the team of international experts found no evidence for any increases in the incidence of leukemia and cancer among affected residents."


Nuclear power averaged about 2,100 TWh per year from 1985-2005, a total of 42,000 TWh, so those 50 deaths would be 0.0012 deaths/TWh. If the possible 4,000 deaths occur over the next 25 years, with an average of 2,800 TWh per year assumed for 2005 through 2030, then it would be 4,000 deaths over 112,000 TWh generated over 45 years, or 0.037 deaths/TWh. There are no reactors in existence that are as unsafe as the Chernobyl reactor was. Even the eight reactors of that type that remain have containment domes and operate with lower void coefficients.
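
The Chernobyl attribution arithmetic from the paragraph above, restated as a sketch:

```python
# Chernobyl deaths spread over cumulative world nuclear generation
# (all figures are the post's estimates).
direct_deaths = 50
twh_1985_2005 = 2100 * 20             # ~42,000 TWh over 20 years
print(direct_deaths / twh_1985_2005)  # ~0.0012 deaths/TWh

projected_deaths = 4000               # WHO upper-bound projection
twh_1985_2030 = 42_000 + 2800 * 25    # 112,000 TWh over 45 years
print(projected_deaths / twh_1985_2030)  # ~0.036, quoted as 0.037 above
```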

The safety issues with Rooftop solar installations
Those who talk about PV solar power (millions of roofs) need to consider roof-worker safety. There are about 1,000 construction fatalities per year in the US alone, 33% of them from working at heights.

Falls are the leading cause of fatalities in the construction industry. An average of 362 fatal falls occurred each year from 1995 to 1999, with the trend on the increase. There were 269 deaths from falls from ladders and roofs combined in 2002. UPDATE: Based on a more detailed analysis of the fatal fall statistic reports, I would now estimate the fatal falls that would match solar panel roof installations at 100-150 per year. Only 30-40 of those are classified as professional roofers, but falls by laborers, general construction workers, and private individuals also count as deaths.

Roofing is the 6th most dangerous job. Roofers had a fatality rate in 2002 of 37 per 100,000 workers.

In 2001, there were 107 million homes in the United States; of those, 73.7 million were single-family homes. Roughly 5 million new homes are built each year, and old roofs need significant work or replacement every 20 years. That makes 9-10 million roofing jobs per year in the US alone. In 2007, solar power was at 12.4 GW, or about 12.6 TWh. The 2006 figure for Germany PV was only 1 TWh from about 1.5 GW, from $4 billion/yr of spending. At the German rate of solar power generation, 12.4 GW would generate 8 TWh. If 2.8 GW generates 2 TWh in Germany, and other places are 50% sunnier on average, then the remaining 9.6 GW would generate 10.6 TWh.

$4 billion is about the cost of one of the new 1.5 GW nuclear power plants, which would generate 12 TWh/year. The 104 US nuclear power plants, rated at a total of 100 GW, generated 800 TWh in 2007.
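
As a quick consistency check, both of those nuclear figures imply a capacity factor of roughly 90%, which is plausible for modern plants:

```python
# Capacity factor implied by the nuclear output figures above.
HOURS_PER_YEAR = 8766
print(12 / (1.5 * HOURS_PER_YEAR / 1000))   # 1.5 GW -> 12 TWh/yr: ~0.91
print(800 / (100 * HOURS_PER_YEAR / 1000))  # 100 GW -> 800 TWh/yr: ~0.91
```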

The world total came from about 1.5 million solar-roofed homes, and 30% of the solar power was from roof-installed units. One sixth of the 9 million annual US roofing jobs would correspond to about 50 deaths from installing 1.5 million roofs, if other countries have safety records similar to the US. The share of roof installations is increasing. With 4 TWh coming from rooftop PV, that is 12.5 deaths per TWh of annual output from solar roof installations. Assuming 15 years as the average functional life, or the time until major maintenance or an upgrade is required, the average yearly deaths from rooftop solar work out to 0.83/TWh. Those who want a lower-bound estimate can double the life of the solar panels (0.44 deaths/TWh). This is worse than the occupational safety record of coal and nuclear power (see the death statistics above), and 12 to 25 times less safe than the projected upper-bound end effect of Chernobyl (from WHO figures). The fifty actual deaths from roof installation accidents for 1.5 million roof installations equal the actual deaths experienced so far from Chernobyl. If all 80 million residential roofs in the USA had solar power installed, one would expect 9 times the roughly 300 annual roofing deaths, or about 2,700 roofer deaths. This would generate about 240 TWh of power each year (30% of the power generated from nuclear power in the USA): 90 deaths per year over an optimistic 30-year panel life, not including maintenance or any electrical shock incidents.
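
The core rooftop-solar calculation above, as a sketch; the 50 installation deaths and 4 TWh/year of rooftop output are the post's estimates:

```python
# Rooftop solar installation deaths amortized over the panels' working life.
install_deaths = 50    # estimated fatal falls installing ~1.5 million roofs
rooftop_twh    = 4.0   # annual generation from those rooftop systems

deaths_per_twh_of_annual_output = install_deaths / rooftop_twh  # 12.5

for panel_life_years in (15, 30):
    print(panel_life_years, deaths_per_twh_of_annual_output / panel_life_years)
# 15 years -> 0.83 deaths/TWh; 30 years -> ~0.42 (quoted as 0.44 above)
```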

Maintenance and Functional life of solar panels

[Q26. Do they require any maintenance?
A26: Only an occasional wipe to ensure optimal performance of the solar panel.]

15. How long will the panels last?
Generally, systems last 20-30 years since the waterproof seals on the panels tend to deteriorate over time.
16. If I move home, can I take the solar panels with me?
You could take your solar power system down and re-install it at your new house provided the roof of the new house is suitable. Or, you could include it in the selling price of your house. If your house is in a remote area and the solar power system is the sole source of power, the purchaser of your house would be wise to make sure the solar power system is included in the price, or they’ll be left without electricity.
[Generally hail resistant but a storm big enough to damage a regular roof would also damage a rooftop solar panel system.]

http://www.gepower.com/prod_serv/products/solar/en/faqs/resid_sys.htm#faq24
http://www.gepower.com/prod_serv/products/solar/en/faqs/resid_sys.htm#faq28
http://www.heatmyhome.co.uk/pv-solar-panels.htm

The 10 most dangerous jobs
Occupation                     Fatalities per 100,000
Timber cutters                 117.8
Fishers                         71.1
Pilots and navigators           69.8
Structural metal workers        58.2
Drivers-sales workers           37.9
Roofers                         37
Electrical power installers     32.5  [also solar power related]
Farm occupations                28
Construction laborers           27.7
Truck drivers                   25

Source: Bureau of Labor Statistics; survey of occupations with minimum 30 fatalities and 45,000 workers in 2002

Conclusion:
Nothing is perfectly safe. Chasing perfection can cause us to ignore simple improvements and to refuse to trade something worse for something a lot better. Non-roof installation of solar is safer than roof installation. Nuclear, wind, non-roof solar and hydro are a lot safer than coal and oil. Natural gas is safer than coal and oil, but not as safe as nuclear and the others. The focus needs to be on first getting rid of the most dangerous energy sources: coal and oil. Only after that decades-long project is done should we turn to the other energy sources. Safety improvements for all energy sources should be made as we go.

UPDATE:
Rooftop solar is still a hundred times safer than coal and oil power because of air pollution deaths. Other ways to make solar power safer:
1. Increase safety for all rooftop work (can reduce deaths by half or more)
2. Rooftop solar tiles installed on new buildings might not add any incremental deaths, as opposed to panels that are separate from the roof tiles or systems that replace roof tiles before the roof would normally be replaced.
3. Create a new installation system where people stay on the ground, using a forklift or crane to raise and place a solar power system onto a roof. One would have to ensure that the heavy-machinery process is safer than the roofing process being replaced.

Some responders online are in denial that people who work on a roof can fall off regardless of the reason they went up there. Whether I go up to replace roofing tiles or to install solar panels, the risk of falling is pretty much the same, especially when the number of trips being compared heads to large numbers like millions in each case. As I noted in the comments, statistics show that 70% of fatal construction falls occur at heights of 3 stories or less.

Some have also claimed that someone who went up onto a roof to install a solar panel but then fell is not a death associated with solar power. By that logic, someone killed in a coal mine is not a coal power death because the coal was not in the power plant yet, or because they might have had some other reason for being underground and would have been crushed anyway.

FURTHER READING
189 page pdf from the 1997 Externe analysis of energy sources and fuel cycles.

RELATED NEWS
Canada is increasing the planned number of nuclear reactors in Alberta to 4 plants generating 4 GW. The plan is to complete them by 2017.

Southern California Edison (SCE) plans to spend $875 million over the next five years putting solar panels onto commercial roofs to generate 250 megawatts of solar capacity. The panels will be on 65 million square feet of roof.

San Jose has a 15 year green vision to install 100,000 solar power roofs.

San Jose was chosen a Solar America City by the U.S. Department of Energy and will share $2.4 million in funding with 11 other cities. Other cities designated as Solar America Cities include Sacramento, Santa Rosa, Seattle, Wash.; Houston, Texas; Knoxville, Tenn.; Milwaukee, Wis.; Minneapolis & St. Paul, Minn.; Orlando, Fla.; Philadelphia, Penn.; and San Antonio, Texas.

Severin Borenstein, director of the U.C. Energy Institute and a professor at the University of California, Berkeley's business school, called existing technology "a loser" in a research paper. "We are throwing money away by installing the current solar PV technology," he said.

Borenstein calls for more state and federal money to be spent on research into better technology, rather than on subsidies for residential solar power systems. In his analysis, Borenstein found that a typical PV system costs between $86,000 and $91,000 to install, while the value of its power over its lifetime ranges from $19,000 to $51,000. Even assuming a 5 percent annual increase in electricity costs and a 1 percent interest rate, the cost of a PV system is 80 percent greater than the value of the electricity it will produce. In his paper, Borenstein also factored the value of greenhouse gas reductions into his calculations and found that at current prices the PV technology still doesn't deliver.


California's Million Solar Roofs Plan, signed into law in 2006, will provide 3,000 megawatts of additional clean energy and reduce the output of greenhouse gases by 3 million tons. The 2.9-billion-dollar incentive plan for homeowners and building owners who install solar electric systems will lead to 1 million solar roofs in California by the year 2018.

FURTHER READING
Sample solar power installation instructions

More rooftop solar panel installation instructions

Solar thermal panels for hot water heating typically weigh 36-75 kg per panel.

Solar PV panels currently weigh about 40-60 pounds (20-30 kg).


US energy use by source



Artificial Intelligence? You're soaking in it.

You're soaking in it.

This phrase was popularized by a television commercial campaign for Palmolive dish washing detergent. Madge, a manicurist, would comment on the dry, rough appearance of her client's skin as she worked on one hand while the other soaked in a bowl of light green liquid. The client would ask her advice; Madge would recommend Palmolive; the client would act surprised (after all, how could a dish washing detergent affect one's skin? Preposterous.). Then Madge would inform the client about the liquid in the bowl: "You're soaking in it," she'd say, in a very matter-of-fact tone. The shocked client would immediately remove her hand from the bowl, and Madge would guide it back down, assuring her that everything was fine: "Palmolive softens hands as you do dishes."


Program trading (using classic artificial intelligence techniques) is closing in on controlling half of all financial transactions in the world and 80% in the USA.

A third of all EU and US stock trades in 2006 were driven by automatic programs, or algorithms, according to Boston-based consulting firm Aite Group LLC. By 2010, that figure will reach 50 percent, according to Aite.

In 2006 at the London Stock Exchange, over 40% of all orders were entered by algo traders, with 60% predicted for 2007. American markets and equity markets generally have a higher proportion of algo trades than other markets, and estimates for 2008 range as high as an 80% proportion in some markets.

University endowments and corporate pension funds are distributed into hedge funds (20%) and stock, bond and commodity funds, which are mostly algorithmically controlled, particularly in US markets with 80% program trading.

University endowment investments are described in this pdf

Some people like to mock the idea of Artificial Intelligence and Artificial General Intelligence as "robot gods". Yet the generally superior-to-human returns from program trading are helping to provide money for the paychecks, pensions and department budgets of those who mock AI and mock the idea that better AI is coming or that AI will have more and more influence on society.

Reality and facts would just get in the way of Dale's worldview.

Ray Kurzweil is on the vanguard of using even more advanced AI to run his own hedge fund. It is part of the $30 billion/year invested in hardware and software for financial trading, spending that keeps improving the power and capabilities of those AI systems. As if better AI won't be adopted in this financial intelligence arms race.

A breakthrough that could happen this year [October, 2008] is a supercomputer able to run models that some people believe could pass a form of the Turing test.

Google is using artificial intelligence techniques to provide better searches and to provide better matching of advertising with search results.

It's pretty clear from what [Google co-founders] Larry Page and Sergey Brin have said in interviews that Google sees search as essentially a basic form of artificial intelligence. A year ago, Google executives said the company had achieved just 5% of its complete vision of search. That means, in order to provide the best possible results, Google's search engine will eventually have to know what people are thinking, how to interpret language, even the way users' brains operate.

Google has lots of experts in artificial intelligence working on these problems, largely from an academic perspective. But from a business perspective, artificial intelligence's effects on search results or advertising would mean huge amounts of money.


Some of the most powerful AI will be trying to achieve the goal of anticipating what you want to buy when you want to buy it.

FURTHER READING: Many competing options to make computers millions of times more powerful than today.
Proper framing of the transhumanist debate

Promising new approach to molecular computing.

Brain simulation progress.

Tensilica configurable processors could make affordable petaflop and exaflop computers

New nanoscale metamaterial architecture for enabling an all optical computer.

More autonomous robots using better 3D freeze frame visual systems with LIDAR

The struggle over high risk high payoff research.

Quantum annealing can be millions of times faster than classical computers.

Predictions on artificial general intelligence.

Hardware for artificial intelligence.

Cognitive enhancement methods

March 13, 2008

Carnival of Space Week 45

Carnival of Space week 45 is up at missyfrye.net

Centauri Dreams talks about replication machines and space colonization

Centauri Dreams was examining the issue based on another article at George Dvorsky's sentient developments on seven ways to control the galaxy with self replicating probes.

I recently took a look at these visions and came up with a Von Neumann probe taxonomy. I came up with 7 basic spacecraft functions:

1. Exploration
2. Communication
3. Working
4. Colonization
5. Uplifting
6. Berserking
7. Policing

March 12, 2008

Large reserves of Thorium

Charles Barton reports that the US Geological Survey will be announcing huge new reserves of Thorium in the United States.

Charles also discusses the needs and issues of a comprehensive national energy policy.

This information is crossposted at thoriumenergy

"Thorium Power, Inc. has told me that they already have the technology to “switch over” from uranium to thorium more than 60% of the reactors in use today in the world."

"They said that a switched over or built from the ground up thorium powered reactor has for the “blanket” a total of three times the life of a uranium powered reactor. This would mean that the savings during the first fuel cycles will pay for the changeover in the case of a “retrofit.” The core can be used to burn fissionable grade plutonium to non weapons grade material while the blanket will be made from thorium and uranium-233, not 238, so that no weapons grade plutonium-239 can be produced in the reactor."


RELATED NEWS
U.S. Department of Energy's Idaho National Laboratory, in partnership with three other science and engineering powerhouses, reached a major domestic milestone relating to nuclear fuel performance on March 8.

The research to improve the performance of coated-particle nuclear fuel met an important milestone by reaching a burnup of 9 percent without any fuel failure. The team has now set its sights on reaching its next major milestone: achievement of a 12-14 percent burnup, expected later this calendar year [2008].

Burnup rates in most commercial nuclear reactors are 0.7-2% of the fuel that is put into the reactors.

TRISO burnup:
Maximum capsule burnup: greater than 18% FIMA (172.8 GWd/t)
Fuel stack: 134.5 GWd/t (14% FIMA)

GWd/t is gigawatt-days (thermal) per ton of fuel; 14% burnup is about 134.5 GW-days per ton of fuel.

Most current reactors are in the 20-50 GWd/t range.
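
The conversion implied by the TRISO figures above (18% FIMA = 172.8 GWd/t, so roughly 9.6 GWd/t per percent of fuel atoms fissioned), as a sketch:

```python
# Burnup conversion implied by the figures above: ~9.6 GWd/t per 1% FIMA.
GWD_PER_TON_PER_PCT_FIMA = 172.8 / 18  # = 9.6

def fima_pct_to_gwd_per_ton(pct):
    return pct * GWD_PER_TON_PER_PCT_FIMA

print(fima_pct_to_gwd_per_ton(14))  # ~134.4 GWd/t, matching the fuel stack
print(20 / GWD_PER_TON_PER_PCT_FIMA, 50 / GWD_PER_TON_PER_PCT_FIMA)
# 20-50 GWd/t corresponds to burning roughly 2-5% of the fuel atoms
```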

India has finished the design of a thorium nuclear reactor and expects to start building it this year (2008).

144 page pdf on using thorium in nuclear reactors. Year 2000 study from the IAEA

Technical publications related to Thorium Power's fuel technology.

Wimax looks to breakout big in India

The Wimax communication technology could be making a breakout in India as reported by Businessweek.

Sprint will be rolling out a WiMax network in Washington next month (April 2008), and in other U.S. cities next year (2009). Until now the most advanced use of WiMax has been in Japan and Korea, where Japanese carrier KDDI and Korea Telecom offer extensive WiMax networks. However, the Japanese and Korean services are not available nationwide (KDDI will have its major rollout only in 2009), and most people use them as supplements to the wired services.


Tata Communications has been working on setting this up for a couple of years, and successfully completed field trials last December. It has used the technology from Telsima, a Sunnyvale (Calif.) maker of WiMax base-stations and the leading WiMax tech provider in the world. For now, the technology will be restricted to fixed wireless, but Tata plans to make it mobile by midyear. The company has invested about $100 million in the project, which will increase to $500 million over the next four years as it begins to near its goal of having 50 million subscribers in India.

The total Indian subscriber base [of fixed line communication] is just 3.2 million and there is no clear market leader. But with the WiMax rollout Tata can gain a leadership position and add "a few thousand subscribers a day," says Alok Sharma, chief executive of Telsima.

Electric and hybrid vehicles from Volvo, Honda, Volkswagen, Zap cars and others


The Volvo ReCharge Concept can be driven approximately 62 miles on battery power alone before the car’s four-cylinder 1.6 Flexifuel engine takes over to power the car and recharge the battery. As the vast majority of us drive less than 60 miles a day, the ReCharge is effectively a permanent electric car with an acceleration figure of 0-62mph in 9 seconds and a top speed of 100mph. It could get about 124 mpg.

The Volvo ReCharge Concept shown in Detroit combines a number of the latest technological innovations into a so-called "series hybrid" where there is no mechanical connection between the engine and the wheels.

* Four electric motors, one at each wheel, provide independent traction power.

The 5.6% losses from the drive train are saved.

The advantage of in-wheel electric motors is that they reduce the power necessary to propel the car by half compared to a geared traction motor, thanks to reduced friction losses and better mechanical efficiency.

* The battery pack integrated into the luggage compartment uses lithium-polymer battery technology. The batteries are intended to have a useful life beyond that of the car itself.
* Four-cylinder 1.6-litre turbo diesel engine (109 hp) drives an advanced generator that efficiently powers the wheel motors when the battery is depleted.

More on the Volvo ReCharge concept car.



The Volvo ReCharge


The diesel hybrid Volkswagen Golf

The hybrid Golf will get 69.9 mpg and emit 90 g/km of carbon dioxide. An earlier report by Britain's Auto Express said 89 g/km, but either way that's less than the 104 g/km emitted by the Prius and the 116 emitted by the Honda Civic Hybrid.

VW says it's just a concept at this point, but Auto Bild says it is "more than a concept car" and Auto Express flat-out says "the first hybrid Golfs are expected here (meaning Britain) late next year [2009]."


Honda is planning to show new dedicated hybrid cars in Sept 2008 at the Paris autoshow. The company plans on pricing the new car aggressively, somewhere in the range of €16,000 and €20,000. The Prius currently runs about €24,000 in Europe. The new dedicated hybrid will use a new version of Honda's Integrated Motor Assist system.

Honda will have one hybrid based on the CR-Z concept car.

Honda CR-Z hybrid car

The Paris Honda hybrid car will be newly built from the ground up and will be smaller than a Prius.

Zap is collaborating with Hybrid Plus; Zap's kit will convert hybrid Priuses and Highlanders into what tests predict will be 120 MPGe (city) and 90 MPGe (highway) vehicles. Depending on the vehicle, the kits cost between $24,000 and $36,000.

Zap has also announced the Zap Alias and the Zap-X cars. Zap has not been great at actually delivering on impressive press releases.





Transparent Society and privacy debate

David Brin has his defense of the Transparent Society up at Wired.

This was in response to a critique by Bruce Schneier that centered around unequal power.

My tiny involvement was emailing David Brin to make him aware of the Schneier article. A common problem with attacks on the 1997 book "The Transparent Society" is that very few people actually read the book and make assumptions based upon the title or short excerpts.



RELATED NEWS
Terahertz radiation cameras that can see through clothing at distances of up to 80 feet.

Following up my prior coverage of an array of cameras.

A wide-angle camera that will be able to monitor large areas through high-resolution images taken from a satellite or an airborne craft. Flying at an altitude of 15,000 feet, a developmental version of the camera can see a 21-kilometer diameter area with a resolution of 0.3 meters. As a comparison, most Google Earth imagery is 1 meter.

Pollock said the camera could have far-reaching implications for the military, crime prevention and enforcement, as well as traffic analysis and emergency response support. The giga-pixel camera will fit in a one-meter cube and could be flown on any type of vehicle – airplanes, helicopters, blimps or unmanned aerial vehicles.

Researchers at UAHuntsville stepped in to configure an array of light sensitive chips - each one recording small parts of a larger image - and place them at the focal plane of a large multiple-lens system. The system has the structure of a common kitchen utensil, a colander. The camera would have one giga-pixel resolution, and be able to record images at five frames per second.

ArguSight, an Illinois-based company, has signed a licensing agreement with the university and is seeking venture capital to bring the product to the commercial marketplace. CEO Stuart Claggett compares the product to a popular TV product.

"The complete camera system is like a ‘TIVO’ in the sky," he said. "It captures high-quality imagery and records all the data. A user can request numerous high-definition video windows of live data in real-time or you can review all of the video on demand on the ground when the aircraft lands."


There are chips from 2006 that can capture 111 megapixels. 270 such higher-resolution chips would allow for 20+ gigapixel images.

Terapixel images can be stitched together

Promising new approach to molecular computing


The image demonstrates the design of an artificial brain built using the nano-brain reported in this issue of PNAS. Several molecular nano-brains are arranged in a way to work as powerfully as our central nervous system. Numerical digits and letters float across the architecture, demonstrating a matrix generated during a real-time operation, similar to the Hollywood blockbuster The Matrix. Credit: Arindam Bandyopadhyay

A powerful new molecular computing device and architecture is making progress (hat tip: Center for Responsible Nanotechnology). This looks like a promising approach to radically more powerful computers and a possible pathway to very interesting and powerful molecular devices, machines and factories. The researchers predict that within 18 months they will have 1,024 machines working together. They may also be working with NanoInk (maker of dip pen lithography arrays) for the input and output to the devices. A 2-inch sphere of the devices would equal the computing power of the human brain.

The device can simultaneously carry out 16 times more operations than a normal computer transistor. Researchers suggest the invention might eventually prove able to perform roughly 1,000 times more operations than a transistor.

This machine could not only serve as the foundation of a powerful computer, but also serve as the controlling element of complex gadgets such as microscopic doctors or factories, scientists added.

The device is made of a compound known as duroquinone. This molecule resembles a hexagonal plate with four cones linked to it, "like a small car," explained researcher Anirban Bandyopadhyay, an artificial intelligence and molecular electronics scientist at the National Institute for Materials Science at Tsukuba in Japan.





Bandyopadhyay and his colleagues revealed they could hook up eight other such "molecular machines" to their invention, working together as if they were part of a miniature factory.

Bandyopadhyay added they could expand their device from a two-dimensional ring of 16 duroquinones around the center to a three-dimensional sphere of 1,024 duroquinones. This means it could perform 1,024 instructions at once, for 4**1024 different outcomes, a number with more than 600 digits. They would control the molecule at the center of the sphere by manipulating "handles" sticking out from the core.
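
A quick check of the combinatorics quoted above:

```python
# 16 machines with 4 states each, and the proposed 1,024-machine sphere.
print(4**16)              # 4294967296, i.e. about four billion control states
print(len(str(4**1024)))  # 617 digits, so 4**1024 is ~1 followed by 600+ zeros
```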

"We are definitely going to 3-D from 2-D immediately," Bandyopadhyay said.


FURTHER READING
The abstract of the paper: A 16-bit parallel processing in a molecular assembly
A machine assembly consisting of 17 identical molecules of 2,3,5,6-tetramethyl-1,4-benzoquinone (DRQ) executes 16 instructions at a time. A single DRQ is positioned at the center of a circular ring formed by 16 other DRQs, controlling their operation in parallel through hydrogen-bond channels. Each molecule is a logic machine and generates four instructions by rotating its alkyl groups. A single instruction executed by a scanning tunneling microscope tip on the central molecule can change decisions of 16 machines simultaneously, in four billion (4**16) ways. This parallel communication represents a significant conceptual advance relative to today's fastest processors, which execute only one instruction at a time.


[multilevel logic | parallel communication | self-assembly]


Researchers in Japan say they have taken a big step toward that nano goal by creating the first molecular machine that can do parallel processing.

Using electrical pulses from the tip of a scanning tunneling microscope, the researchers could flip the control molecule to any one of four configurations, or states. Those flips, in turn, could change the states of the other 16 molecules - just as, say, knocking down one domino can simultaneously set off several chains of falling dominoes.

Mark Ratner, a chemist at Northwestern University who specializes in nanotechnology, said the newly published research represented a significant step toward molecular-scale computers as well as molecular-scale medicine. "People have been talking about both these things for a long time," Ratner told me. "People have even thought about putting these two things together. ... But this is quite pretty because [the researchers] actually use all of the constituents, and that's really neat."

"Is it useful tomorrow? No," he said.

One of the biggest conceptual hurdles has to do with the input/output device: Although the assemblies themselves are at the molecular scale, the scanning tunneling microscope is a big piece of equipment. It wouldn't be practical to use those microscopes to read out the result of a nanocomputer, or harvest the chemicals produced by nanofactories.

Bandyopadhyay said other control methods would be developed for working devices - perhaps optical readers for the nanocomputers, or chemical triggers for the medical nanochips. Ratner said several companies, including an outfit called NanoInk, were working on technologies that might work. [NanoInk created the dip pen lithography arrays, with tens of thousands to millions of AFMs working in parallel.]

In the meantime, Bandyopadhyay is working to ramp up his molecular machines from two-dimensional arrays to three-dimensional structures. "Within one and a half years we will have 1,024 machines connected," he told me.

Theoretically, the technology could allow for the development of a super-duper information processor contained in a sphere less than 2 inches in diameter, Bandyopadhyay said.

"That will contain the equal amount of components and connectivity that is required inside our brain," he told me.


Physorg also has coverage.

There is a lot of other coverage from BBC News, Fox News and others

Chemistry world has coverage

For Bandyopadhyay, this is just a starting point for building up to more complex assemblies of quinone molecules. 'Now the architecture is like a disk on a surface; I will build a spherical one and realize similar "one to many" communication on that structure's surface,' he says.

However, computation experts contacted by Chemistry World are not yet convinced that this is the way forward. It is not clear, one expert said, whether this system can actually perform parallel computation, or whether it only acts as a hub that distributes a signal. Without a clear demonstration of parallel computation, the work is 'clearly clever, but probably unimportant,' he said.


I think there are challenges ahead, but it looks like the approach can be adapted to a parallel 3D architecture that does computation.

The Telegraph: A molecular machine has been devised as the potential brain of "nanobots" now under study for uses in medicine.


An image imagining an array of the devices


March 10, 2008

China is building more nuclear power 50+% faster than earlier plans


China will be building nuclear power at a faster rate than previously projected.

Zhang Guobao, a vice minister of the National Development and Reform Commission long involved in energy planning, said he now expected installed nuclear power capacity of 60 GW by 2020. Previously it was mentioned that China would have 40 GW completed and 18 GW under construction by then, and that the total could increase by 8 GW for more interior-province nuclear builds. For 2030, China had previously announced goals of 160 GW, and possibly 300 reactors and 300+ GW by 2050. Hopefully China will be able to exceed the previously announced goals for 2030 and 2050 by a large margin as well.

This could mean that China might be the second or fourth largest user of nuclear power by 2020. France currently generates 63GW from 59 reactors.

France should have 67GW from 61 reactors in 2020.

Japan generates 47.5GW from 55 reactors now and is building 18GW of more nuclear power. So Japan should be getting 66GW from nuclear power in 2020.

The USA has the most, with almost 100 GW of nuclear power.

The [Chinese] government has announced plans to add an astonishing 1,300 GW to its electrical generation capacity by 2020. (The U.S. is currently capable of generating 1,000 GW.) The goal is for 25-30% of this to come from clean and renewable technologies. But even if these ambitious targets are achieved, some 70% of China's electricity will still come from coal-fired plants in 2020. That's down from about 78% today.

Today, light and heavy industry accounts for nearly three-quarters of the country's energy use. As a result, China is not a particularly efficient consumer of power, lagging well behind Japan, the U.S. and other developed countries in the amount of economic output it generates for every gigawatt consumed. Hoping to become 20% more energy-efficient over the next 12 years, Beijing in 2006 ordered heavy industries and local officials to develop more judicious consumption strategies. The government also increased pressure on provincial governments to enact strict building codes to make new office buildings and shopping centers less wasteful.


Areva, France's nuclear company, plans to build more than 100GW of new nuclear power by 2030.

Canada is building more nuclear reactors, particularly in Ontario, with 4 or more planned to displace all of the province's coal power generation.

UK Secretary John Hutton told the Financial Times he expected the new generation of nuclear power stations to supply much more of the country's electricity than the 19 percent the existing ones deliver.
Britain said it was making 18 more sites available for the next generation of nuclear power stations and gave operators four weeks to pick the ones they wanted.


Wired looked at previous targets for China's non-fossil fuel energy

Chinese Government Renewable Energy Targets for 2020
Hydro: 300 gigawatts
Nuclear: 40 GW [so now the target is 60 GW]
Biomass: 30 GW
Wind: 30 GW
Solar: 1.8 GW

Here is a presentation in 2007 for an alternative energy plan for China from now until 2030.



A revised 2030 scenario could have China generating 300 GW or more from nuclear power, especially with significant uprating of new reactors using annular fuel (50% more power) or with reactor designs that are easier to mass-produce, like the IRIS nuclear reactor, the Fuji molten salt reactor, or the uranium hydride nuclear "battery" reactor.

Possible Energy Targets for 2030 for China
Hydro: 350 gigawatts
Nuclear: 300+ GW (2,400+ billion kWh)
Biomass: 100 GW
Wind: 100 GW
Solar: 10 GW
Conservation and efficiency: a 40% reduction

Bringing total demand down to 5,000-6,000 billion kWh for 2030 could drastically reduce the role of coal power in China; the capacity-to-output conversion is sketched below.

I had a previous detailed look at China's hydroelectric build-out plans.
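
As referenced above, a hedged sketch of converting a capacity target (GW) into annual output (billion kWh, equal to TWh); the 0.9 nuclear capacity factor is an assumption chosen to reproduce the 2,400+ billion kWh figure:

```python
# Capacity (GW) to annual output (TWh), given an assumed capacity factor.
def annual_twh(gw, capacity_factor):
    return gw * 8766 * capacity_factor / 1000  # 8766 h/yr; GWh -> TWh

print(annual_twh(300, 0.9))  # ~2367 TWh, i.e. the 2,400+ billion kWh above
```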

Possible near term successful development of nuclear fusion would radically alter the energy generation situation for the better.

Computational protein design has developed enzymes from scratch

In a major step forward for computational protein design, University of Washington scientists have built from scratch a handful of enzymes that successfully catalyze a specific chemical reaction. These proteins have no naturally occurring counterparts, and the reaction--which breaks down a man-made chemical--has no natural catalyst.

David Baker and his colleagues at the University of Washington focused on a reaction that would break certain bonds between carbon atoms. The ability to design enzymes that can break and make carbon-carbon bonds could potentially enable scientists to break down environmental toxins, manufacture drugs, and create new fuels.

As they report in the journal Science, Baker and his group first designed what an ideal active site would look like for the reaction. An active site is a pocket within an enzyme where the catalyzed reaction takes place. In order to do its job, an active site must have precise geometry and chemical makeup, tailored to the reaction it catalyzes. Some components hold the reacting molecules in place, while others participate in the reaction's chemical mechanisms.



Once the researchers computed the active site, they used a newly developed set of algorithms to model proteins that have such a site. Each designed protein was ranked according to its ability to bind the reacting chemicals and hold them in the proper position.

The next step was to actually synthesize the selected proteins. The researchers derived gene sequences for 72 of the designed enzymes, ordered snippets of DNA containing those genes, and used bacteria to turn the genes into proteins. Each protein was then tested for its ability to catalyze the carbon-carbon bond breaking reaction.

Of the 72 proteins selected, 32 successfully helped along the reaction. The most efficient proteins sped up the reaction to 10,000 times the rate without an enzyme.

While that's an impressive feat compared with earlier enzyme design attempts, the synthesized enzymes pale in comparison to naturally occurring ones. "It's not very good at all," says Baker. "Naturally occurring enzymes can increase the rate of reactions by much, much greater amounts"--as much as a quadrillion-fold.

"One of our research problems is to figure out what's missing from our designs that naturally occurring enzymes have figured out," says Baker. In follow-up studies, his group has taken two approaches to this problem: refining its computer algorithms, and asking nature to step in where the researchers left off. By using their minimally functional enzymes as evolutionary starting points, the researchers can use directed evolution to create more efficient catalysts.

March 09, 2008

MIT cell sorting system

MIT has developed a simple, inexpensive system to sort different kinds of cells, a process that could result in low-cost tools to test for diseases such as cancer, even in remote locations.

The method relies on the way cells sometimes interact with a surface (such as the wall of a blood vessel) by rolling along it. In the new device, a surface is coated with lines of a material that interacts with the cells, making it seem sticky to specific types of cells. The sticky lines are oriented diagonally to the flow of cell-containing fluid passing over the surface, so as certain kinds of cells respond to the coating they are nudged to one side, allowing them to be separated out.

Cancer cells, for example, can be separated from normal cells by this method, which could ultimately lead to a simple device for cancer screening. Stem cells also exhibit the same kind of selective response, so such devices could eventually be used in research labs to concentrate these cells for further study.


Normally, it takes an array of lab equipment and several separate steps to achieve this kind of separation of cells. This can make such methods impractical for widespread screening of blood samples in the field, especially in remote areas. "Our system is tailor-made for analysis of blood," Karnik says. In addition, some kinds of cells, including stem cells, are very sensitive to external conditions, so this system could allow them to be concentrated with much less damage than with conventional multi-stage lab techniques.

Now that the basic principle has been harnessed in the lab, Karnik estimates it may take up to two years to develop into a standard device that could be used for laboratory research purposes. Because of the need for extensive testing, development of a device for clinical use could take about five years, he estimates.

Variable sized quantum dots could lead to more efficient and partially transparent solar cells

Electron transport through a structure of nanoparticles (left) and more ordered nanotubes (center) is shown. At right, different wavelengths of light can be absorbed by different-sized quantum dots layered in a “rainbow” solar cell. Image credit: Kongkanand, et al. ©2008 ACS.


Solar cells made of different-sized quantum dots, each tuned to a specific wavelength of light, could be turned into colored windows that produce solar energy at 30% efficiency.

In the Notre Dame study, the scientists assembled cadmium selenide (CdSe) quantum dots in a single layer on the surface of nano films and tubes made of titanium dioxide (TiO2). After absorbing light, the quantum dots inject electrons into the TiO2 structures, which are then collected at a conducting electrode that generates photocurrent.

“Anchoring CdSe quantum dots on TiO2 nanotubes allowed us to create an ordered assembly of nanostructures,” Kamat told PhysOrg.com. “This architecture facilitated efficient transport of electrons to the collecting electrode surface and allowed us to achieve efficiency improvement.”

The researchers used four different sizes of quantum dots (between 2.3 and 3.7 nm in diameter), which exhibited absorption peaks at different wavelengths (between 505 and 580 nm). The group observed a trade-off in performance corresponding with quantum dot size: smaller quantum dots could convert photons to electrons at a faster rate than larger quantum dots, but larger quantum dots absorbed a greater percentage of incoming photons than smaller dots. The 3-nm quantum dots offered the best compromise, but the researchers plan to improve both the conversion and absorption performance in future prototypes.

Besides investigating the quantum dots’ size quantization effect, the researchers also experimented with two different nano architectures – particle films and nanotubes – that act as scaffolds for transporting electrons from the quantum dots to the electrodes. The group found that the hollow 8000-nm-long nanotubes, where both the inner and outer surfaces were accessible to quantum dots, could transport electrons more efficiently than films.

“Usually, silicon-based photovoltaic panels operate with an efficiency of 15-20%,” Kamat said. “Silicon solar cells generate only one electron-hole pair per incident photon, irrespective of its energy. Thus, the higher energy of blue light is simply wasted as heat. The obvious question is, can nanotechnology provide new ways to harvest these higher-energy photons more efficiently?

“Semiconductor quantum dots seem to be the answer. They are capable of producing multiple charge carriers when excited with high energy light. If we succeed in capturing these charge carriers, we can expect significantly higher efficiencies. The target is to reach efficiency values greater than 30% using quantum dot rainbow solar cells.”

To achieve this efficiency, Kamat explained that there are two main challenges. The first is organizing the light harvesting nanostructures so that they efficiently absorb light in the visible and near infrared region, and transport electrons within the films. Secondly, the quantum dots should generate multiple charge carriers to be captured to generate photocurrent.