July 13, 2007

Accelerating Future on the Singularity and Transhumanist Technology

There are good articles at Accelerating Future.

1. Michael describes the confusion around the term the Singularity

2. He describes 10 top transhumanist technologies

I would expand item number 8, gene therapy/RNA interference, to include synthetic biology, synthetic life and DNA nanotechnology.

I also think quantum computers will be quite important.

Designer materials, new states of matter, control of spectrum will be key fundamental technologies that help enable some of the end applications and results on the list.

Energy technologies are also very important. Nuclear fission and fusion and advanced solar.

Constructing a lot of nuclear power plants is not materially constrained

One of the common arguments that some in the environmental movement make against nuclear power is "we cannot make enough reactors," and that if we do, the price of materials will go up. I will show that historically the world built reactors at a reasonably fast rate (28/year). Currently more and more nuclear reactors are on order, and they are being completed at about 8 per year. The material used is a fraction of what is available and will not have a major impact on material prices. The alternatives of coal, wind and hydro are also highly intensive in steel and concrete. The main things that impact a nuclear reactor's construction cost are the length of time to build and interest rates. People do not appreciate how much steel and concrete are made each year in the world, and how many buildings and other large structures are built.

Building 1,000 one-gigawatt nuclear plants per year would use less than 10% of the world's annual concrete and steel. Modern nuclear reactors need less than 40 metric tons of steel and 190 cubic meters of concrete per megawatt of average capacity. So 1,000 one-gigawatt nuclear plants per year would need 40 million metric tons of steel and 190 million cubic meters of concrete. World supplies are about 1.24 billion tons of steel per year (2006) and 2.283 billion tons of cement per year (2005).
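To make the less-than-10% claim concrete, here is a minimal sketch (Python) using the per-megawatt inputs and world production figures above. The concrete density and the cement share of concrete are my own assumptions, since the article quotes concrete in cubic meters but world production in tons of cement.

```python
# Rough check of the materials claim, using the figures quoted above:
# 40 t of steel and 190 m^3 of concrete per MW of average capacity, and
# world output of ~1.24 billion t of steel and ~2.283 billion t of cement.
# Concrete density (~2.4 t/m^3) and the cement share of concrete (~12%)
# are my assumptions, not figures from the article.

plants_per_year = 1000            # one-gigawatt plants
mw_per_plant = 1000

steel_t_per_mw = 40
concrete_m3_per_mw = 190

world_steel_t = 1.24e9            # 2006 crude steel
world_cement_t = 2.283e9          # 2005 cement

total_mw = plants_per_year * mw_per_plant
steel_needed = total_mw * steel_t_per_mw             # 40 million t
concrete_needed_m3 = total_mw * concrete_m3_per_mw   # 190 million m^3

concrete_density = 2.4            # t/m^3 (assumed)
cement_fraction = 0.12            # cement mass share of concrete (assumed)
cement_needed = concrete_needed_m3 * concrete_density * cement_fraction

print(f"steel:  {steel_needed / world_steel_t:.1%} of world output")
print(f"cement: {cement_needed / world_cement_t:.1%} of world output")
# -> steel ~3.2%, cement ~2.4%; both comfortably under the 10% ceiling
```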





The historical record of plant construction shows that the United States by itself completed 12 nuclear plants in 1974, 10 in 1973 and 8 in 1972.

There were also years in the eighties with 8 completed. Before 1968 only small reactors were built: only two had over 400 MW, and most were less than 100 MW. 1969, 1970 and 1971 saw 3-4 completions each year, then 1972 brought the 8 reactors. So from a relative standing start, the scale-up to the peak of 12/year in the last build cycle was rapid. We are in a better position now because the US has refurbished a nuclear plant and is switching Browns Ferry 1 back on this year.

Global reactor projections for May 2007 show an increase over February 2007 of 3 more reactors being built, 12 more planned and 52 more proposed. The development pipeline grew from 219 to 286 reactors, roughly a 30% increase from February.


About 6-8 completions per year are scheduled worldwide for the next few years.

Why would building more nuclear reactors each year increase costs? What are the global limiting factors? We can build 100+ coal plants each year, and as noted those are about the same size and use about twice as much steel and concrete.

Below are images of coal power plants and nuclear reactors. Notice that they are comparable in size. Coal power plants take 3-4 years to build and are being completed at about 2 per week.


This is a simplified coal plant


A Westinghouse AP1000 cutaway


An actual 1.58GW coal plant near Laughlin, Nevada. Note: each coal plant in the United States kills about 30-300 people per year from the resulting air pollution; some plants are more polluting than others. The Laughlin coal plant produces 9 million tons of CO2 each year.

Coal power plants are a major source of air pollution, with coal-fired power plants spewing 59% of total U.S. sulfur dioxide pollution and 18% of total nitrogen oxides every year. Coal-fired power plants are also the largest source of toxic mercury pollution, the largest contributor of hazardous air toxics, and release about 50% of particle pollution. Additionally, power plants release over 40% of total U.S. carbon dioxide emissions, a prime contributor to global warming. Power plants are second only to automobiles as the greatest source of NOx emissions.



Train loaded with coal; the Laughlin plant needs 42,000 train car loads per year

A 500 megawatt coal plant produces 3.5 billion kilowatt-hours per year, enough to power a city of about 140,000 people. It burns 1,430,000 tons of coal, uses 2.2 billion gallons of water and 146,000 tons of limestone.

A 500 MW coal plant each year puts out (triple these numbers for the Laughlin plant; see the sketch after this list):
- 10,000 tons of sulfur dioxide. Sulfur dioxide (SOx) is the main cause of acid rain, which damages forests, lakes and buildings.

- 10,200 tons of nitrogen oxide. Nitrogen oxide (NOx) is a major cause of smog, and also a cause of acid rain.

- 500 tons of small particles which are a health hazard, causing lung damage.

- 125,000 tons of ash and 193,000 tons of sludge from the smokestack scrubber.

- 225 pounds of arsenic, 114 pounds of lead, 4 pounds of cadmium, and many other toxic heavy metals

- 2 tons of uranium and 6 tons of thorium
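As a rough illustration of the "triple these numbers" rule, here is a small sketch (Python) scaling the 500 MW figures above to a plant of roughly Laughlin's 1.58 GW size. It simply scales by nameplate capacity; real emissions depend on the coal burned and the pollution controls installed.

```python
# Per-year outputs of a 500 MW coal plant, from the list above,
# scaled to Laughlin's ~1.58 GW (close to the "triple" rule of thumb).

per_500mw = {
    "sulfur dioxide (tons)": 10_000,
    "nitrogen oxide (tons)": 10_200,
    "small particles (tons)": 500,
    "ash (tons)": 125_000,
    "scrubber sludge (tons)": 193_000,
    "coal burned (tons)": 1_430_000,
}

scale = 1580 / 500   # ~3.16x
for name, amount in per_500mw.items():
    print(f"{name:28s} {amount * scale:>12,.0f}")
```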


The large Tricastin enrichment plant in France (beyond cooling towers)
The four nuclear reactors in the foreground provide over 3000 MWe power for it.


One Westinghouse AP1000 nuclear reactor


5 megawatt offshore wind turbine; you need 600 of these to equal a single 1 gigawatt nuclear reactor (wind has a lower capacity factor, about 30% versus 90%: the wind is not always blowing, but nuclear reactions are always happening)
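The 600-turbine figure falls out of the capacity factors. A minimal sketch (Python) of that arithmetic:

```python
# Compare average delivered power, using the capacity factors quoted above.

reactor_mw, reactor_cf = 1000, 0.90   # nuclear: ~90% capacity factor
turbine_mw, turbine_cf = 5, 0.30      # offshore wind: ~30%

reactor_avg_mw = reactor_mw * reactor_cf               # 900 MW average
turbines_needed = reactor_avg_mw / (turbine_mw * turbine_cf)

print(f"average reactor output: {reactor_avg_mw:.0f} MW")
print(f"turbines needed:        {turbines_needed:.0f}")   # -> 600
```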


Offshore windfarm


Offshore wind platform sizes

Oil platform size: dimensions of the platform are 103 x 99 meters.
Wind rotor diameters are about 90-100 meters for the 4-6 MW turbines. So wind rotor diameters are about the same width as an oil platform.

During the '70s and '80s there were years with as many as 28 nuclear reactors built worldwide (1984). The cost factors are primarily financing costs: the high interest rates of the 1970s and 1980s were damaging, as were delays caused by construction screw-ups. But large construction projects like high-rise buildings, ocean liners and freighters proceed around the world at a far higher rate than most people understand.

The 434 nuclear reactors around the world are preventing the roughly 10% more global warming gases that would be emitted if that same energy were supplied with fossil fuels. So nuclear reactors are certainly helping to reduce global warming.

Wind and solar are not being built fast enough.
Wind, which has been scaling up, was the subject of a recent Wall Street Journal report on new build constraints and supply problems: turbine makers cannot build as fast as they would like, which is still less than the level needed to displace the new coal additions.

Coal is the main power source that will be added in China and the United States over the next few years.
In the USA, the official statistics (http://www.eia.doe.gov/) show that most of the 30 GW added each year has been natural gas power, but that from 2009 onwards it is shifting to coal power. That coal power will not be built with sequestration; it will mostly be the most polluting version, which is pulverized coal.

Modern, larger 1.5 GW to 2 GW nuclear reactors can be built. Build them at the old peak rate of 10 to 12 per year and the new coal that would otherwise have been added could be displaced.

The construction capacity would be a fraction of the overall building capacity of the US and the world.

World production and consumption of cement totaled 2.283 billion tons in 2005, an increase of about 5.75%, or 124 million tons, over the previous year. (All tonnage figures in this article are metric.) This continues the annual increases we have seen in almost every year since the 1970s.

The world produced a record 1.24 billion tons of crude steel in 2006, some 8.8% more than in 2005.

1,000 one-GW nuclear reactors each year would use less than 10% of each of those materials. The steel and concrete used would be less than what is needed for comparable coal and wind capacity.

From Per F. Peterson, Department of Nuclear Engineering, "The Future of Nuclear Energy Policy: A California Perspective" (2005):

Nuclear power plants built in the 1970’s used
40 metric tons of steel, and 190 cubic meters of concrete,
for each megawatt of average capacity.

Modern wind energy systems, with good wind conditions, take
460 metric tons of steel and 870 cubic meters of concrete per megawatt.

Modern central-station coal plants take
98 metric tons of steel and 160 cubic meters of concrete
—almost double the material needed to build nuclear power plants.

This is due to the massive size of coal plant boilers and pollution control equipment.

Conversely, natural gas combined cycle plants take
3.3 metric tons of steel and 27 cubic meters of concrete—
explaining why natural gas is such an attractive fuel, if it is cheap.
We are running out of natural gas.

The nuclear power plants that we built in the 1970’s were very efficient in their use of steel and concrete. In response to the Three Mile Island accident, however, “bloat” occurred in the designs of new, evolutionary reactors, with steel and concrete inputs increasing by 25 to 50 percent. This is the case for the ABWR, first built in Japan in the 1990’s, and for the EPR, the new European plant design which is being built in Finland.

But a major change has occurred with the new nuclear plant designs that will be built in the United States. These new designs—the ESBWR and the AP-1000—use passive safety systems that replace the external cooling supplies, large pumps, and diesel generators used for emergency cooling in the old plant designs with simple, gravity-driven heat exchangers. These changes result in large reductions in steel and concrete inputs for these new passive plant designs—actually below the values of our 1970’s plants. Thus we can expect, if they are built in the time periods demonstrated in Japan, that these new nuclear plants can have the lowest construction costs of any reactors ever built.


FURTHER READING:
There are high-burnup reactors which can use up 100% of the fuel instead of 0.7 to 2%.

I have looked at the costs of all energy sources

Money "spent" on nuclear energy was an investment that provided energy for consumers and profit for the companies.

Investment in energy provides energy at a price per kilowatt hour, which is how energy should be compared.

Oil and coal kill 3 million people each year because of air pollution; oil is about 50% of that. Plus there are the wars over oil: Japan started WW2 with the USA over lack of access to oil, and there are the Iraq wars.

Electric cars use electricity, so with 50% of electricity from coal, half of an electric car's battery charge is effectively coal-powered. Electric cars are the way to go, but you have to clean up the sources.

I have written a lot about coal

Status of breeder reactors and nuclear waste reprocessing

Nuclear waste analysis

Look at my past articles on nuclear power

Nuclear power and water

Getting the scope of the energy problem right

Nuclear proliferation has killed no one

Uranium is in coal waste

There are large uranium supplies

Money available for power infrastructure.

Federal is about 30% of total gov't spending.
$50 billion per year on infrastructure (1997 $) federal spending.
$175 billion per year infra + education + R&D federal

10-20% of fed gov't
3-3.2% of GDP for federal + state + local on infrastructure
0.6-0.7% GDP on power infrastructure

50% on water
25% on transportation
25% on energy

Booz Allen Hamilton has an estimate of $41 trillion from 2005-2030 for urban cities worldwide on infrastructure. World spending is about $300 billion per year on power infrastructure. (2007)

The International Energy Agency is calling for $20 trillion to be spent on power infrastructure from now to 2030.

The world economy is projected at $54.7 trillion in 2008.
With the current average of 5% world economic growth, the world economy would be $160 trillion in 2030. The 0.6% of GDP for power infrastructure would go from $328 billion in 2008 to $960 billion in 2030.
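Those 2030 figures are simple compound growth; a minimal sketch (Python) of the arithmetic, using the article's 5% growth rate and 0.6%-of-GDP power share:

```python
# Compound growth of world GDP and of the power-infrastructure slice.

gdp_2008 = 54.7e12          # world economy, 2008 (from the article)
growth = 0.05               # average world economic growth
power_share = 0.006         # ~0.6% of GDP on power infrastructure

gdp_2030 = gdp_2008 * (1 + growth) ** (2030 - 2008)
print(f"world GDP 2030:   ${gdp_2030 / 1e12:.0f} trillion")              # ~160
print(f"power infra 2008: ${gdp_2008 * power_share / 1e9:.0f} billion")  # ~328
print(f"power infra 2030: ${gdp_2030 * power_share / 1e9:.0f} billion")  # ~960
```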


Breakthrough in understanding how embryonic stem cells function

Hat tip to KurzweilAI; the Toronto Star reports: a landmark discovery by researchers at McMaster University could radically alter the way scientists can use embryonic stem cells to grow replacement tissues and treat cancer.

In a surprise revelation, a McMaster study found that human embryonic stem cells – “the great grandmothers” of all the other cells in our bodies – build themselves a nurturing cocoon that feeds them and directs their ability to transform into other types of tissues. And by manipulating the products of this tiny, cellular placenta, it may be possible for scientists to prompt the stem cells to grow into desired tissues and organs, or to switch off tumour growth in cancers, says Mickie Bhatia, the lead study author.

The study shows that making replacement tissues to treat disease requires more than just the manipulation of the stem cell itself.

“You have to control the surrounding cells that govern this (transformation) process,” Bhatia says.

Bhatia says the major reason scientists study embryonic stem cells is for their potential to generate new types of tissues for transplant into damaged organs.

July 12, 2007

Synthetic life projects

While Venter is trying to create a synthetic genome that will be housed within an existing bacterial cell, other scientists are aiming for the even more ambitious target of building an entire living cell from basic chemical ingredients. However, George Church has a grander and, I think, more interesting vision.



George Church at Harvard Medical School in Boston has devised a complete blueprint for a synthetic cell; an investment of around $10 million would be enough to turn the "bottom-up" dream into reality. "Our approach doesn't require any super new technology," he says.

In 2006 Church, working with Tony Forster of Vanderbilt University in Nashville, Tennessee, published a detailed blueprint for assembling a synthetic cell from scratch (Molecular Systems Biology, DOI: 10.1038/msb4100090). It includes 115 genes (133,000 base pairs) which would be combined with various biochemicals to make a self-assembling cell able to live under carefully controlled lab conditions. The details have still to be worked out, but Church believes there should be no fundamental barriers. He sees the team's artificial organism becoming a workhorse for biotechnology that could be adapted to do useful tasks such as making complex biochemicals.


How far can chemical self-assembly be pushed?
Self-assembly in vitro of viruses and the ribosome, achieved decades ago, taught us some of the principles assumed to be used in general by cells (Lewin, 2004). For example, self-assembly occurs in a definite sequence and is generally energetically favored, obviating the need for enzymes and an energy source. Assembling some type of cell (i.e. a self-replicating, membrane-encapsulated collection of biomolecules) would seem to be the next major step, yet detailed plans have not been published.

They have a stepwise biochemical approach that should lead to the eventual identification of any remaining functions essential for the synthesis of a minimal cell sustained solely by small molecules.



A minimal cell containing biological macromolecules and pathways proposed to be necessary and sufficient for replication from small molecule nutrients. The macromolecules are all nucleic acid and protein polymers and are encapsulated within a bilayer lipid vesicle. The small molecules (brown) diffuse across the bilayer. The macromolecules are ordered according to the pathways in which they are synthesized and act. They are colored by biochemical subsystem as follows: blue=DNA synthesis, red=RNA synthesis and cleavage, green=RNA modification, purple=ribosome assembly, orange=post-translational modification and black=protein synthesis. MTF=methionyl-tRNA(fMet) formyltransferase. The system could be bootstrapped with DNA, RNA polymerase, ribosome, translation factors, tRNAs, MTF, synthetases, chaperones and small molecules.


All nucleoside modifications of all 33 synthetic tRNAs that may be sufficient for accurate translation
Murtas and his team have managed to initiate the process of protein synthesis in cell-like self-assembling spheres bounded by lipid membranes, known as "liposomes". A similar feat was achieved in 2004 by Vincent Noireaux and Albert Libchaber of Rockefeller University in New York, but while they seeded their lipid vesicles with an extract of Escherichia coli bacterial cells, Murtas and his colleagues used a recipe of 37 enzymes and a range of smaller molecules to enable protein synthesis.


So when are we likely to see unequivocally synthetic life, with the entire cell built from scratch? "It could be five months or 10 years," says Church. "These things aren't so much a question of timescales as the amount of money available."

Maximum lifespan increasing

A demographic study from 2000 shows that maximum lifespan is increasing, and increasing at an accelerating rate.

In the 1860s in Sweden, the oldest ages at death for men and women centered around 101. That average maximum age moved up slowly throughout the century to about 105 in the 1960s and then accelerated to 108 in the 1990s.

Historical records, on the other hand, show that the entire configuration of ages at death in Sweden has been shifting upward for 138 years, he said. The upward trend accelerated suddenly around 1970, more than doubling the rate at which the life span was growing, from less than one year of age for every two decades to more than one year per decade.

This has happened because of medical and public health advances throughout the century, said Wilmoth, whose analysis ruled out simple population growth as a factor. Some scientists had thought that the increased number of very old people could be due to a larger population base, but Wilmoth's data show that the main cause is increased survival after age 70.


FURTHER READING:
Life expectancy at wikipedia

Maximum life span at wikipedia

Longevity study of Germans

Centenarians in wikipedia

From present data, the number of worldwide centenarians is around 450,000. However, if one considers only the total number of supercentenarians (by definition, persons surviving to ≥ 110 years), this number falls dramatically to an estimated 300-450 worldwide, of which only around 70 are validated.


Supercentenarians in wikipedia

A supercentenarian (sometimes hyphenated as super-centenarian) is someone who has reached the age of 110 years or more, something achieved by only one in a thousand centenarians (based on European data). In turn, only about one supercentenarian in 44 lives to turn 115; that is, only about 2% of 110-year-olds can expect to survive five more years.

New Superlens design could be easier to make

A new kind of "superlens", capable of focusing light to a spot far smaller than its own wavelength, could be far easier to build than other proposed designs, researchers say. It could allow viewing and etching of points and lines at 0.4 to 1 nanometer. This would help accelerate the improvement of conventional computer power and the development of molecular manufacturing.




Roberto Merlin of the University of Michigan has devised a different way of making a superlens that promises to focus light more efficiently, and to an even smaller spot – perhaps 500 times smaller than light's wavelength. Also, compared with conventional metamaterial lenses, this new kind "would definitely be easier to make," Merlin says.

His theoretical study shows that a more effective superlens could be made from a thin plate containing concentric rings made of two different materials. This would be reminiscent of a tree trunk and its annual growth rings. The rings would alternate between material that blocks light – such as metal – and rings that let light through, like silicon or glass.

Overall, the set of rings should "sculpt" light emerging from the plate to create a focused point of light. And using this method ought to be much more flexible than other ways of making superlenses, Merlin says. In addition to focusing light into a point, "I could make a line," he adds. "I could probably write 'University of Michigan'."

In metamaterial lenses, "loss [of light] is unavoidable", says Willie Padilla of Boston College in Massachusetts. "The approach that Merlin is proposing can avoid these losses."


Light spectrum wavelengths
violet 380–450 nm
blue 450–495 nm
green 495–570 nm
yellow 570–590 nm
orange 590–620 nm
red 620–750 nm


Spots 500 times smaller than the wavelength would put green to violet light at around 1 nanometer.
Ultraviolet LEDs and lasers at 240 nm would make spots of 0.48 nanometers.
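A minimal sketch (Python) of that spot-size arithmetic for the wavelengths listed above:

```python
# Focal spot sizes if a superlens can focus to ~1/500th of the wavelength,
# as Merlin's design suggests. Illustrative arithmetic only.

wavelengths_nm = {
    "violet": 380,
    "green": 495,
    "red": 620,
    "UV LED/laser": 240,
}

factor = 500
for color, wl in wavelengths_nm.items():
    print(f"{color:12s} {wl:4d} nm -> spot ~{wl / factor:.2f} nm")
# violet -> ~0.76 nm, green -> ~0.99 nm, 240 nm UV -> ~0.48 nm
```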

Applications include higher density data storage on optical discs and more precise lithography – the process used to make computer chips.



With the new technology, a CD could hold up to one hundred times more information by using terahertz radiation rather than visible light, even though the length of a terahertz wave is about 1000 times longer.

Japan has made highly efficient ultraviolet light emitting semiconductor

Scientists at Japan's National Institute of Advanced Industrial Science and Technology claim to have developed a new highly efficient ultraviolet light emitting semiconductor. The key question is whether they have made it long-lasting and stable. If they have, then the benefits of zinc over gallium LEDs would be realized: lower costs and higher efficiency.

The new technology relies on a zinc oxide compound combined with minute quantities of magnesium oxide. This is claimed to offer several advantages over existing materials used for similar devices.

A breakthrough in the fundamental technology behind devices like LEDs and lasers could lead to advances in a wide range of products, including optical disks, light sources and flat-panel displays.


In 2005, Japan had ultraviolet LEDs that emitted light at 255 to 340 nm


Name (abbreviation): wavelength range in nanometres
Near (NUV): 400 nm - 200 nm
UVA, long wave, or black light: 400 nm - 320 nm
UVB or medium wave: 320 nm - 280 nm
UVC, short wave, or germicidal: below 280 nm
Far or vacuum (FUV, VUV): 200 nm - 10 nm
Extreme or deep (EUV, XUV): 31 nm - 1 nm


IEEE Spectrum has an article from March 2007 about LEDs, "Beyond blue":

The main hurdle to making zinc oxide devices has been getting stable, reliable p-type material—material with an excess of holes, or electron deficiencies. Making an LED or laser diode requires a junction between p-type and n-type material. But when some of the zinc oxide is engineered to act as p-type material, it tends to revert to its natural n-type state after a few months, which would cause a device to fail. In contrast, blue LEDs made from gallium nitride have expected lifetimes of 100,000 hours, or over 10 years.

Henry White, a professor of physics at the University of Missouri and a cofounder of MOXtronics, says that the new LEDs have the potential to reach wavelengths as low as 200 nm, which is deep in the UV region. He expects the devices’ efficiencies and output power to compete with those of today’s white LEDs, made of gallium nitride, in two to three years. The company is also in the process of making UV laser diodes, he says.

Zinc oxide has a very good shot at meeting the difficult demands of the solid-state white light market, which analysts predict will dominate over incandescent and fluorescent bulbs by 2025, saving US $150 billion a year in power in the United States alone.

Both Zhang and Look single out the imperative need for a convenient way of making p-type material that lasts for more than two years.

11th Carnival of Space

The 11th Carnival of Space is up at space4commerce.blogspot.com, which is run by Brian Dunbar.

Once again I have a posting in the space carnival. I give a review of the current state of space funding and development in the USA, and briefly cover other major space organizations.

The Mars Society of Germany has proposed ARCHIMEDES, an interesting balloon-based observation system.

Colony Worlds discusses the dangers of asteroid space mining for humans.

This is why I believe that robotics and advanced automation should be creatively examined, starting with the exploitation of near-Earth objects. Mars and lunar resources should also be used, though, since we are and will be sending robotic probes there regularly.

An earlier analysis of Space mining where robotic techniques are considered

Another examination of advanced automation for space missions

Near Earth Orbit mining is considered

IEEE Spectrum examined space mining in August 2001. The idea was to start with mining trapped ice on near-Earth objects and then proceed to metals mining.

I think more creative engineering and systems design needs to be performed. The goal of the design should be to adapt process steps so that they are more easily deployed and utilized in space environments.

FURTHER READING:
A 200-some-page PDF with about 500 abstracts of reports dealing with recovery and utilization of space resources

144-page Lunares study from 2004

July 11, 2007

Nanotechnology enhanced domed cities versus nuclear weapons

I will explore how a series of dome shells over cities could be made with molecularly precise materials that would be resistant to nuclear weapons.

Note: This is only a speculative thought experiment. Even if you had the required nanotechnology, making the defense system would be a wasteful and provocative act. The economic abundance that can be provided by nanotechnology can be used to remove the motivations and needs for conflicts and arms race competitions. Although I will describe a partially successful defense against some nuclear weapons, a foe that has nukes and decent nanotechnology could defeat this. Plus, just building a lot of nukes and firing a sequenced multiple shot would penetrate and damage the protective domes and overwhelm the protection.






One-mile-wide dome proposed by Buckminster Fuller

Buckminster Fuller appears to have made the first specific proposal in 1965. Claiming that the geodesic dome had no practical limit on its size, he described a glass-panelled dome 3 km in diameter and 1.6 km tall spanning a portion of Manhattan Island; he claimed that it would reduce air pollution and provide comfortable weather all year. The dome would not fully enclose the area beneath it, as with most fictional domes, but would float on air at roughly the height of a contemporary skyscraper.


The goal of the largest, outermost dome would be to stop unexploded nuclear devices from penetrating, forcing them to explode outside. Then the hardened inner shells would resist the resulting nuclear blasts. So in this case the outer shells would fully enclose the city. They could have large panels that are open during times of peace.

An active outer shell with reactive armor or electric reactive armor could also be used.

The largest domes that have already been constructed:
The largest mast-supported dome is the Millennium Dome, which is 365 meters in diameter.

The largest reinforced concrete dome is 71 meters.

Geodesic domes have been made up to 216 meters in size.

Nanotechnology can make materials one hundred times stronger than ordinary concrete and rebar. Molecular manufacturing could make large dome shells of stronger materials.

How strong can the nanotech materials be?
Buried structures in the 1960s were made to withstand 55 psi. ICBM silos apparently had 2,000 psi, and command bunkers maybe 4,000 psi. They were able to survive near hits within a few hundred meters.

Materials with higher compressive strength would help make a dome tougher.

Wikipedia discusses superconcrete:

Several blocks of concrete were demonstrated with abnormally high compressive strengths between 50,000 and 60,000 PSI at 28 days. The blocks appeared to use an aggregate of steel fibres and quartz—a mineral with a compressive strength of 160,000 PSI, much higher than typical high-strength aggregates such as granite (15,000-20,000 PSI).


A compressive strength additive at 160,000 PSI is about 1.1 GPa.

Boron carbide has a compressive strength of 440,000 psi, almost three times the strength of the reinforcing material in superconcrete.

The ultimate compressive strength of c-BN (cubic boron nitride) is between 4.15 and 5.33 GPa, so about 4 times better than steel fibres and quartz.

Diamond is still almost twice as good as cubic boron nitride and 8 to 10 times better than quartz. Mature molecular manufacturing and nanotechnology can use diamond as a building material.

This site had an article that discussed how Monolithic Domes that are buried up to 30 feet deep are able to withstand pressures up to 1 ton per square foot (2,000 psf). This is about 13 psi with relatively ordinary concrete and rebar (special concrete can be 15 to 20 times better now, and diamond and nanomaterials about 150 times better). A diamond and nanomaterial dome would be city-sized and have an overall strength of about 2,000 psi.
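A minimal sketch (Python) of that scaling, converting the buried-dome figure to psi and applying the article's material multipliers; the midpoint used for "15 to 20 times" is my assumption:

```python
# Ordinary concrete domes take ~2,000 psf; scale by the article's
# multipliers for stronger materials.

base_psf = 2000
base_psi = base_psf / 144        # ~13.9 psi (144 in^2 per ft^2)

multipliers = {
    "ordinary concrete + rebar": 1,
    "special superconcrete (15-20x, midpoint)": 17.5,
    "diamond/nanomaterials (~150x)": 150,
}
for material, m in multipliers.items():
    print(f"{material:42s} ~{base_psi * m:,.0f} psi")
# diamond/nanomaterial dome -> ~2,000 psi, matching the article's estimate
```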

Nuclear weapons overpressure
An online nuclear blast calculator

For a one megaton bomb
15 psi: 2.46 km
5 psi: 4.52 km
2 psi: 7.92 km
1 psi: 11.67 km



One megaton blast radii

It seems to be about pi (3.1416) times the overpressure for each halving of the distance (see the sketch after this table):
PSI     meters away
47 1230
148 615
465 307.5
1461 153.8
4590 76.9
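Here is a minimal sketch (Python) of that empirical pattern, fitted to the 1 megaton table above. It is a curve fit to the quoted calculator output, not a physics-grade blast model:

```python
# Overpressure multiplies by ~pi each time the distance halves,
# i.e. p(r) = p0 * (r0 / r) ** log2(pi).

import math

exponent = math.log2(math.pi)        # ~1.651

def overpressure(r_m, p0=47.0, r0=1230.0):
    """Overpressure (psi) at range r_m for a 1 Mt burst,
    anchored to the 47 psi @ 1230 m point in the table."""
    return p0 * (r0 / r_m) ** exponent

for r in (1230, 615, 307.5, 153.8, 76.9):
    print(f"{r:7.1f} m -> {overpressure(r):7.0f} psi")
# -> roughly 47, 148, 464, 1457, 4578 psi, close to the table's values
```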


The outer shell would just need to resist penetration by an unexploded nuke missile and force it to detonate.

The next shell (a monolithic dome with, say, 1,300 psi strength) 200 meters farther in would then be able to resist the blast.


Big nuclear explosion. Castle Bravo Blast

50 megatons
15 psi: 9.07 km
5 psi: 16.66 km
2 psi: 29.16 km
1 psi: 42.98 km


PSI     Meters from ground zero
47 4535
148 2267.5
465 1134
1461 567
4590 283


Another shell, 600 meters inside the outer radius of the largest shell, would resist a 50 megaton blast.

If the outer shell is a geodesic dome, then when pieces are knocked out then other pieces could be ready to be moved quickly back into place to regenerate the protection.

There would be other active defenses to try to shoot down missiles and attacks.

FURTHER READING:

Here is a chapter from an online reference on high performance concrete, with formulas.


Nuclear bomb effects calculations

India has a home grown nuclear fission Thorium reactor design

A novel Fast Thorium Breeder Reactor (FTBR) being developed by V. Jagannathan and his team at the Bhabha Atomic Research Centre (BARC) in Mumbai has received global attention after a paper was submitted to the International Conference on Emerging Nuclear Energy Systems (ICENES) held June 9-14 in Istanbul.

They believe their FTBR is one such 'candidate' reactor that can produce energy from the two fertile materials (thorium and depleted uranium) with some help from fissile plutonium as a 'seed' to start the fire.

By using a judicious mix of 'seed' plutonium and fertile zones inside the core, the scientists show theoretically that their design can breed not one but two nuclear fuels - U-233 from thorium and plutonium from depleted uranium - within the same reactor.

This totally novel concept of fertile-to-fissile conversion has prompted its designers to christen their baby the Fast 'Twin' Breeder Reactor.

Their calculations show the sodium-cooled FTBR, while consuming 10.96 tonnes of plutonium to generate 1,000 MW of power, breeds 11.44 tonnes of plutonium and 0.88 tonnes of U-233 in a cycle length of two years.
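The implied fissile gain is easy to check. A minimal sketch (Python) of the arithmetic on the quoted figures:

```python
# Net fissile gain for one two-year cycle of the proposed FTBR at 1,000 MW,
# using only the numbers quoted above.

pu_consumed = 10.96   # tonnes of plutonium burned
pu_bred = 11.44       # tonnes of plutonium bred from depleted uranium
u233_bred = 0.88      # tonnes of U-233 bred from thorium

net_gain = pu_bred + u233_bred - pu_consumed
breeding_ratio = (pu_bred + u233_bred) / pu_consumed

print(f"net fissile gain per cycle: {net_gain:.2f} tonnes")   # ~1.36
print(f"fissile breeding ratio:     {breeding_ratio:.2f}")    # ~1.12
```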

'At present, there are no internal fertile blankets or fissile breeding zones in power reactors operating in the world,' the paper claims.

The concept has won praise from nuclear experts elsewhere. 'Core heterogeneity is the best way to help high conversion,' says Alexis Nuttin, a French nuclear scientist at the LPSC Reactor Physics Group in Grenoble.

Thorium-based fuels and fuel cycles have been used in the past and are being developed in a few countries but are yet to be commercialised.

France is also studying a concept of 'molten salt reactor' where the fuel is in liquid form, while the US is considering a gas-cooled reactor using thorium. McLean, Virginia-based Thorium Power Ltd of the US, has been working with nuclear engineers and scientists of the Kurchatov Institute in Moscow for over a decade to develop designs that can be commercialised.

India does not have sufficient uranium to build enough thermal reactors to produce the plutonium needed for more FBRs of the Kalpakkam type.

'Jagannathan's design is one way of utilising thorium and circumventing the delays in building plutonium-based FBRs,' says former BARC director P.K. Iyengar.

July 10, 2007

Monolithic Domes for military, space and polar regions


Monolithic dome structure

The dome structures can be built quickly and are very strong.






The Monolithic Dome is the most disaster resistant building that can be built at a reasonable price without going underground or into a mountain.

A wind of 70 miles per hour blowing against a 30 foot tall flat-walled building in open flat terrain will exert a pressure of 22 pounds per square foot. If the wind speed is increased to 300 miles per hour, the pressure increases to 404 pounds per square foot (psf). A wind speed of 300 MPH is considered the maximum for a tornado; it is far greater than that of a hurricane.

Cars can be parked on 100 psf. The side pressure on the building could equal the weight of cars piled 4 high. No normal building can withstand that much pressure. Many Monolithic Domes are buried up to 30 feet deep. They must withstand pressures up to 1 ton per square foot (2000 psf).

Against tornado pressure a Monolithic Dome 100 feet in diameter, 35 feet tall would still have a safety margin of nearly 1½ times its minimum design strength. In other words, the stress created by the 300 mile per hour wind would increase the compressive pressure in the concrete shell to 1,098 psi. The shell is allowed 2,394 psi using design strengths of 4,000 psi.

The fact is the Monolithic Dome is not flat, and therefore the maximum air pressure of 404 pounds per square foot could never be fully realized against it. Nor is the concrete only 4,000 psi; it is always much greater. The margin of safety is probably more like three or four.
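Wind pressure grows with the square of wind speed, which is where the 404 psf figure comes from. A minimal sketch (Python) of that check, plus the stress numbers quoted above:

```python
# Dynamic wind pressure scales with velocity squared; scale from the
# 70 mph / 22 psf datum given above.

def wind_pressure_psf(mph, ref_mph=70.0, ref_psf=22.0):
    """Pressure on a flat wall, scaled from the 70 mph / 22 psf datum."""
    return ref_psf * (mph / ref_mph) ** 2

print(f"300 mph -> {wind_pressure_psf(300):.0f} psf")   # ~404 psf

# Stress figures quoted for the 100 ft dome under tornado load:
shell_stress_psi = 1098     # computed stress from the 300 mph wind
allowed_psi = 2394          # allowable stress with 4,000 psi concrete
print(f"allowed/computed stress: {allowed_psi / shell_stress_psi:.1f}x")
```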


The dome buildings are also highly energy efficient

The initial cost of a Monolithic Dome is usually the same as a custom built, conventional home of equal interior finish. If you planned on buying a $100,000 house, you will probably have to pay $100,000 for your dome home.

Monolithic Domes are built to high standards. All standard US homes are built as Type V fire rated structures, which means they are built entirely of combustible materials: one match and it's gone. A dome is fire rated at Type II or better. It just doesn't burn. The contents inside may, but the overall fire safety is incredibly high. This can save money in the long term by lowering the homeowner's insurance policy.

Using three inches of polyurethane foam on the outside of three inches of concrete makes the dome extremely energy efficient. Monolithic Domes require only half or less energy to heat and cool. One homeowner moved from a 1400 square foot conventional home to a 2700 square foot Monolithic Dome. His energy bill remained the same although the dome was twice as big.

A Monolithic Dome is not susceptible to termites and other creatures. It won't rot. It won't get blown away or knocked down. Mold is not a serious problem. These are only some examples of the Monolithic Dome's advantages.


However, the long-term, day-to-day costs of a Monolithic Dome will always be lower, and the true cost of owning a dome home is substantially less.


70' x 54', luxurious, beachfront property that is the home of Valerie and Mark Sigler, as well as a bed and breakfast

The Army's Rapid Equipping Force is looking at using the domes instead of tents.


Domes may eventually be deployed to forward operating bases at U.S. Central Command. Development is scheduled through August 2007, testing is slated for September, and a final decision would be made in April 2008.


Domes can be built resistant to small arms fire like rifles

A Mosque in Iraq was built from a monolithic dome. The structure survived a 5000lb bomb, although it will need extensive interior repair.

With the help of Iraqi laborers, it took only 4 1/2 months to complete 28 domes. A large group of Iraqis were taught to spray polyurethane foam while a Canadian crew hung steel and sprayed concrete. With their combined efforts they were able to complete one of the grain storage domes in just 4 1/2 days.


A future alternative is to use nanomaterials for the reinforcement instead of steel rebar. Currently Kevlar and other materials could be used, but they have higher costs.

FURTHER READING:
NASA is looking at inflatable structures (other makers) for lunar bases


The "planetary surface habitat and airlock unit"

Inflatable bases for the Arctic: Bolonkin and Cathcart have attracted attention with their Arctic proposal, but a prototype has not yet been constructed.

Economists allege that the mean 2006 USA Dollar value of Polar Region land territory is generally low compared to the world total of ~$250,000/km2. For example: Antarctica ~$40/km2, Greenland ~$650/km2, Canada ~$77,000/km2 and Russia ~$106,000/km2. However, world economic productivity data show that the 2006 USA dollar output per capita in the Earth-biosphere is greatest in Polar Regions; cold regions have output per capita that is approximately 10-12 times that of the Earth’s Tropic Zones.

Suggested initial macroprojects could be small (10 m diameter) houses, followed by an “Evergreen” dome in the Arctic or Antarctica covering a land area of 200 x 1000 m, with irrigated vegetation, homes, open-air swimming pools and playgrounds. The house and “Evergreen” dome have several innovations: a Sun reflector, double transparent insulating film, controllable jalousies coated with reflective aluminum and an electronic cable mesh inherent to the film for dome safety/integrity monitoring purposes. By undertaking to construct a half-sphere house, we can acquire experience in such constructions and explore more complex constructions. By computation, a 10 m diameter home has a useful floor area of 78.5 m2 and an airy interior volume of 262 m3, covered by an envelope with an exterior area of 157 m2. Its film enclosure material would have a thickness of 0.0002 mm with a total mass of 65 kg. A city-enclosing “Evergreen” dome of 200 x 1000 m could have the calculated characteristics: useful area = 2.3 x 10^5 m2, useful volume 17.8 x 10^6 m3, and an exterior dome area of 3.75 x 10^5 m2, comprised of a film of 0.0002 mm thickness and 145 tonnes. If the “Evergreen” dome were formed with concrete 0.2 m thick, the mass of the city-size envelope would be 173,000 tonnes, which is a thousand times heavier.

California population projected at 60 million in 2050

I had an article about the USA's and California's projected populations in Q1 of 2007.

There is a new San Francisco Chronicle article about California's expected population.

In the new report, state demographers used the latest county population estimates as a baseline to make assumptions about future migration patterns. The formulas they used accounted for undocumented immigrants, Martindale said.

Future predictions also have to take the economy into account, he said. It's unclear how many Hispanics will move out of California by 2042, especially if the state remains one of the country's most expensive places to live.

By midcentury, it's estimated that Hispanics will comprise 52 percent of California's 59.5 million residents.



The projections are from the California Department of Finance, which tends to have higher projections than some other sources. I mostly agree with these projections.

Software and power key to supercomputers beyond petaflop

IBM and Sun have announced petaflop supercomputers. They indicate that there are no barriers to more computer power but the most important issues going forward are to manage and reduce power usage and to improve software.

"The hardware speed [of supercomputers] will not reach a plateau," said Simon See, Sun Microsystems' director for Advance Computing Solution, System Practice and Global Science and Tech Network, in an e-mail interview with ZDNet Asia. "However, what might prevent effectiveness could be the software layer on the hardware."

"Accelerator chips speed up performance by taking over some computing tasks from the main processor," he explained. "CBE is good at graphics, so a CBE accelerator would perform graphics-intensive calculations."

Dunn noted that while there does not appear to be any engineering or technical hurdles that would keep supercomputing speeds from gaining, "the biggest issue is probably the ability to fund the necessary innovations".



Scaling up to a petaflop with IBM's Blue Gene/P

July 09, 2007

Air pollution index that is tied to health risk

Toronto, Canada is introducing an air pollution index that is correlated to health risk. This is a good way to make the health risks of air pollution something that people can be aware of on a daily basis. If this is used globally it could help motivate replacing or cleaning up coal and fossil fuel (diesel oil, gasoline) power sources. The fastest ways are through a combination of conservation, nuclear power, hydro power and renewable power.

DNA synthesis costs and projections

DNA synthesis: the productivity of DNA synthesis technologies has increased approximately 7,000-fold over the past 15 years, doubling every 14 months.
Costs of gene synthesis per base pair have fallen 50-fold, halving every 32 months. At the same time, the accuracy of gene synthesis technologies has improved significantly.

If these technologies continue to improve at the exponential rates achieved historically, the cost of sequencing will fall to less than $0.01 per base pair by 2010 and the cost of gene synthesis will fall to less than $0.10 per base pair.

However, George Church in a 2006 interview indicated the following prices (already lower than the projection for 2010):


Right now the cost of synthesizing a base [using conventional technology] is about 10 cents. That's the current street price for raw oligonucleotides. For synthesizing simple genes, it's more like $1.30 a base. [Our method] can manufacture oligonucleotides at .01 cent per base.


it [DNA sequencing and synthesis] is actually a very fundamental concept. There is almost no synthesis that doesn't involve sequencing, and vice versa. And that is why I have really emphasized this connection in my lab. They are very synergistic.


Projected DNA synthesis costs (halving cost about every 2.5 years)

2006: 10 cents per BP
2007: 0.01 to 5 cents per BP
2010: 0.005 to 2 cents per BP
2013: 0.002 to 1 cents per BP
2015: 0.001 to 0.5 cents per BP
2017: 0.0005 to 0.02 cents per BP


2007: 100 BP synthesized for 1 cent
2010: 200 BP for 1 cent
2013: 400 BP for 1 cent
2015: 1,000 BP for 1 cent
2017: 2,000 BP for 1 cent


Projected DNA synthesis maximum lengths (doubling every 14 months)


Year: maximum sequence length
2007: 45K (common) to 580K BP (record)
2009: 200K (common) to 2M BP
2010: 400K (common) to 4M BP
2012: 1.5M BP to 16M BP
2014: 6M BP to 64M BP
2016: 25M BP to 250M BP
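The tables above follow from the stated rates. Here is a minimal sketch (Python) that regenerates them from the 2007 starting points; note that the article's rounded figures run somewhat ahead of a strict 14-month doubling:

```python
# Cost halves roughly every 2.5 years; maximum synthesized length
# doubles roughly every 14 months. Starting points from this article.

cost_2007_cents = 1.0 / 100     # ~100 BP per cent in 2007 (cheap track)
length_2007_bp = 580_000        # 2007 record synthesis length

for year in (2007, 2010, 2013, 2015, 2017):
    cost = cost_2007_cents * 0.5 ** ((year - 2007) / 2.5)
    print(f"{year}: ~{1 / cost:,.0f} BP per cent")
# -> ~100, 230, 528, 919, 1600 BP/cent vs the table's 100-2,000

for year in (2007, 2009, 2010, 2012, 2014, 2016):
    length = length_2007_bp * 2 ** ((year - 2007) * 12 / 14)
    print(f"{year}: record-track length ~{length / 1e6:.1f}M BP")
# -> ~0.6M to ~120M BP by 2016, vs the table's more optimistic 250M
```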


The global market for DNA sequencing technology and related services exceeded $7 billion in 2006. The market for synthesis reagents and synthesis services is nearly $1 billion.

Synthetic biology uses viruses to fight biofilms

Synthetic biology used to make viruses to combat harmful 'biofilms'


This diagram shows how an engineered virus, T7, destroys a biofilm composed of E. coli bacteria. Graphic courtesy / Timothy Lu and James Collins

In one of the first potential applications of synthetic biology, an emerging field that aims to design and build useful biomolecular systems, researchers from MIT and Boston University are engineering viruses to attack and destroy the surface "biofilms" that harbor harmful bacteria in the body and on industrial and medical devices.

They have already successfully demonstrated one such virus, and thanks to a "plug and play" library of "parts" believe that many more could be custom-designed to target different species or strains of bacteria.

The work, reported in the July 3 Proceedings of the National Academy of Sciences, helps vault synthetic biology from an abstract science to one that has proven practical applications. "Our results show we can do simple things with synthetic biology that have potentially useful results," says first author Timothy Lu, a doctoral student in the Harvard-MIT Division of Health Sciences and Technology.

They found that their engineered phage eliminated 99.997% of the bacterial biofilm cells, an improvement of two orders of magnitude over the phage's nonengineered cousin.
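In survivor fractions, that improvement looks like this. A minimal sketch (Python); the nonengineered figure is inferred from the quoted two-orders-of-magnitude improvement, not stated directly in the article:

```python
# "99.997% removal, two orders of magnitude better" in survivor fractions.

engineered_survivors = 1 - 0.99997             # 3e-5 of cells remain
nonengineered_survivors = engineered_survivors * 100   # 100x worse (inferred)

print(f"engineered phage:    {engineered_survivors:.5%} of cells survive")
print(f"nonengineered phage: {nonengineered_survivors:.3%} survive "
      f"(i.e. ~{1 - nonengineered_survivors:.1%} removal)")
```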

"We hope in a few years, it will be easy to create libraries of phage that we know have a good chance of working a priori because we know so much about their inner-workings," says Lu.

Synthetic biology also makes it possible to control the timing of when a gene is expressed in an organism. For instance, Lu inserted the DspB gene into a precise location in the T7 genome so that the phage would strongly express it during infection rather than before or after.

Though phages are not approved for use in humans in the United States, the FDA recently approved a phage cocktail to treat Listeria monocytogenes on lunchmeat. This makes certain applications, such as cleaning products that include phages to clear slime in food processing plants, more immediately promising. Another potential application: phage-containing drugs for use in livestock in place of, or in combination with, antibiotics.

Synthetic biology and synthetic life milestones

Synthetic biology and synthetic life are about to make several major milestones

Note: the synthesis of 580,000 base pairs of DNA is significant. 3 million base pairs make up a ribosome. Being able to synthesize 580,000 base pairs in 2007 suggests that we could be 1 to 2 years from synthesizing our own ribosomes. It could also mean that there may not be serious hurdles to synthesizing very long sequences of millions and billions of base pairs. There is the question of error rates, but synthesized sequences could be error corrected. This could be a powerful bootstrapping method for achieving molecular manufacturing. DNA nanotechnology would see rapid leaps in capability.

Synthesizing and replacing a bacterium's genome, and synthesizing DNA 13-15 times longer than the previous record:
Scientists at the J. Craig Venter Institute in Rockville, Md., hope to take a giant stride in synthetic biology by creating a piece of DNA 580,076 units in length from simple chemicals, chiefly the material that constitutes DNA’s four-letter chemical alphabet. This molecule would be an exact copy of the genome of a small bacterium. Dr. Venter says he then plans to insert it into a bacterial cell. If this man-made genome can take over the cell’s functions, Dr. Venter should be able to claim he has made the first synthetic cell.

Though human cells effortlessly duplicate a genome of three billion units, the longest piece of DNA synthesized so far is just 35,000 [I have seen papers claiming 45,000 base pairs] units long.


Synthetic biologists have lofty goals

Adherents of the [synthetic biology] movement held their third annual conference last month in Zurich, but their creations are still at the toy rocket stage: a dish of bacteria that generates a bull’s eye pattern in response to the chemicals in its environment; a network of genes that synthesizes the precursor chemical to artemisinin, an anti-malaria drug. “The understanding of networks and pathways is really in its infancy and will be a challenge for decades,” says James J. Collins, a biomedical engineer at Boston University.

That hasn’t stopped synthetic biologists from dreaming. “Grow a house” is on the to-do list of the M.I.T. Synthetic Biology Working Group, presumably meaning that an acorn might be reprogrammed to generate walls, oak floors and a roof instead of the usual trunk and branches. “Take over Mars. And then Venus. And then Earth” —the last items on this modest agenda.

“The real killer app for this field has become bioenergy,” Dr. Collins says. Under the stimulus of high gas prices, synthetic biologists are re-engineering microbes to generate the components of natural gas and petroleum. Whether this can be done economically remains to be seen. But one company, LS9 of San Carlos, Calif., says it is close to that goal. Its re-engineered microbe “produces hydrocarbons that look, smell and function” very similarly to those in petroleum, said Stephen del Cardayre, the company’s vice president for research.


FURTHER READING

Cost of DNA synthesis: 10 to 70 cents per base pair in early 2007 (maybe 0.01 cent per BP with the George Church process) and projected to be about 1/2000th of one cent per base pair in 2016

it [DNA sequencing and synthesis] is actually a very fundamental concept. There is almost no synthesis that doesn't involve sequencing, and vice versa. And that is why I have really emphasized this connection in my lab. They are very synergistic.


Synthetic biology is making viruses that are over 100 times more effective at fighting biofilms

DNA factories being made that are anticipating the DNA synthesis boom

Codon Devices website

Ethics and Gene Therapy

A New York Times review of THE CASE AGAINST PERFECTION Ethics in the Age of Genetic Engineering. By Michael J. Sandel.

When norms change, you can always find old fogeys who grouse that things aren’t the way they used to be. In the case of football, Sandel finds a retired N.F.L. player to support his contention that today’s bulked-up linemen are “degrading to the game” and to players’ “dignity.” But eventually, the old fogeys die out, and the new norms solidify. Sandel recalls a scene from the movie “Chariots of Fire,” set in the years before the 1924 Olympics, in which a runner was rebuked for using a coach. Supposedly, this violated the spirit of amateur competition. Today, nobody blinks at running coaches. The standpoint from which people used to find them unseemly is gone.

To defend the old ways against the new, Sandel needs something deeper: a common foundation for the various norms in sports, arts and parenting. He thinks he has found it in the idea of giftedness. To some degree, being a good parent, athlete or performer is about accepting and cherishing the raw material you’ve been given to work with.


I view the difference between having gene therapy and not having it as like moving from five-card draw, which we currently play, where we all get random cards, toward Omaha: choosing the best two of the four cards you are dealt and the best three of five community cards. If we knew which cards were best, everyone would gravitate toward taking royal flushes, but just having the genes does not control the environmental factors, so sometimes it would have been better to have a full house or four of a kind based on what happens in the environment. How to judge the cards is still not clearly defined. If it turns out that the low hand wins, then a more flexible approach would be to take a low flush, A-2-3-4-5: a little wheel that is both a good low hand and a good high hand.

July 08, 2007

China Yuan rising

Goldman Sachs strategists predict the People's Bank of China will let the currency advance 7.5 percent in the coming year, while JPMorgan predicts the yuan will climb 10.6 percent by March.

The yuan appreciated 1.5 percent last quarter as prices for items such as food, rent and transportation increased the most in 27 months. Goldman Sachs Group Inc. and JPMorgan Chase & Co. predict that inflation will force America's second-largest trading partner to let its currency strengthen at least 7.5 percent in the next year, more than twice as much as in 2006.


If this currency appreciation occurs and continues, then my prediction that China's economy will pass the USA's in overall size on an exchange-rate basis by 2020 will come true.
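To give a rough sense of why appreciation matters so much here, a hedged sketch (Python) of the catch-up arithmetic. The GDP baselines and growth rates below are my illustrative assumptions, not figures from this post:

```python
# Exchange-rate-basis catch-up: China's dollar GDP grows by real growth
# times currency appreciation. All inputs below are assumed for illustration.

china_gdp = 3.4e12    # ~2007 China GDP at market exchange rates (assumed)
usa_gdp = 13.8e12     # ~2007 USA GDP (assumed)

china_real_growth = 0.10     # assumed real growth
yuan_appreciation = 0.075    # the ~7.5%/yr appreciation discussed above
usa_growth = 0.05            # assumed nominal US growth

year = 2007
while china_gdp < usa_gdp:
    china_gdp *= (1 + china_real_growth) * (1 + yuan_appreciation)
    usa_gdp *= 1 + usa_growth
    year += 1

print(f"crossover year under these assumptions: {year}")   # lands near 2020
```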
