August 22, 2008

Progress on India's Thorium Nuclear Reactor and South Africa's Pebble Bed

Progress on South Africa's Pebble Bed Reactor
Canada's SNC-Lavalin has won a C$253 million contract to help build the second phase of a demonstration Pebble Bed Modular Reactor (PBMR) for completion by 2014 in Koeberg, South Africa. The small advanced reactor, a South African national project, would produce 165 MWe and could be built in 'packs' of eight. It is hoped that up to 30 of the units will be used in South Africa in coming decades, taking on industrial heat-supply roles in the production of hydrogen and synthetic oils as well as generating electricity. The PBMR design is also a contender for construction in the USA under the Next Generation Nuclear Plant project.

India's Thorium Reactor
The head of the Mumbai reactor design and development group, Ratan Kumar Sinha, spoke to IEEE Spectrum about India's thorium reactor design and plans. The thorium reactor will produce less waste (unburned fuel) than current reactors and is designed to operate for 100 years, versus 30-60 years for current reactors.

In April 2008, India started a test reactor for its thorium design, which has a flexible configuration and allows use of a range of fuel materials. "We can even physically shift the distance between fuel rods," Sinha says. "Here we are able to simulate the reactor almost 100 percent."

They have used the well-proven pressure-tube technology and introduced many passive safety features, a distinguishing one being the reactor's ability to remove core heat by natural circulation of coolant under normal operating and shutdown conditions. This eliminates the need for nuclear-grade circulating pumps, which, besides providing economic advantages, enhances reliability.

They have also introduced passive shutdown of the main heat transport system in case of a failure of the wired shutdown system. Using mechanical energy from the increased steam pressure, the system injects neutron poison into the moderator [that sustains the nuclear chain reaction]. There are several other safety features, which are important because they allow the reactor to be built close to population centers.

Sinha: This is a vertical, pressure-tube-type, heavy-water-moderated, and boiling-light-water-cooled natural circulation reactor. The fuel assembly is 10.5 meters in length and is suspended from the top in the coolant channel. The fuel cluster has 54 pins arranged in three concentric rings around a central rod. The 24 pins in the outer ring have thorium-plutonium as fuel, and the 30 pins in the inner and middle rings have thorium-uranium-233 as fuel. The plutonium pins are placed in the outer ring to minimize the plutonium requirement. The thorium provides 60 percent of the reactor's power.

The reactor is designed [to last] 100 years. Present-generation reactors have a design life of about 40 years, and many of the reactors in the West have been extended beyond that. However, what goes inside the core of our advanced reactors will have a lifetime of [only] about 30 years, so the design includes replacement of the material twice in the life of the reactor, which can be carried out during normal annual shutdowns. The reactor is also designed for on-power fueling.

The reactor will produce 300 megawatts of electricity and 500 cubic meters per day of desalinated water for its own purposes.

The perennial challenge was to match the reactor's physics requirements with the heat-removal requirements of the core. Physicists wanted to bring moderator use down as low as possible, which meant the reactor had to be made very compact, with fuel rods placed as close to one another as possible. The fuel rod spacing had to be reduced from the standard 270 millimeters to 245, and finally to 225 millimeters—something not attempted anywhere before. And that tremendously improved the performance of the reactor.

Another innovation was in differentially enriching the fuel [that is, boosting its fissile content] at the top and bottom of the central rod. The upper half has 2.5 percent enrichment; the lower half has 4 percent enrichment. This caused the power to jump from 230 MW to 300 MW.

Why was thorium not economical?
Sinha: Thorium has a much lower neutron multiplication rate than plutonium, and hence you cannot achieve power levels in a reactor as high as with plutonium. When burned, thorium initially acts like a blotting paper for neutrons and keeps absorbing them. But this absorption also means it is getting enriched and converted into U-233, which will pay dividends later on. Once the energy generated has reached 40 000 megawatt-days per metric ton, U-233 starts contributing many more neutrons than what has been lost in absorption by thorium. So you tend to get economic benefits of thorium if you have a fuel that can run up to 40 000 MWd/t and beyond. But most early generation reactors had lower burn-up values of around 15 000 to 20 000 MWd/t. These have, of course, risen to about 40 000 MWd/t in recent times. So the world is now thinking of thorium.
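To put those burn-up figures in perspective, here is a rough conversion from burn-up to delivered electricity; the ~33 percent thermal-to-electric efficiency is my assumption, not a figure from the interview.

```python
# Rough conversion from fuel burn-up to delivered electricity.
# Assumption (not from the interview): ~33% thermal-to-electric efficiency.

def electricity_per_tonne(burnup_mwd_per_tonne, thermal_efficiency=0.33):
    """Return GWh of electricity per tonne of fuel at a given burn-up."""
    thermal_mwh = burnup_mwd_per_tonne * 24          # 1 MWd = 24 MWh (thermal)
    return thermal_mwh * thermal_efficiency / 1000   # GWh(e) per tonne

for burnup in (15_000, 20_000, 40_000):
    print(f"{burnup:>6} MWd/t -> {electricity_per_tonne(burnup):,.0f} GWh(e)/tonne")
# 40,000 MWd/t works out to roughly 320 GWh(e) per tonne of fuel, more than
# double the early-generation 15,000-20,000 MWd/t figures.
```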

Progress towards a Helium atom microscope

Electron microscopes are great for magnification, but they tend to destroy or damage what they are looking at. Similar magnification should be possible using a much lower-energy, gentler beam of helium atoms and recording how the atoms are scattered by a sample. Until now, only about 1% of helium atoms could be reflected and focused from thin-film silicon.

Vázquez de Parga and his team found they could avoid surface bumps by depositing the lead onto the silicon surface at low temperatures, between -173 and -133°C. The end result is a perfectly smooth lead film that can act as an almost flawless mirror. The surface is atomically flat: more than 90% of the film is exactly the same thickness, down to the level of individual lead atoms. It can focus more than 15% of incoming helium atoms into a tight beam, and Vázquez de Parga hopes to increase this proportion to 40%.

Bill Allison at Cambridge University, UK, leads a team experimenting with thin silicon mirrors to focus beams of helium. "[This work] represents a key step forward in producing a device to focus helium atoms," he says.

"The remaining step is to combine the high reflectivity with a carefully deformed surface in order to create a focused atomic spot. That is still quite a challenge."

August 21, 2008

Carnival of Space Week 68

Crowlspace hosts the carnival of space week 68.

This site supplied an analysis of the technology that an interstellar-capable civilization should have, and an alternative to big antimatter rockets: mirror and laser array propulsion.

Centauri Dreams considers the difficulties of interstellar travel.


Crowlspace looks at antimatter propulsion.

Check out the carnival of space week 68 for a lot more.

Philip Moriarty discusses Molecular Nanotechnology Validation experiment plans

In the comments on the Center for Responsible Nanotechnology blog, Philip Moriarty discusses his plans for testing the viability of positionally controlled atom-by-atom fabrication of diamondoid materials, as described in the Freitas-Merkle minimal toolset theory paper.

A combination of low temperature tuning fork (Qplus) AFM, STM, and tunnelling spectroscopy (dI/dV and d2I/dV2, i.e. inelastic tunnelling spectroscopy) will be used to implement and/or characterise scanning probe-driven mechanosynthesis reactions on diamond (C(100)) in UHV and at temperatures in the 4 K - 300K range. Initially we will need to demonstrate atomic resolution on C(100). We will then explore some of the ideas in Freitas and Merkle's "minimal toolset" paper in order to extract hydrogen from a H-passivated C(100) surface and subsequently add a carbon dimer.

As regards verification, a key goal is for theory and experiment to run in parallel, one reinforcing the other. For example, we will aim to reproduce experimental force-distance spectra (measured as a tip approaches a diamond surface during a mechanosynthesis reaction) using DFT calculations. You ask whether the research includes "seeking out work-arounds". Yes, most definitely! There's an interesting quote from a recent international review of UK materials research that should be printed in bold capital letters on the front of all documentation produced by funding bodies, viz.: "Research is always about risk taking, no matter whether the risk involves failure to meet a certain set of expectations or failure to create truly new, significant knowledge or understanding of a problem. To be clear, if the outcome of the effort can be anticipated, it is highly questionable whether this effort should be called research."

When the project gets going I will aim to set up a blog that will report on progress.


Philip also had a comment about diamond versus graphene nanotechnology.

It's important to note that the diamond mechanosynthesis proposal focuses specifically on diamond and, indeed, on a particular challenge which I first raised in my debate with Chris Phoenix a few years back: scanning probe "epitaxy" of a row of carbon dimers using purely force-driven reactions on hydrogen-passivated diamond. Rob Freitas and Ralph Merkle's recent minimal toolset paper has been particularly important in defining the plan and objectives of the proposal and I want to stay focused on this, rather than move to graphene.

Graphene is, of course, a very interesting system and it's possible that we may explore this in the course of the five year mechanosynthesis grant. My suspicion, however, is that achieving basic "mechanoepitaxy" on diamond will take at the very least five years!

As regards [Jim Moore's] points:

1. Being able to hold a sheet of graphene away from another surface may well be useful for longer term mechanosynthetic work but the primary objective of the EPSRC-funded work is to demonstrate the validity of a small number of mechanosynthesis reactions - which have been explored by Freitas and Merkle via DFT calculations using very many thousands of CPU hours - on a bulk diamond surface.

2. This is actually a rather challenging way of detecting a successful operation. It will be more straightforward to use the scanning probe itself - through force-distance, I(V) and d2I/dV2 (inelastic) tunnelling spectroscopy - to monitor a successful mechanosynthesis reaction event.

3. The metastability of diamond with respect to graphite/graphene is not really an issue here. The H:C(100) surface represents an excellent platform for site specific scanning probe-driven chemistry. Drexler understood this very well - his choice of diamond(oid) in Nanosystems was very well-informed.

4. Hmmmm. Yes, graphene is certainly a well-funded area but it may not always be a good idea to chase current trends in order to secure money for research!

Stem Cells for Unlimited Blood Supply Could Provide the Money for Stem Cell Life Extension


Advanced Cell Technology is the company behind the new stem cell blood breakthrough, which could supply unlimited disease-free blood.
The company's stock is up only 50%, to a valuation of $6.7 million. BioTime is a related company.

Robert Lanza is the chief scientist behind Advanced Cell Technology and was featured in Discover Magazine. He is also working on using stem cells to cure spinal injury, regenerate limbs and extend life span. Progress has been delayed by insufficient funding and regulations. With the blood breakthrough, funding could be less of a problem.

We have cells that reverse paralysis in sheep that have spina bifida and can’t walk. After we injected our cells, the first animal that we treated returned to normal and was walking fine. The same model could work for paralyzed humans, but without funding, we haven’t been able to repeat the experiment in five years. People are in wheelchairs when there could be a cure.

We’re continuing [the work of harvesting embryonic stem cells from human clones], but with less urgency since the discovery of induced pluripotent stem cells, or iPS cells—adult cells that have been reprogrammed back to an embryonic state. We’re working on new ways to reprogram skin cells that would allow us to safely create a bank of stem cell lines that would closely match the population as a whole. It turns out that only 100 cell lines could give you a complete haplotype, or immune, match for 50 percent of the U.S. population. These reprogrammed cells are not as controversial since you don’t use cloning or embryos.

Hemangioblasts and Life Extension
It turns out that the human life span plateaus as it approaches a roof of about 120. By eliminating infectious diseases, some chronic diseases, and cancer, we can get the life span past 100. I think with tissue engineering we can patch you together like a bicycle tire, replacing a kidney with a kidney and a heart with a heart, to about 120 years. That was always my thinking: That was the limit. But with these hemangioblasts, I now have questioned my own rules. These cells can go in and fix the damaged tissue inside, almost like nanoparticles. We may be able to do the same thing with similar cell lines for neurons, where we can repair the damage in the brain itself. So if it continues the way it’s going, we may break that ceiling, like breaking the sound barrier. I’d be very hesitant to put a lid as to where longevity is going to go.


We recently published a paper on a cell we created called a hemangioblast, which exists only transiently in the embryo but not in the adult. I think of them like unicorns, these elusive cells that we had hypothesized and sought for years. With the ability to become all of the blood cells—including your immune cells, red blood cells, all of your blood system, as well as vasculature—hemangioblasts have been biology’s holy grail. What we discovered is that we can create literally millions or billions of these from human embryonic stem cells. Now that we have them, we are harnessing, for the first time, one of nature’s early, most profoundly powerful cellular building blocks. The point is, we can use transient, intermediate cells like hemangioblasts as a toolbox to fix the adult so you don’t have to have limbs amputated, so you may not have to go blind, to prevent heart attacks. We can direct their development into different cell types by adding certain molecules to them as they divide.

Hemangioblasts can cut heart attack deaths in half
We found that when we injected these cells into a damaged, ischemic limb, there was almost 100 percent restoration of blood flow in a month. Before, the limb would have been amputated, but now it was restored. As to heart attack, injection of the cells cut the death rate in half.

Hemangioblasts can rebuild a fresh immune system
There are more than 80 autoimmune diseases. What’s interesting is that when you do a bone marrow transplant for cancer, some of those with autoimmune disease go into remission, as if the immune system has been eliminated and allowed to rebuild from scratch. Using hemangioblasts that are the progenitors of the immune system, we’re hoping we can replace the immune cells too.

Hemangioblasts equivalents for other kinds of cells
The way to think of this is that you have a tree with branches that give rise to all of the different tissue types of the body. The hemangioblast, for instance, gives rise to one branch—to blood cells, vessels, and the immune system. But there are also neural stem cells as well as early progenitors that have this plasticity in most of the other systems of the body. Right now we’re trying to discover how to isolate and expand them.

FURTHER READING
How much can life be extended

Hemangioblasts

The concept of the hemangioblast derives from the work of Florence Sabin in 1920. Her work on the development of chick embryos led her to propose the existence of an angioblast, or a vascular precursor cell. Later, work by Murray expanded on Sabin's work, noting that cells in the mesoderm (a region in the embryo where blood and early vasculature form) flattened to form endothelial cells (the interior lining of the blood vessel) at the same time as blood development.

From this evidence, Murray proposed that a common precursor existed. He termed this the hemangioblast. A 2003 review, "Converging Roads: Evidence for the Adult Hemangioblast," summarized the research of the previous eighty-three years.

Aubrey de Grey Interview at betterhumans

Aubrey de Grey interview on longevity, life extension and Strategies for Engineered Negligible Senescence (SENS) at betterhumans.com

Estimate of the timing of results:
I [Aubrey] think there's a 50% chance of getting the first-generation SENS therapies working within 25-30 years. But that's only an estimate, and it's a highly speculative one: I think there's at least a 10% chance that we will hit so many unforeseen problems that we won't get there for 100 years. This is not something special about SENS, though: any technology that's two or more years away could easily be 100 years away.


Which anti-aging treatments will come soonest?
Probably the closest is in fact not the enzyme therapy you mention, but the use of vaccines to eliminate extracellular aggregates (especially amyloid). But when we consider the others, actually I wouldn't like to make the call, because the hardest ones are the ones that the Methuselah Foundation and I are prioritising in terms of the early research. In other words, we're hoping that they will start to catch up with the easier ones. I suspect that the challenge of genetically modifying a high proportion of cells by somatic gene therapy will have been largely solved before we complete the development of all the genes that we want to introduce.


What will early anti-aging stem cell treatments be like ?
Stem cell treatments are mostly done with injections, but some of them (including ones for WILT) will be done from the outside, using technologies borrowed from cosmetic surgery for example. At first, these therapies will be quite experimental, so people will need to be monitored carefully for a while, but that will be a temporary phase. I don't think the patient will feel unusual or experience any change of appearance. There should be no perceptible “macro-changes” as a result of WILT—the idea is to stop cancers from getting large enough to be noticed at all.


SENS will upgrade over time
Because SENS has so many components, it’ll be undergoing enhancements continuously after its first version arrives. Most of these enhancements will be conceptually quite minor - one more enzyme to get rid of a slowly accumulating target junk molecule, one more cell type that we need to replenish because too many cells have died, etc. But you’re right; occasionally there will need to be more major enhancements. I don’t know in detail how we’d cope with non-specific mutation accumulation - if I did, I'd suggest that we deal with it now, just in case my analysis of the non-importance of those mutations in a currently normal lifetime is wrong - but I expect it’ll involve an increased use of tissue engineering for most tissues, and in those where that’s impractical (especially the brain) a very mild stimulation of cell death combined with cell replacement from stem cells.

French process to extract uranium from incinerator ash

Areva and the University of Idaho have signed an agreement to develop technology for recovering uranium from incinerator ash at Areva's uranium fuel plant in Richland, Washington state. The process also reduces the amount of ash classified as radioactive waste.

Chien Wai, a chemistry professor at the University of Idaho, has developed a process that uses supercritical fluids to dissolve toxic metals. When this process is coupled with a purifying process developed in partnership with Sydney Koegler, an engineer with Areva and former student at the University of Idaho, enriched uranium can be recovered from the ashes of contaminated materials.

A supercritical fluid - in this case carbon dioxide (CO2) - is any substance raised to a temperature and pressure at which it exhibits properties of both a gas and a liquid. When supercritical, the substance can move through a solid like a gas, yet dissolve compounds like a liquid. CO2 reaches its supercritical state at a pressure of about 7.4 MPa and a temperature of just over 31°C. When the fluid's pressure is returned to normal, it becomes a gas and evaporates, leaving behind only the extracted compounds. Wai commented that supercritical CO2 has been used for decades to remove caffeine from whole coffee beans.
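A minimal sketch of that state check, using the well-established critical constants for CO2 (the helper function is my own illustration):

```python
# Critical point of CO2 (well-established values): ~31.1 degC, ~7.38 MPa.
CO2_T_CRIT_C = 31.1
CO2_P_CRIT_MPA = 7.38

def is_supercritical_co2(temp_c, pressure_mpa):
    """True when CO2 is above both its critical temperature and pressure."""
    return temp_c >= CO2_T_CRIT_C and pressure_mpa >= CO2_P_CRIT_MPA

print(is_supercritical_co2(40.0, 10.0))  # True: typical extraction conditions
print(is_supercritical_co2(25.0, 10.0))  # False: below critical temperature,
                                         # CO2 is a liquid at this pressure
```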

Areva plans to apply the process to recover uranium from 32 tonnes of ash at its Richland nuclear fuel plant. In addition to the recovery of two tonnes of uranium, the radiotoxicity of the post-process ash is reduced, thereby allowing some to be reclassified as other than low-level waste (LLW).

Construction of the ash-uranium recovery plant will begin in 2008 and should be operational in 2009. It will take about one year to process the 32 tonnes of ash at Richland, after which the plant could process ash from other LLW generators in the nuclear energy and nuclear medicine industries.

LLW from a nuclear plant

Waste volumes (cubic metres), reprocessing versus once-through fuel cycle:

Waste type    Reprocessing    Once-through
LLW           15,152          20,060
ILW           36              11
HLW           5               40



In other nuclear news, Canada is extending the operating life of the Gentilly 2 nuclear reactor until 2040

FURTHER READING
Other methods of handling low level waste


A large nuclear power plant that generates electricity for a half million people produces approximately 25 tons of spent fuel annually. An equivalent coal plant produces 10 million tons annually of air pollutants, potentially hazardous ash, and other wastes.

Nuclear waste discussed at Berkeley

Electromagnetic Pulse Risk not Total

The Wall Street Journal and other sources have been discussing the threat of an Electromagnetic Pulse attack on the USA.

The counter to these claims is that only 1% of the street lights in Hawaii were affected by the Starfish Prime test (a 1.4 megaton thermonuclear weapon detonated 250 miles above Johnston Island in the Pacific in 1962).

The 1962 bomb affected street lamps, circuit breakers, cars and radio stations in Hawaii, about 800 miles away. Still, even there the effect was far from comprehensive. Los Alamos National Laboratory physicist Michael P. Bernardin said that "the 30 strings of failed streetlights [from Starfish Prime's EMP] represented only about one percent of the streetlamps on Oahu at the time." And noted physicist Richard Garwin said the Starfish detonation "had barely noticeable effects on military systems."

Starfish Prime is discussed at Wikipedia

Stanley Jakubiak's statement and research suggest that actual EMP damage would not be total.

Testing of commercial off the shelf (COTS) equipment has allowed us to make some observations regarding the vulnerability of COTS equipment to a range of EMP environments that may be of some use in assessing the impact of an EMP environment on the unprotected commercial infrastructure. In general, it is possible that some equipment upset can occur when the EMP environment field strengths are between 3 – 8 kilovolts per meter (kV/m). When the field strengths reach above 8 kV/m the risk that some equipment will upset becomes more probable. In the range of 7 – 20 kV/m there is a possibility that some equipment will be damaged, above 20 kV/m damage is probable. Results from some recent testing of COTS computer equipment in September 1998 reconfirmed these observations.
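Those overlapping bands suggest a rough risk ladder for unprotected equipment. The sketch below simplifies them into a single classifier; the category names and the cutoffs between the overlapping bands are my own reading, not Jakubiak's:

```python
def cots_emp_risk(field_kv_per_m):
    """Rough risk category for unprotected COTS equipment, simplifying the
    overlapping field-strength bands quoted above (kV/m)."""
    if field_kv_per_m < 3:
        return "upset unlikely"
    if field_kv_per_m < 7:
        return "upset possible (3-8 kV/m band)"
    if field_kv_per_m < 20:
        return "upset probable, damage possible (7-20 kV/m band)"
    return "damage probable (above 20 kV/m)"

for field in (2, 5, 10, 25):
    print(f"{field:>3} kV/m: {cots_emp_risk(field)}")
```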


So the super-EMP threat is only credible at this time from Russia and China (20-megaton bombs with amplified EMP effects that could reach across a country). It is clear that EMP does cause problems, and many places would have blackout issues if a large EMP (particularly from nuclear EMP devices) were generated. Blackouts would be extensive but not total, and full recovery would take time.

Systems hardening should be done, but it should be performed with a prudent, cost-conscious upgrade plan.


There is vulnerability but the 90% death figure is an overblown threat assessment.

Many people have camping gear and would be able to heat water to purify it. The obesity of many Americans would come into play, providing more time to prevent starvation. Only a fraction of even unprotected equipment would be disabled by an attempted nationwide disruption. There is shielded equipment that would not be affected.

There is a level of spare parts, and hydro, coal and nuclear plants would be relatively easy to get back online. There are some spare parts to get some level of water service back. It would be disruptive and a problem, but there would not be a complete blackout. Plus there is old equipment at some older power plants and military bases from the Cold War days, and some buildings may already have a Faraday cage (electromagnetic shielding) type setup.

Since proper remediation would add only 1-5% to costs, key infrastructure should be upgraded against other vulnerabilities at the same time.

There is the Critical Infrastructure Protection (CIP) program.

The CIP researchers are aware of the issue and have written papers on it. There is sufficient money going to the Department of Homeland Security (DHS) and CIP; it is a matter of how much gets spent on pork and how much actually goes to fixing the problems. About 2,100 key installations have been identified as needing protection. They should get some remediation over the next 5 years, and the military should get less complacent and return closer to Cold War levels of preparation.

FURTHER READING
The EMP study (which may be making the issue seem bigger than it is and guiding more money than is needed at the problem) suggests:

The report's cost recommendations for decent levels of hardening of key aspects of the electrical grid and generation systems come to less than $3 billion; the per-item ranges below are tallied in the sketch after the list.

Only the costs for some of the larger or more system-specific initiatives are estimated here (in 2007 dollars).
- There are several thousand major transformers and other high-value components on the transmission grid. Protective relays and sensors for these components number more than that, but less than twice as many. A continual program of replacement and upgrade with EMP-hardened components will substantially reduce the cost attributable uniquely to EMP. Labor for installation is already a part of the industry work force. The estimated cost for add-on and EMP-hardened replacement units and EMP protection schemes is in the range of $250 million to $500 million.
- Approximately 5,000 generating plants of significance will need some form of added protection against EMP, particularly for their control systems. In some instances the fix is quite inexpensive and in others it will require major replacements. The estimated cost is in the range of $100 million to $250 million.
- The cost of adding nonsynchronous interfaces to create subregion islands is not known with reasonable certainty, but it might be on the order of $100 million to $150 million per island. The pace of creating islands and their priority will be established by DHS in consultation with NERC and FERC. Moving to at least six fairly rapidly is a fair assumption. There will be annual operating costs of around $5 million per island.
- The simulation and training centers are assumed at three — one for each interconnect — for a cost in the range of $100 million to $250 million plus annual operating costs of around $25 million per year.
- Protection of controls for emergency power supplies should not be too expensive since hard-wired manual start and run capability should be in place for many, which is adequate. Furthermore, the test, adjust, and verification will be carried out by the entity that owns the emergency power supply as part of normal operating procedures. Retrofit of protective devices such as filters might be accomplished at a cost of less than $30,000 per generator for newer generators with vulnerable electronic controls. Hardening the connection to the rest of the facility power system requires a protected internal distribution system from the backup generator.
- Switchable ground resistors for high-value transformers are estimated to cost in the range of $75 million to $150 million.
- The addition of new black start generation with system integration and protected controls is estimated to cost around $12 million per installation. Probably no more than 150 such installations will need to be added throughout the United States and Canadian provinces. Adding dual fuel capability to natural gas-fired generation is done for the economic purpose of the owner, yet it has the same value as the addition of black start generation. The addition of fuel storage for the existing black start units is relatively small, about $1 million each.
- The addition of emergency generation at the multitude of sites including fuel and transportation sites is probably around $2 million to $5 million each.
- The cost for monitoring, on a continuous basis, the state of the electric infrastructure, its topology, and key elements plus for assessing the actual EMP vulnerability, validation of mitigation and protection, maintenance, and surveillance data for the system at large cannot be estimated since it falls under many existing government-funded activities, but in any event, it is not considered significant.
- Research and development activities are a level-of-effort funding that needs to be decided by DHS. Redirection of existing funding is also likely to occur.
- Funding for the initiatives above is to be divided between industry and government. Government is responsible for those activities that relate directly and uniquely to the purpose of assuring continuation of the necessary functioning of U.S. society in the face of an EMP attack or other broadly targeted physical or information systems attack. Industry is responsible for all other activities including reliability, efficiency and commercial interests. Industry is also the best source for advice on cost effective implementation of the initiatives.
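As a sanity check on the headline figure, here is a quick tally of the per-item ranges above. Where the report gives only counts, I assume six islands and the full 150 black start installations; per-unit items without counts (emergency generators, fuel and transport sites) are left out, so this is a sketch, not the report's own total:

```python
# Low/high one-time cost estimates in millions of 2007 dollars, from the list above.
items = {
    "transformer relays/sensors":           (250, 500),
    "generating plant controls":            (100, 250),
    "subregion islands (6 x $100-150M)":    (6 * 100, 6 * 150),    # assumed six islands
    "simulation/training centers":          (100, 250),
    "switchable ground resistors":          (75, 150),
    "black start generation (150 x $12M)":  (150 * 12, 150 * 12),  # "no more than 150"
    "black start fuel storage (150 x $1M)": (150 * 1, 150 * 1),
}
low = sum(lo for lo, hi in items.values())
high = sum(hi for lo, hi in items.values())
print(f"Tallied one-time costs: ${low/1000:.1f}B to ${high/1000:.1f}B")
# Roughly $3.1B to $4.0B for the quantified items, i.e. the same order of
# magnitude as the "less than $3 billion" headline for core grid hardening.
```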

No total is quoted for the other sectors, but roughly $1 billion each would go to preventative hardening of key water, food and transportation infrastructure.

The unclassified DOD report on the Starfish Prime test

1964 NASA report on high-altitude EMP

Personal Protection Steps Against EMP

What can the everyday civilian do to protect themselves against the possibility of an EMP attack?

1. Have a lot of battery-operated devices on hand and the batteries to use them. Further, these appliances should have cords and antennas 30 inches or less in length. The reason for this is simple: metal pulls in EMP and makes it more dangerous, so less metal is good. Further, keep these appliances away from metal.

2. Stay 8 feet away from large-scale metal fixtures yourself. When EMP is concentrated by metal, it can actually be dangerous to people in and of itself.

3. Harden your equipment (another way of saying, protect it from EMP). Some considerations include the use of tree formation circuits (not standard loop formations), induction shielding around components, self-contained battery packs, loop antennas, and Zener diodes. In addition, grounding wires for each separate instrument into a system could help as well.

4. A new device called the Ovonic Threshold Device (Energy Conversion Devices of Troy, MI) is a solid state switch that opens a path to ground when a massive surge of EMP is encountered by a circuit. This would help in a big way.

5. Use a Faraday Box to store equipment in. Makeshift Faraday boxes can be made from metal filing cabinets, ammunition containers, and cake boxes. That said, the device you are protecting must not touch the metal container (use insulation: paper, cardboard, whatever). Further, there can be no holes. Last, if the box seems less than adequate, you may wrap it in aluminum foil for more protection.

6. Wrap your rooms in aluminum foil. Well, it's certainly extreme, but thought it worth mentioning. After you do so, cover it with some type of fake wood, etc.
[Some drywall boards have a metal sheet, so select such boards when remodelling]

7. Cars are already a metal box. Thus, most of them would survive. That said, gas would be a problem. So have a lot of that and food on hand (remember that refrigerators and water sanitizing devices would go out).

Only the EMP from a near-hit surface burst can cause trouble for a hardened silo. A 10 megaton blast can generate lightning out to a 1 km distance, with a 156 kA peak current and a 56 kA secondary peak.

The peak electric field from Starfish Prime (a 1.4 megaton blast) at Honolulu would have been 5.6 kV/m, with an energy density of 0.01 J/m**2. [This is 10% of the worst-case field, so there is high variability in the EMP effect.]

The US would be able to launch nukes from silos, bombers and submarines after any EMP attack.

August 20, 2008

Stem cell blood supply breakthrough and other stem cell breakthroughs

Human blood has been grown from embryonic stem cells for the first time, in research that promises an almost limitless supply suitable for transfusion into any patient. This could lead to trials of the blood within two years, and ultimately to an alternative to donations that would transform medicine.

If such blood was made from stem cells of the O negative blood type, which is compatible with every blood group but is often in short supply, it could be given safely to anybody who needs a transfusion. Stem-cell-derived blood would also eliminate the risk of transmitting the pathogens that cause hepatitis, HIV and Creutzfeldt-Jakob disease (CJD) through transfusions.

In other stem cell news, Massachusetts General Hospital (MGH) investigators have found that infusions of a particular bone marrow stem cell appeared to protect gastrointestinal tissue from autoimmune attack in a mouse model.

A method of growing embryonic human stem cells in the lab that uses no animal-derived materials has been developed – an important advance in the use of hESCs for future medical purposes.

Menstrual stem cells were used to rejuvenate damaged limbs and prevent the need for amputation.

The research also has more immediate clinical promise for efforts to turn embryonic stem cells into other types of tissue, to treat conditions such as diabetes and Parkinson’s.

One of the biggest safety hurdles that must be cleared before stem-cell therapies enter clinical trials is the risk of uncontrolled cell growth causing cancer. Red blood cells, however, do not have nuclei that carry the genetic material that goes wrong in cancer, and thus should not present this danger. “This could be one of the biggest breaks for the early clinical application of embryonic stem cells,” Dr Lanza said. “There is still work to be done, but we could certainly be studying these cells clinically within the next year or two.”

Conservative biofuel production forecast to 2017


Biofuel projection from now until 2017 from the Food and Agricultural Policy Research Institute (FAPRI) 2008 Agricultural Outlook

The USA and Brazil produce about 70% of the world's biofuel.

This projection is based on conventional production only: no cellulosic ethanol, no biofuel from waste, and no algae biofuel. Therefore, it seems likely to be a vast underestimate of the actual amount of biofuel.

IEA oil market report

The August 12th IEA oil market report shows global oil supply increased by 890 Kb/d in July to 87.8 Mb/d. Norway, Canada, Argentina and Brazil underpinned non-OPEC growth of 520 Kb/d, amid a lull in seasonal maintenance elsewhere. Growth in non-OPEC output now averages 455 Kb/d for 2008 and 665 Kb/d for 2009, after 425 Kb/d in 2007.

OPEC July crude supplies rose by 145 Kb/d to 32.8 Mb/d. Nigeria, Saudi Arabia and Iran all saw higher output, although over 0.5 Mb/d remains shuttered in Nigeria. Effective OPEC spare capacity is 1.5 Mb/d, but should rise by end-2008 and through 2009.

Global oil product demand for 2008 remains unchanged at 86.9 Mb/d (+0.9% or 0.8 Mb/d versus 2007). Demand in 2009 is nudged up 70 Kb/d to 87.8 Mb/d (+1.1% or 0.9 Mb/d versus 2008). Growth is driven by projected non-OECD demand, largely unchanged at 38.3 Mb/d in 2008 and 39.7 Mb/d in 2009.

Japanese Sake Brewer makes Cellulose Ethanol Breakthrough

Major Japanese sake manufacturer Gekkeikan announced on March 28, 2008, that it has developed a new technology for producing bioethanol, which is attracting keen interest as a replacement for fossil fuels. Using "super yeast" -- sake yeast genetically modified with koji mold genes -- the innovative technology can directly produce ethanol from inedible plant materials such as paddy straw and chaff.

Plant cellulose, a raw material for bioethanol, has a chemically stable, robust structure. The new technology pretreats plants to weaken their rigid structure and make them ready for fermentation using a "subcritical water treatment." The pretreatment process uses water at high temperature and pressure in a subcritical state, and therefore it is safer and has less environmental impact than existing methods that use chemical agents.

Super yeast that produces alcohol is created by integrating koji mold genes that produce cellulolytic enzymes into sake yeast so that the enzymes are densely displayed on the surfaces of the yeast cells, much like the hands of a multi-armed deity. Since super yeast has the functions of koji mold, it achieves one-step production of ethanol from pretreated cellulose.

FURTHER READING
Cellulosic ethanol at wikipedia

According to U.S. Department of Energy studies conducted by the Argonne Laboratories of the University of Chicago, one of the benefits of cellulosic ethanol is that it reduces greenhouse gas emissions (GHG) by 85% over reformulated gasoline. Switchgrass and Miscanthus are the major biomass materials being studied today, due to high levels of cellulose. Cellulose, however, is contained in nearly every natural, free-growing plant, tree, and bush, in meadows, forests, and fields all over the world without agricultural effort or cost needed to make it grow.

The U.S. could potentially produce 1.3 billion dry tons of cellulosic biomass per year, which has the energy content of four billion barrels of crude oil. This translates to 65% of American oil consumption.
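A back-of-envelope check of that equivalence, using energy densities I am assuming rather than figures from the article:

```python
# Back-of-envelope check of the biomass-to-oil equivalence claim.
# Assumed energy densities (mine, not from the article):
GJ_PER_DRY_TON = 17.0   # typical lignocellulosic biomass, lower heating value
GJ_PER_BARREL = 6.1     # crude oil

biomass_tons = 1.3e9
barrels_equivalent = biomass_tons * GJ_PER_DRY_TON / GJ_PER_BARREL
print(f"{barrels_equivalent / 1e9:.1f} billion barrels of oil equivalent")
# ~3.6 billion barrels, consistent with the "four billion barrels" figure.
```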

Interstellar space travel prediction needs comparable advances in all areas

There has been recent discussion of a possible way to travel faster than light using dark energy, but moving a 1000 cubic meter object this way would take 10**45 joules (on the order of converting Jupiter entirely into energy).

However, it makes no sense to assume the ability to convert a planetary mass into energy without also assuming vastly increased control of technology and information and a vastly larger economy. It is like assuming a group of cavemen get the designs for a supersonic plane but have only the economy of their tribe of six to fund it. The assumption would also be that they need to transport their rock caves and the woolly mammoth and buffalo herds that they hunt.

A civilization able to convert Jupiter into energy would be near the physical limits of computing and all other technologies. Approaching the ultimate limits of computing means slow, ballistically cooled atomic computronium (with a theoretical 1 m/s coolant flow) supporting a flux of 10**26 bits/s per square centimeter: 100 trillion trillion operations per second in a sugar cube. Full-blown diamondoid nanotechnology would be developed; experiments toward it were recently funded.

Having a smaller cube for the faster-than-light system would also reduce the power requirements by one million to one billion times (compared with 1000 cubic meters).
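For scale, here is E = mc² applied to Jupiter, plus the linear energy-with-volume scaling that the paragraph above implies (the linear scaling is my reading of the proposal, not an established result):

```python
# Mass-energy of Jupiter, and the volume scaling suggested above.
C = 2.998e8              # speed of light, m/s
JUPITER_MASS_KG = 1.9e27

energy_joules = JUPITER_MASS_KG * C**2
print(f"Jupiter converted entirely to energy: {energy_joules:.1e} J")
# ~1.7e44 J, within an order of magnitude of the quoted 10**45 J.

# If the energy requirement scales linearly with transported volume
# (my assumption), shrinking the craft pays off directly:
for volume_m3 in (1000, 1, 1e-3):
    print(f"{volume_m3:>7} m^3 -> {1e45 * volume_m3 / 1000:.1e} J")
```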

Such a civilization would also have engineering of the Casimir force and control of virtual particles in the vacuum.

A recent antimatter-powered rocket design by Robert Frisbee would still take nearly 40 years to travel the 4.3 light years to Earth's nearest stellar neighbor, Alpha Centauri.

Frisbee presented a theoretical design for a ship using antimatter to propel its way to nearby stars. Frisbee's design calls for a long, needle-like spaceship with each component stacked in line to keep radiation from the engines from harming sensitive equipment or people.

At the rocket end, a large superconducting magnet would direct the stream of particles created by annihilating hydrogen and antihydrogen. A regular nozzle could not be used, even if made of exotic materials, because it could not withstand exposure to the high-energy particles, Frisbee said. A heavy shield would protect the rest of the ship from the radiation produced by the reaction.

A large radiator would be placed next in line to dissipate all the heat produced by the engine, followed by the storage compartments for the hydrogen and antihydrogen. Because antihydrogen would be annihilated if it touched the walls of any vessel, Frisbee's design stores the two components as ice at one degree above absolute zero.

The systems needed to run the spacecraft come after the propellant tanks, followed by the payload. In its entirety, the spaceship would resemble a large needle massing 80 million metric tons with another 40 million metric tons each of hydrogen and antihydrogen. In contrast, the Space Shuttle weighs in at a mere 2,000 metric tons.
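Those mass numbers can be sanity-checked with the classical rocket equation. The sketch below assumes an effective exhaust velocity of about a third of light speed, a figure often quoted for beamed-core antimatter engines but not given in the article:

```python
import math

# Classical Tsiolkovsky check of Frisbee's mass numbers (ignores
# relativistic corrections).
C = 2.998e8                # speed of light, m/s
DRY_MASS = 80e6            # ship, metric tons
PROPELLANT = 40e6 + 40e6   # hydrogen + antihydrogen, metric tons
V_EXHAUST = 0.33 * C       # assumed effective exhaust velocity

delta_v = V_EXHAUST * math.log((DRY_MASS + PROPELLANT) / DRY_MASS)
print(f"delta-v ~ {delta_v / C:.2f} c")
# ~0.23 c. Spent accelerating and then decelerating, the average cruise
# speed is much lower; ~0.1 c average covers 4.3 light years in ~40 years.
```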


Even for a civilization with merely mature antimatter propulsion technology, the lifespan of humans would have been drastically extended by then. Forty years on a spaceship would be nothing to beings who do not age, especially if it was artificial intelligences operating the computronium.

There are also better and faster space propulsion designs using beamed power: photonic laser propulsion uses laser arrays and mirrors to bounce the beam and economize on power.

Gasoline produced from biomass in cars by 2010, 2.5 billion gallons/year by 2022

Byogy has licensed the Texas A&M University process and hopes to have a plant using the technology up and running within 18 months to two years. The intent is to have raw garbage going in one end of the plant and 95-octane gasoline coming out the other.

"Our goal with this technology is to achieve as much as a 2 percent contribution to the nation¿s gasoline demand by 2022 through the building of 200 more bio-refineries," said Benjamin J. Brant, President and Chief Technology Officer of Byogy. "We firmly believe the TEES technology combined with the Byogy team offers this possibility."

The focus at the initial plant would be on using urban waste, which the plant would grind, sort and then convert into gasoline. The fuel produced by this process could immediately be used as a drop-in substitute to the current petroleum gasoline supplies with a seamless integration into the existing fuel distribution infrastructure. Nothing needs to be changed at retail gas stations, pipelines, regional fuel terminals or in any motor vehicle.


"Our plan is to produce two-and-a-half billion gallons or more of carbon neutral renewable gasoline per year, said Daniel L. Rudnick, Chief Executive Officer of Byogy. We are positioning ourselves not only to handle the opportunity biomass waste streams that are available today, but also the sustainable biomass energy crops of the future. This green substitute for conventional gasoline is the Holy Grail of all biofuels."

FURTHER READING
Byogy website

Feeding algae carbon dioxide and organic material could boost the oil yield 40 times

Most previous and current research on algae biofuel has used the algae in a manner similar to its natural state — essentially letting it grow in water with just the naturally occurring inputs of atmospheric carbon dioxide and sunlight. This approach results in a rather low yield of oil — about 1 percent by weight of the algae.


The University of Virginia team hypothesizes that feeding the algae more carbon dioxide and organic material could boost the oil yield to as much as 40 percent by weight.


Proving that the algae can thrive with increased inputs of either carbon dioxide or untreated sewage solids will confirm its industrial ecology possibilities — to help with wastewater treatment, where dealing with solids is one of the most expensive challenges, or to reduce emissions of carbon dioxide, such as coal power-plant flue gas, which contains about 10 to 30 times as much carbon dioxide as normal air.

Algae are tiny biological factories that use photosynthesis to transform carbon dioxide and sunlight into energy so efficiently that they can double their weight several times a day.

As part of the photosynthesis process algae produce oil and can generate 15 times more oil per acre than other plants used for biofuels, such as corn and switchgrass. Algae can grow in salt water, freshwater or even contaminated water, at sea or in ponds, and on land not suitable for food production.

On top of those advantages, algae — at least in theory — should grow even better when fed extra carbon dioxide (the main greenhouse gas) and organic material like sewage. If so, algae could produce biofuel while cleaning up other problems.

August 18, 2008

Detailed analysis of salamander regeneration

Technology Review discusses the work of Gerald Pao, a postdoctoral researcher at the Salk Institute for Biological Studies, to perform a detailed genetic analysis of the axolotl salamander's DNA and the precise molecular processes of successful regeneration.

Pao and his collaborators won one billion bases' worth of free sequencing from Roche Applied Science, based in Indianapolis. Now that the data is in, scientists can finally begin the hunt for the genetic program that endows the animal with its unique capabilities.

Researchers hope that by uncovering these molecular tricks, they can ultimately apply them to humans to regrow damaged heart or brain tissue, and maybe even grow new limbs.

In order to quickly identify sections of the salamander's genome involved in regeneration, the scientists sequenced genes that were most highly expressed during limb-bud formation and growth. They found that at least 10,000 genes were transcribed during regeneration. Approximately 9,000 of those seem to have related human versions, but there appear to be a few thousand more that don't resemble known genes. "We think many of them are genes that evolved uniquely in salamanders to help with this process," says Randal Voss, a biologist at the University of Kentucky, who is working on the project.


The researchers now plan to make a gene chip designed to detect levels of some of these candidate genes, so that the scientists can determine at exactly what point during the regeneration process the genes are turned on. The team is also developing molecular tools that allow them to silence specific genes, which will enable them to pinpoint those that are crucial for proper regrowth.

Scientists also sequenced random chunks of the salamander genome. At about 30 billion bases and 10 times the size of the human genome, it is one of the largest among vertebrates. Most scientists expected that the extra DNA would be made up of junk DNA, long stretches of bases between genes. But initial findings were surprising. "Genes are on average 5 to 10 times larger than those in other vertebrates," says Voss. "The region of the genome containing genes is estimated to be more than two gigabases, which is as big as some entire genomes."

"If we come up with some totally unique gene only present in axolotl, that would make it really hard to replicate," says David Gardiner, a biologist at the University of California, Irvine, who is also collaborating on the project. He prefers to think that regeneration comes from a fundamental abilitylying dormant in mammals, which could be reawakened with some simple genetic prodding."Most of the tissue in our arm regenerates; it's just the arm that doesn't regenerate," he says. "What's missing is how you coordinate a response to get an integrated structure."

Japan's large scale uranium from seawater and superconducting wire plans

Japan considering using gene-engineered seaweed to get millions of tons of uranium
The Mitsubishi Research Institute (MRI) has recently recommended Japan mass-culture seaweed to collect natural resources such as bio-ethanol and uranium. In the “Apollo and Poseidon Initiative 2025,” MRI suggests that Japan cultures gulfweed, which can grow more than 2 metres high a year in the sea. The plants could also absorb carbon dioxide and purify the seawater, and can be used as non-food alternative energy sources for bio-ethanol. In April, MRI plans to inaugurate a consortium comprising public research institutes and manufacturers to move the plan forward. Using advanced molecular and gene-engineering technologies, MRI estimates that Japan would be capable of producing 65 million metric tons of gulfweed a year, and recovering a resource of 195 million tons of uranium. The annual rate of recovery is 40% of Japan’s total consumption. (19 February 2008, Nikkan Kogyo Shimbun)


The last part of the quoted paragraph is somewhat confusing, as noted in the comments. Using polymers, the total amount of uranium recovered from three collection boxes containing 350 kg of fabric was more than 1 kg of yellowcake after 240 days of submersion in the ocean. So 65 million tons of seaweed might collect on the order of 195,000 tons/year of uranium at a comparable efficiency, not 195 million.
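The scaling arithmetic behind that corrected figure, assuming seaweed collects uranium about as efficiently per kilogram as the adsorbent fabric (the post's stated premise):

```python
# Scale the polymer-fabric trial up to 65 million tons of collector per year.
fabric_kg = 350           # adsorbent fabric in the three collection boxes
yellowcake_kg = 1.0       # ">1 kg" recovered per submersion
days_submerged = 240

kg_u_per_kg_collector_per_year = (yellowcake_kg / fabric_kg) * (365 / days_submerged)
seaweed_kg = 65e6 * 1000  # 65 million metric tons of gulfweed
tons_per_year = kg_u_per_kg_collector_per_year * seaweed_kg / 1000
print(f"~{tons_per_year:,.0f} tons of uranium per year")
# ~280,000 tons/year at equal efficiency; the 195,000 tons/year figure
# implies a somewhat lower effective efficiency for seaweed.
```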

Abstract of the polymer recovery work.

The secondary alternative is that 195 million tons of uranium is recoverable eventually based on the Japanese plans of accessing the 4.6 billion tons in the Oceans. They are probably planning to tap an ocean current off of Japan. Japan uses 7589 tons of Uranium per year now. 40% of Japan's consumption would be about 3000 tons of Uranium per year. So there is the reserve amount recoverable and the annual production based on the planned scale of the initial operation.

Note: Current conventional uranium reserves are 5.5 million tons. There is another 10 million tons expected to be developed in the same geological formations. There is 22 million tons of Uranium in phosphate reserves. So 195 million tons would be a lot, but it would only be part of the 4.6 billion tons in the oceans. 65,000 tons per year of uranium are used worldwide now.

This is related to the article on this site explaining that uranium from seawater and breeder reactors would enable uranium to power nuclear reactors for tens of thousands to billions of years, depending upon the rate of usage.

Demonstration of superconducting cables at a substation in 2010
Sumitomo Electric Industries and Tokyo Electric Power will test superconducting cables connecting to the power system at a substation for a year in autumn 2010. They will demonstrate high-temperature superconducting cables that are cooled by liquefied nitrogen at 196°C below zero. This technology costs less than low-temperature cables that need coolant of minus 269°C. It is expected that superconducting cables will be put into practical use around 2020 and cut power transmission costs by 40% in the future. (14 February 2008, Nikkan Kogyo Shimbun)



Sumitomo Electric Industries can make 15 meters per hour of superconducting wire, above the 10 meters per hour needed for practical commercial applications.

Mass-producing carbon fibre components for automobiles
Japanese carbon fibre producers will start mass-production of automotive components as early as 2010. Carbon fibre resins are ten times stronger and four times lighter (but more expensive) than steel products. These companies, including Toray, Teijin and Mitsubishi Chemical, expect to close the cost gap through mass-production and automotive companies' growing need to respond to tighter environmental regulations in industrialised nations. It is said that these advanced materials can make vehicles 10% lighter and as a result improve fuel efficiency by 4-5% when applied to major components. (29 February 2008, Nikkei Shimbun)


This is an intermediate step to reducing vehicle weight by 40% with carbon fibre and increasing fuel efficiency by 30%.

Reduction in waste concrete from nuclear power plants
The Japan Atomic Energy Agency (JAEA) and general contractor Kumagai Gumi have jointly developed low-cost radiation-shielding concrete for nuclear power plants. The concrete has a structure of three layers: low neutron activation concrete; concrete containing boron, which is capable of absorbing neutrons; and ordinary concrete. The developed concrete uses half as much boron as conventional neutron-blocking concrete does, and as a result its estimated cost is 10% to 50% of that of conventional ones. Moreover, at the end of the lifetime of a plant, the amount of radioactive concrete will be halved, since only the first and second layers are radioactive. [Therefore, the decommissioning costs would be lower.] (8 July 2008, Nikkei Sangyo Shimbun)


Low-cost separation membrane for hydrogen production
Nippon Seisen has developed a membrane capable of separating high-purity hydrogen from natural gas. The 15μm-thick palladium alloy membrane produces hydrogen of over 99.9999% purity, requiring no mechanical devices to remove impurities. Its cost for hydrogen production is 25% of that of conventional technologies. They will market it for hydrogen production for residential fuel cells and fuel cell vehicles as early as 2009. (9 July 2008, Nikkei Sangyo Shimbun)


Matsushita attempts to commercialise large scale organic EL TVs
Matsushita will establish a prototype production line and develop mass-production technology aimed at commercialising 40-inch TVs. Since Matsushita signed the license agreement with the US company Kennedy Display Technology, it has pursued development at the Semiconductor Research Laboratory in Kyoto. Matsushita will increase the number of engineers committed to the development effort to 200 and also recruit experts on organic EL from outside. (29 July 2008, Nikkei Shimbun)

Starting mass-production of bio-plastics
Mitsubishi Chemical will embark on mass-production of synthetic resins using plants as a raw material. The company plans to use sugar from potatoes for biodegradable plastic and plant-origin starch for polycarbonate. For the biodegradable plastic, they will build a plant capable of producing 10,000t per year in 2010 and expand it to a 100,000t scale as early as 2015. As to polycarbonate, they will build a test plant as early as 2009 and investigate the feasibility of commercialisation in 2010 or later. (18 July 2008, Nikkei Shimbun)


FURTHER READING
More reports of Japanese technology from the British Embassy in Japan

Poll results: which solutions are cared about most


The last poll results are here

The top results were:

Improve humanity (safe singularity) 59 out of 284 (21%)
Effective anti-aging treatment 56 out of 284 (20%)
Stronger economy (triple growth or more) 55 out of 284 (19%)
Peak Oil solutions 32 out of 284 (11%)
Reduce CO2 from transportation and energy 20 out of 284 (7%)
Cure disease (cancer, heart disease etc) 17 out of 284 (6%)
Poverty reduction 17 out of 284 (6%)
Other in comments 13 out of 284 (5%)
Store CO2 in Cement or sequester 7 out of 284 (2%)
Prevent nuclear weapon usage 3 out of 284 (1%)
Starvation prevention 3 out of 284 (1%)
Reduce nuclear weapon damage 2 out of 284 (1%)

August 17, 2008

Further improvement of buildings for more resistance to nuclear bomb effects

This is a follow-up to a prior article about re-inventing civil defense using simple and affordable defenses for residential buildings, such as better nails (HurriQuake nails, which you can buy from amazon.com).

This is not a plan to make buildings nuclear blast proof, but a lot more blast resistant. A direct hit would be too tough to build against, and the result would be like a command bunker. A direct hit has the fireball to deal with and a lot more localized destructive force. However, farther away there are destructive forces that can be relatively easy to resist with improved construction.

The 5 PSI level is just better nails. The building looks the same in every other way. Existing steel-reinforced buildings can already resist up to 10 PSI.

The good thing about building walls staying up is that no natural gas lines are cut so that there are a lot fewer fires.

First survive the blast and other immediate effects, so that you are in the best shape possible afterwards. It is a lot tougher to survive if you have to dig out from a collapsed building or have been injured. If the walls stay up, then those walls also help protect against the heat and the radiation. It is better to have the walls take some of the hit instead of your clothes and skin. If you have survived the initial explosion, then you are alive to get away from the fallout (move perpendicular to the direction of the wind if the wind is blowing towards you from the blast).


The blast, thermal and radiation effects can all have their lethality reduced with better walls. Radiation lethality can be further reduced with better anti-radiation drugs [James Tour at Rice University is testing drugs 5,000 times better].

There is analysis of the deaths from the different causes (heat, ionizing radiation, fallout, blast, and secondary effects like fires) at different distances. Blast overpressure reaches out the farthest for smaller bombs, and thermal effects reach the farthest for larger bombs.
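A rough way to see why is through scaling laws: the distance to a given blast overpressure grows roughly as the cube root of yield, while the distance to a given thermal fluence grows roughly as the square root of yield (inverse-square spreading), so thermal range eventually overtakes blast range. A minimal Python sketch; the reference radii are illustrative assumptions, not sourced effects data:

def blast_radius_km(yield_kt, ref_radius_km=1.8, ref_yield_kt=20):
    # Distance to a fixed overpressure scales ~ yield^(1/3) (cube-root scaling).
    return ref_radius_km * (yield_kt / ref_yield_kt) ** (1.0 / 3.0)

def thermal_radius_km(yield_kt, ref_radius_km=1.5, ref_yield_kt=20):
    # Distance to a fixed thermal fluence scales ~ yield^(1/2)
    # (inverse-square spreading, atmospheric attenuation ignored).
    return ref_radius_km * (yield_kt / ref_yield_kt) ** 0.5

for kt in (20, 100, 500, 1000, 5000):
    b, t = blast_radius_km(kt), thermal_radius_km(kt)
    print(f"{kt:>5} kt: blast ~{b:.1f} km, thermal ~{t:.1f} km")

With these assumed reference points the crossover falls near 60 kilotons; blast reaches farther below that yield and thermal farther above it.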


Note: 500 kilotons is roughly the maximum for pure nuclear fission bombs. Megaton bombs are nuclear fusion bombs that are triggered by nuclear fission explosions. Getting up to the more powerful bombs involves more research and testing of designs, and countries or groups that test bomb designs make that effort obvious on seismometers. (There would then be no sneak-attack scenario, because it would be known that another group had advanced bombs.)

Many hospitals would need to be rebuilt more bunker-like, as strong monolithic domes. There is some price to pay in aesthetics for disaster preparation.

Doors and Windows
Even if the doors and windows are weaker than the walls and get blown in, that is still better than having the walls fail along with them. And doors and windows can themselves be made more blast resistant.

A 50 PSI door is here: a steel door with concrete poured into it. A somewhat cheaper version of the 50 PSI door could be built with layers of cellulosic nanopaper (wood handled during processing so that the fibers are not damaged), which is almost as strong as steel, filled with iCrete (the 14,000 PSI concrete now used in New York).

There are 20 PSI-resistant doors and windows that do not look oppressive, and 10 PSI-resistant windows can be transparent.




Currently, high-rise buildings can resist 10-15 PSI, so the better steel and concrete already being used would help. iCrete is not the strongest concrete; adding quartz and steel aggregate can increase strength by three times.

Some glass can resist 40 PSI, and polymers can help walls resist 80 PSI.

Thin films of polycarbonate laminated onto glass, for example, will keep shattered glass in one cohesive (though shattered) piece. An alternative is thermally tempered glass (TTG), which can protect against pressures up to about 40 psi. TTG, also used in automobile windows, fractures into rock-salt-size pieces that are not as dangerous to building occupants.

An elastomeric polymer has been tested on an eight-foot by eight-foot concrete wall, which was sprayed inside and out with the polymer and then subjected to 80+ psi blast pressures. The wall experienced severe fracturing but remained in place with no fragmentation. A follow-on activity identified additional polymers (e.g., polyurea) that may have better qualities for decreasing wall deflections. The testing continued to show promising results, with stand-off distances reduced by as much as 50% compared to non-sprayed lightweight structures. Furthermore, polymer foams can be inserted inside walls to act as energy absorbers, reducing the severity of a blast inside a structure.


Retrofits of existing buildings and structures, and new construction, resistant to 40-80 PSI are possible with a little research and development. The R&D would not be so much to make it possible, since the basic principles already exist, as mentioned above; it would be to make large-scale deployment cheaper and to perform the computer simulations that ensure the modifications have the desired results. 40-80 PSI resistance would make the destructive blast radius over ten times smaller than with 5 PSI construction.
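As a rough sanity check on that radius claim: if peak overpressure falls off with distance approximately as a power law p ~ r^(-n), then the damage radius at a given resistance threshold scales as r ~ p^(-1/n). A minimal sketch where the exponent n is an assumption (real blast curves steepen near the burst and flatten far out); the 'over ten times' figure corresponds to the flatter, far-field end of the range:

def damage_radius_factor(p_low_psi, p_high_psi, n):
    # If overpressure decays as p ~ r^(-n), hardening construction from
    # p_low to p_high shrinks the damage radius by (p_high / p_low)^(1/n).
    return (p_high_psi / p_low_psi) ** (1.0 / n)

for n in (1.0, 1.5, 2.0):  # assumed decay exponents
    k = damage_radius_factor(5, 80, n)  # 5 PSI construction vs 80 PSI hardening
    print(f"n={n}: radius {k:.1f}x smaller, damaged area {k * k:.0f}x smaller")

Since the damaged area falls with the square of the radius factor, even the conservative end of this assumed range cuts the heavily damaged area by more than an order of magnitude.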

I am talking about saving lives and resisting damage in the outer areas of blasts, then working inwards as more effort is made. The technology is here to make all buildings (homes and office buildings) a lot more blast resistant while staying affordable and not greatly altering appearance and aesthetics. People can choose not to do it, just as some people choose not to fasten seatbelts in a car or choose to drive older cars without airbags. Cars were made more accident-survivable; making events that were not survivable more survivable is a good thing.

The Opposite of Preparation and the Mindshift in Thinking on Nuclear Bombs
Many people have two reactions to the idea of surviving a nuclear bomb or nuclear war.

1. Some would prefer not to survive the initial blast.
2. Some consider it an affront to try to make nuclear war survivable, and believe the only acceptable strategy is to avoid all nuclear war and to let the devastation be maximized.

Making a house more survivable is an option. If someone does not want to survive a nuclear blast, that is an easy task: get outside at the first indication that a nuclear blast is happening.

If you do not want to survive: do not use seatbelts or airbags in your car, so that you have less chance of surviving your car accidents. Your house should be a lean-to shelter, so that the supports are easily knocked out.

Earthquake preparedness recommends keeping some stockpiles of bottled water and canned food. If you do not want to survive, you will want to make sure that you do not have any of that around.

The only way some people would want to survive is if the government sent someone to rescue them; then, if it did not, they could happily complain that it was like Katrina. In that mindset you would not want to make anything easier for potential rescuers: you would want to blame them for not helping unprepared people who wanted to die in the initial blast but unfortunately survived.

Buildings have not yet been made to the standard suggested here. If they were, nails would be just the first step: further reinforcement is possible, and the better anti-radiation drugs now being tested should be developed and distributed. 40-80 PSI-resistant commercial and office buildings would mean one tenth the radius of major damage, and a circle with one tenth the radius covers one hundredth the area. So if the plan were followed and buildings were reinforced to that standard within ten years, the hospitals would still be standing and the deaths inside buildings would be in the hundreds instead of the tens of thousands. Those who do not want to survive will be happy to know that there would still be more deaths among people caught outside during the blast.

For those who feel that nuclear war should not be survivable, as part of a strategy to maximize nuclear war prevention: we could instead set up a system of universal, electronically activated kill switches to maximize casualties in the event of any violence or conflict. Then there would be the gamble of everyone surviving without war or everyone dying. Make war unsurvivable, if that really is the better strategy.

Electromagnetic pulse
The electromagnetic pulse has different ranges based on the size of the bomb and on the altitude of the burst. A 20 megaton bomb detonated at high altitude could affect the entire United States.
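The footprint is largely a line-of-sight effect: the pulse covers roughly everything the burst point can "see" to the horizon, so the ground radius is set mainly by burst altitude, while yield mainly affects the field strength within that footprint. A minimal sketch using standard horizon geometry; the altitudes are illustrative:

import math

EARTH_RADIUS_KM = 6371.0

def emp_footprint_radius_km(burst_altitude_km):
    # Ground distance from ground zero to the line-of-sight horizon
    # as seen from the burst point: d = R * arccos(R / (R + h)).
    r = EARTH_RADIUS_KM
    return r * math.acos(r / (r + burst_altitude_km))

for h_km in (100, 200, 400):
    print(f"burst at {h_km} km altitude -> footprint radius ~{emp_footprint_radius_km(h_km):.0f} km")

A burst a few hundred kilometers up gives a footprint radius of well over 1,000 km, which is how a single high-altitude detonation could blanket the continental United States.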

- Harden your equipment (another way of saying: protect it from EMP). Some considerations include tree-formation circuits (rather than standard loop formations), induction shielding around components, self-contained battery packs, loop antennas, and Zener diodes. In addition, grounding wires from each separate instrument in a system could help as well.

- A new device called the Ovonic Threshold Device (from Energy Conversion Devices of Troy, MI) is a solid-state switch that opens a path to ground when a circuit encounters a massive EMP surge. This would help in a big way.

- Store equipment in a Faraday box. Makeshift Faraday boxes can be made from metal filing cabinets, ammunition containers, and cake boxes. The device you are protecting must not touch the metal container (use insulation: paper, cardboard, whatever), and there can be no holes. If the box seems less than adequate, you can wrap it in aluminum foil for more protection.

- Wrap your rooms in aluminum foil. It is certainly extreme, but worth mentioning. After you do so, cover the foil with some type of fake wood paneling or similar.

- Cars are already a metal box, so most of them would survive. That said, gasoline would be a problem, so have plenty of it and food on hand (remember that refrigerators and water-sanitizing devices would go out).