October 05, 2007

Hillary Clinton on Science

Hillary Clinton has given a speech that outlines her positions on stem cells (lifting research bans), space (more for space science, not for manned Mars and Moon missions), and on having directors of science-related programs ensure that politics is kept out of science decisions.

This is relevant because Hillary Clinton currently leads the Democratic candidates in polling by a wide margin, has raised the most funds of any candidate, Republican or Democrat, and has what is widely acknowledged as the best and most disciplined political machine.

Not really going out on much of a limb: I predict that she will win the US presidency in 2008.

Related to this, a prediction market is indicating a 58-60% chance of a Democrat becoming president in 2008.

At Tradesports on Oct 5, 2007, the contract for Hillary winning the presidency stands at 42.7%. They have Obama at 8.2%, Al Gore at 7.7% and Edwards at 2.6%. If we presume that Hillary is a lock for the Democratic nomination, then she would move to about 61%.
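Those implied odds can be sanity-checked with a quick calculation; this is a sketch using the contract prices quoted above, taking the generic Democrat-wins price of ~60% from the prediction market range:

```python
# Back-of-envelope check of the Tradesports numbers.
# P(Hillary wins presidency) = P(Hillary nominated) * P(Democrat wins | Hillary nominated),
# approximating the conditional with the generic Democrat-wins contract price.
p_hillary_wins = 0.427   # Tradesports contract price
p_democrat_wins = 0.60   # prediction market range, 58-60%

implied_p_nomination = p_hillary_wins / p_democrat_wins
print(f"Implied P(Hillary wins nomination): {implied_p_nomination:.0%}")
# If she were a nomination lock (P = 1), her win-presidency contract
# should converge up to the generic Democrat-wins price of ~60%.
```

So the market is implicitly pricing her nomination odds at roughly 70%, which is why a "nomination lock" assumption lifts her presidency odds toward the ~60% Democrat-wins price.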

I support space development, but I do not mind the fact that she will probably cut the manned Mars and Moon programs. The programs, as they are currently planned, are misguided.

1) The current human Moon and Mars plans look like a lot of money for not much return

The total funding of Project Constellation through 2025, inflation-adjusted and without any other increases to NASA's budget, is estimated at $210 billion; the ESAS estimates the cost of the program through that date as only $7 billion more, at $217 billion.

Project Constellation is a NASA program to create a new generation of spacecraft for human spaceflight, consisting primarily of the Ares I and Ares V launch vehicles, the Orion crew capsule, the Earth Departure Stage and the Lunar Surface Access Module. These spacecraft will be capable of performing a variety of missions, from Space Station resupply to lunar landings. So the $210-217 billion estimate is just the start and does not cover the humans-to-Mars part of the plan.

The International Space Station was also a lot of money ($100+ billion) for very little return, so Space Station resupply is not a useful purpose either.

All of the pieces pretty much duplicate existing rocket capability.

2) NASA should not build and operate the hardware and the missions.

The US government does not build the cars and operate all of the vehicles on the highway system. Government should shoulder the risky development of new technology, like the first nuclear propulsion rockets or super-ion drive tugs, and then license the technology to private industry.

3) We could start paying for lunar and orbital development without this hardware. If we make new space hardware, it should be hardware that really makes a substantial improvement in cost and performance.

As I have noted in my proposed plan to win the Google lunar landing challenge, NASA could use existing rockets to launch large robotic missions to the Moon starting in 2008-2012.

-Build a large lunar lander (lunar orbit to the surface), repeating old tech but scaled up
-Build the Earth orbit to lunar transfer tug
-Build the robots and systems that you want to land on the Moon
-Use an existing rocket to take it to Earth orbit, carrying about 20 tons
-Use a low energy transfer to get it from Earth orbit to the Moon; it takes about 5 months but uses very little fuel, about 10% of the vehicle's weight in Earth orbit
-Use the large lunar lander to deliver 5-15 tons of robots and equipment to the Moon with each $70-200 million trip.

Using about $3 billion per year, there could be six missions to the Moon each year at $500 million per mission: $200 million for the trip and $300 million for the gear and the mission.
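A back-of-envelope sketch of this budget using only the figures above; the per-kilogram delivery price is an illustrative derivation, not a quoted number:

```python
# Sketch of the proposed robotic lunar program budget.
annual_budget = 3.0e9                                  # ~$3 billion per year
missions_per_year = 6
cost_per_mission = annual_budget / missions_per_year   # $500 million each
trip_cost = 200e6                                      # launch + transfer + landing (high end)
payload_cost = cost_per_mission - trip_cost            # gear and mission operations

# A 5-15 ton delivered payload per trip implies a delivery price per kg:
for tons in (5, 15):
    price_per_kg = trip_cost / (tons * 1000)
    print(f"{tons:2d} tons delivered -> ${price_per_kg:,.0f}/kg to the lunar surface")
```

Even at the high-end $200 million trip cost, delivering 15 tons works out to roughly $13,000 per kg on the lunar surface, far below what a Constellation-class architecture would imply.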

The purpose would be to land things to build up what would become permanent infrastructure on the Moon and in Earth and lunar orbit: power generation systems and systems for processing material and building facilities (telescopes, pre-deployed Moon bases).

Yes, Hillary and most of the presidential candidates have talked about nuclear power.

Hillary Clinton is one of eleven co-sponsors of the climate stewardship bill, which the EIA has projected would nearly triple the amount of nuclear energy by 2030. I view this as a good thing, since it would reduce coal usage, and thus air pollution, and save tens of thousands of American lives per year when fully implemented.

I had coverage of the positions of the presidential candidates on nuclear power.

There is a video of Hillary talking about nuclear and energy

Nuclear power, about which Hillary says she is "agnostic," has been neglected for so long in this country (it only supplies 8% of our total energy needs) that it cannot be part of anything but a long-term solution.

Skewing the odds for possible superior genetic results

There has been a fair amount of discussion about the idea of genetically screening 1000 eggs or embryos in order to get genetically gifted populations (1000 Einsteins).

My past cognitive enhancement survey from 2006 covered using drugs, genetics, computers and other methods for cognitive enhancement. It is not all about genes.

Not all of the genes are known, but there is a growing base of knowledge of positive and negative genes for intelligence, and a growing ability to successfully shift the odds of desired genetic results across populations. It is not really genetic determinism but genetics affecting the likelihood of good or bad results.

List of notable human genes

CHRM2 positive correlation with increased intelligence

Recessive alleles at about 325 loci increase mental retardation

Mental retardation genes AGTR2

ZDHHC9 gene found (severe mental retardation)

Intelligence enhancement genes discussed at

Scientists reverse mental retardation in mice (FMR1, FRMP)

Proper training is very important to take advantage of genetic abilities.

Deliberate practice entails more than simply repeating a task — playing a C-minor scale 100 times, for instance, or hitting tennis serves until your shoulder pops out of its socket. Rather, it involves setting specific goals, obtaining immediate feedback and concentrating as much on technique as on outcome.

Their work, compiled in the “Cambridge Handbook of Expertise and Expert Performance,” a 900-page academic book, makes a rather startling assertion: the trait we commonly call talent is highly overrated. Or, put another way, expert performers — whether in memory or surgery, ballet or computer programming — are nearly always made, not born.

The odds can be shifted.
It is like poker. Picking embryos is like picking starting hands based on seeing two out of seven cards (selecting based on the known information in the starting embryo), and then also being given one redraw (genetic engineering after birth).
Also, with epigenetic control or RNA interference, one could inactivate one or two of the cards that are known to be bad. You keep skewing things toward superior hands.

The more we know, the more we can shift the results and increase the odds and occurrence of high intelligence.

If we reduce by half the number of those with IQ below 100 and double those with IQ above 150, then the population affected by the procedures has about three times the percentage occurrence of people with IQ above 150.
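The threefold claim can be checked against a normal IQ distribution; this is a minimal sketch assuming IQ is normally distributed with mean 100 and standard deviation 15:

```python
import math

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mean, sd = 100, 15
p_below_100 = phi((100 - mean) / sd)          # 50% of the population
p_above_150 = 1 - phi((150 - mean) / sd)      # a small right tail (~0.04%)
p_middle = 1 - p_below_100 - p_above_150

# Halve the below-100 group and double the above-150 group:
w_below, w_mid, w_above = 0.5 * p_below_100, p_middle, 2 * p_above_150
total = w_below + w_mid + w_above
new_share_above_150 = w_above / total

print(f"Before: {p_above_150:.4%} above IQ 150")
print(f"After:  {new_share_above_150:.4%} above IQ 150")
print(f"Ratio:  {new_share_above_150 / p_above_150:.2f}x")
```

The shift comes out to about 2.7x, close to the threefold figure: doubling the tail while halving the bottom group both raises the tail count and shrinks the reference population.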

Pro-nuclear ruling in the USA

There has been an interpretation ruling on loan guarantees covering 100% of the loan, up to a maximum of 80% of the total project cost, for carbon-dioxide-reducing energy projects. The loan guarantees make nuclear energy the cheapest option for energy, and pretty much guarantee that all 30 nuclear reactors in the planning stages in the USA will be built.

If a reasonable climate change bill is passed (likely in 2009), then the combination will cause a boom in nuclear power building in the United States. The amount of nuclear power is projected to triple by 2030. 100-150 of the new larger reactors would be the primary source of any new power needs (along with more wind, solar, biomass, conservation and efficiency). The climate change bill will likely ratchet up the cost of coal plants and cause them to be shut down and replaced over a 20-30 year timeframe.

Thermoelectrics for more energy efficiency

Electric bikes and scooters in China and India key for clean global transportation.

Cars that are more fuel efficient than the Toyota Prius

Tracking increased orders for nuclear power around the world

EIA computational analysis of the projected impact of one of the climate bills

Increasing the power output of existing nuclear power plants

October 04, 2007

Human embryonic stem cells remain embryonic because of epigenetic factors

A human embryonic stem cell is reined in – prevented from giving up its unique characteristics of self-renewal and pluripotency – by the presence of a protein modification that stifles any genes that would prematurely instruct the cell to develop into heart or other specialized tissue.
Thanks to the simultaneous presence of different protein modifications, stem cells are primed and poised, ready to develop into specialized body tissue, Singapore scientists reported in last month’s issue of the journal Cell Stem Cell.

The molecules central to this balancing act, H3K4me3 and H3K27me3, are among the so-called epigenetic modifications that influence the activity patterns of genes in both human embryonic stem (ES) cells and mature human adult cells.

His GIS colleague, Wei Chia-Lin, Ph.D., who headed the Singapore research team, said, “This study demonstrates the power of a whole genome and robust sequencing technology, when applied in the epigenetic analysis of ES cells, can reveal features of the genomes that were not previously appreciated. The new knowledge and target candidate genes resulted from such unbiased study are ultimately important for researchers to understand the fundamental nature of stem cell proliferation and differentiation.”

Drs. Wei and Ng and the other researchers used cutting-edge technologies developed at GIS, to sequence, or decipher, the DNA of human ES cells. With the sequence data in hand, the scientists were able to categorize the genes into three groups, each modified by different combinations of the two epigenetic markers.

The researchers discovered that the majority of the regions in the genome harbor active histone marks that act as sign posts and allow cells to quickly find genes “to turn on” or activate them.

Identifying the locations of these genomic signposts will also be crucial for discovering human genes that are important for different functions in ES cells.

Of the two epigenetic markers, H3K4me3 was found to be the more prevalent: the scientists noted that it occurs near the promoter DNA regions of two-thirds of human genes. Of the 17,469 nonredundant unique human genes that the scientists sequenced, 68% contained H3K4me3, and only 10% contained overlapping H3K27me3.

More information about epigenetic modifications:

In living cells, DNA is packaged along with histone proteins, which act as spools around which DNA winds. The histone proteins are decorated with different marks, which can affect the various activities of the modified DNA such as transcription, gene silencing, imprinting and replication. Such marks play key roles in the process of cellular differentiation, allowing cells to maintain different characteristics despite containing the same genomic material. While different cells can have identical genetic DNA sequences, their characteristics and differentiation patterns are influenced by the different marks on the histone proteins. Therefore, histone marks represent an epigenetic marker or code that can be used by the cells to expand their plasticity and complexity.

Superstrength polymers using layer by layer assembly

Superstrong transparent plastic has been made using a superior layering technique.

Before this process, load transfer has been quite ineffective at loadings of more than 5-10%; beyond that, the mechanical properties of regular composites actually start degrading with an increasing portion of inorganic filler. Kotov's findings indicate that organization of the composite and tuning of molecular interactions between clay and polymer can give almost ideal stress transfer at loadings as high as 50-60%. This can be achieved with a technique called layer-by-layer (LBL) assembly, which causes clay sheets to become oriented parallel to the substrate. Kotov explains that the LBL process is based on sequential adsorption of nanometer-thick monolayers of oppositely charged compounds (e.g. polyelectrolytes, charged nanoparticles, biological macromolecules, etc.) to form a multilayered structure with nanometer-level control over the architecture. Kotov points out that all polymer chains poorly bound to the clay sheet are removed during the LBL assembly process.

"We solved the problem of load transfer quite well for clay sheets," says Kotov. "This is very encouraging. Potentially it can be solved for carbon nanotubes and other nanostructures as well." Specific applications for these clay composites are, of course, in military vehicles but also in aerospace and the automotive industry. Basically, these materials could be considered for components in unmanned aerial vehicles and electronic devices that are exposed to extreme performance conditions. "The future directions of this research will be increasing not only the strength and stiffness but also the strain of the composites, which will yield the toughness," says Kotov. "Having high strains would truly result in 'plastic steel.' We are also working on better understanding the nanomechanics of these materials."

Thermoelectrics for cars, trucks, submarines, refrigerators and more

Michigan State University researchers believe that, using thermoelectric generation technology, a 5% improvement in brake-specific fuel consumption (bsfc) for an on-highway truck is a reasonable 5-year goal, with a 10% improvement possible with new thermoelectric materials. They currently have a 40-watt thermoelectric generator for gathering waste heat from a truck exhaust, which can make an engine about 1-3.5% more efficient. By the end of 2007 they are targeting a 100-watt thermoelectric generator.

The total available energy in waste heat in transportation and industry is shown above.

Caterpillar is using a systems approach to capturing the 10% improvement and is already able to capture 5% improvement in the lab. They will be installing the system into engines and vehicles in the next phase of their project.

BMW has a system for 2010 that would be 2-3% more efficient (possible commercial release)

The DOE presentation on thermoelectric improvement of vehicles.

The DOE timeline is to introduce thermoelectric generators in production personal vehicles in the 2011 to 2014 timeframe.

From a General Motors presentation on thermoelectrics:

10% fuel economy improvement for a full size truck
- 1.65 kW – city
- 2.5 kW – Highway

350 W is the minimum requirement (remember, Michigan State University expects to have 100 W systems at the end of the year)
- equal to the base electrical load of today's generator on the FTP drive cycle; meeting it would improve composite urban/highway fuel economy by ~3%

Exhaust recovery can meet the 350 W requirement with existing materials with high starting purities but at high cost.

GM feels radiator recovery alone will not meet the 350 W requirement and is cost prohibitive.

Thermoelectrics need to be efficient and affordable. If you save $2,000 in fuel costs per year with 10% more efficiency, then the $/W of the generator needs to be reasonable.

Quantum-well-based thermoelectric presentation
Projected quantum well modules at less than 30 cents per watt.
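As a rough illustration of why that module price matters: using the $2,000 per year fuel savings and the projected 30 cents per watt, with an assumed (hypothetical) 1 kW generator size, the module-level payback is very short. System integration costs are not included here and would dominate in practice:

```python
# Module-level payback sketch. The 1 kW generator size is an assumption
# for illustration (the DOE truck program targeted a ~1 kW generator).
annual_fuel_savings = 2000.0   # dollars per year (figure from the text)
module_cost_per_watt = 0.30    # projected quantum well module price, $/W
generator_watts = 1000         # assumed system size

module_cost = module_cost_per_watt * generator_watts
payback_years = module_cost / annual_fuel_savings
print(f"Module cost: ${module_cost:,.0f}; payback: {payback_years:.2f} years")
```

At those projected prices the thermoelectric modules themselves pay back in weeks, so affordability hinges on the heat exchangers, power electronics and installation, not the modules.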

Other applications

Current prototype: USS DOLPHIN AGSS 555 Thermoelectric Air Conditioning Test for Silent Running submarine.

R-134a refrigerant gas was universally adopted as the replacement for Freon.
However, R-134a has 1,300 times the global warming potential of CO2.
Thermoelectrics can replace R-134a in air conditioners.

Four dispersed solid state thermoelectric coolers/heaters could comfortably cool or heat 5 occupants with 400 to 900 watts of cooled or heated air.

The target is, by 2020, to have 90% of the US personal vehicle fleet with a thermoelectric generator powering a thermoelectric cooler/heater to replace R-134a refrigerant gas air conditioners. That would save 1.02 million barrels per day, or 5% of US gasoline usage, and reduce the equivalent of 156 million metric tons of CO2e annually.

Existing thermoelectric drink coolers could be improved into better full-sized refrigerators.

Thermoelectrics could reduce the weight of cooling and batteries for soldiers by 30%.

Current Vehicular Applications of Thermoelectrics
-Climate Control Seats
-Drink Cooler/Warmer
-Thermal Control of Electronics

Near Term Applications (2011 – 2015)
-Thermoelectric Generators Harvesting Engine Waste Heat
-Thermoelectric Coolers/Heaters replacing Air Conditioners
-Integrated Thermoelectric Generators & Coolers/Heaters Heavy Duty Truck Auxiliary Power Unit (APU)

Long Term (2020 +)
-Thermoelectric Generator Replacing Propulsion Engine
-Plug-in Solid State Hybrid with Multi Fuel Capability


A lot of the thermoelectric methods are nanoscale, using quantum dots, quantum wells and nanomaterials

More on the nanotechnology basis of many of the new superior thermoelectric methods

October 03, 2007

Carnival of Space Week 23

Welcome to the Carnival of Space week 23. We have two articles inspired by the 50th anniversary of Sputnik, then a mixed bag of articles on raising pigs on Mars, habitable planets, another look at the Apollo program, NASA colorizing of space photos, the Carancas meteorite, Type 1a supernovae, my own submission on hypersonic aircraft status, our wonderful planet Earth, and a formula for the universe.


Cumbrian Sky talks about Space Age: RIP

As the 50th anniversary of the launch of Sputnik 1 approaches, amateur astronomer and spaceflight enthusiast Stuart Atkinson has mixed feelings about what has - and hasn't - happened in the last 50 years, and wonders if it's time to declare the "Space Age" dead...

Astroprof's Page discusses 50 Years Ago

This post is about the 50th anniversary of the launch of Sputnik-1. I also include a link for a wav of the Sputnik beeping.

UPDATE: Missed article
Astroblog also talks about the 50th anniversary of the Sputnik launch.

Colony Worlds talks about Raising Pigs On Mars

The first explorers upon Mars will probably rely on supplies previously shipped to the red planet in order to survive upon this harsh world. But in order to settle on this crimson globe, future Martians will need to import fruits, vegetables, grain, trees and pigs--yes pigs.

Centauri Dreams talks about "Habitable Planets: A Splendid Isolation"

The story discusses recent work on planet formation and suggests habitable terrestrial worlds may form around stars that have had few or no interactions with other stars in their past. It's something of a brake on the idea that Earth-like planets are forming in binary systems, but it also contradicts recent research that they could indeed form there. Controversy should follow.

Music of the Spheres talks about Moonshadow Men

FlyingSinger has been busy with travel but did take time to see a wonderful new documentary about the Apollo program, “In the Shadow of the Moon.” Produced by Ron Howard and in limited theatrical release, the story is told through recent interviews with ten of the astronauts who journeyed to the Moon from 1968 to 1972. Inspiring stuff!

Universe Today talks about
True or False (Color): The Art of Extraterrestrial Photography

If you could see the surface of Mars, or a spectacular nebula, with your own eyes, would it look the same as the beautiful images from Hubble or the Mars rovers? Sometimes yes, mostly no. Universe Today has this article about the techniques astronomers use to coax the best images out of their data.

Spacefiles talks about the Carancas meteorite in Peru: what have we learned about the meteorite that fell there on Sept 15, 2007?

Star Stryder talks about
Type 1a Supernovae: A Non-Standard Candle

Astronomers have been working hard to document the universe's rate of expansion as a function of time. To do this, we have to know how much light supernovae give off, and new research indicates that this is going to get tricky as we look farther and farther back toward the beginning of the universe.

My own submission is an update and status of hypersonic aircraft and engine development

Olivier Lussier presents Home Sweet Home: Our Wonderful Earth posted at

A Babe in the Universe talks about a formula for the universe

An episode of STAR TREK hinted that the whole Universe could be expressed in a simple equation. Max Planck created a universal system of units based on c, h and G. Use of Planck's units leads to an amazingly simple result: "M = R = t"

This expression could tell a lot about the Universe, but may be ahead of its time.

Thermoelectric potential and status

At the current efficiencies of thermoelectric devices, 7 to 8 percent, more than 1.5 billion gallons of diesel could be saved each year in the U.S. if thermoelectric generators were used on the exhaust of heavy trucks.

More than 60 percent of the energy that goes into an automotive combustion cycle is lost, primarily to waste heat through the exhaust or radiator system.

A 24-page PowerPoint presentation from 2005 of the FreedomCAR and Vehicle Technologies program to test waste heat recapture on cars

2007 information on the projects to get thermoelectrics into commercialization
The Department of Energy's Office of FreedomCAR and Vehicle Technologies initiated a program 13 years ago with Hi-Z Technologies to develop a 1 kW thermoelectric generator to either replace or augment the alternator in heavy duty trucks. This unit was operated for the equivalent of 500,000 miles on the PACCAR test track. It demonstrated the feasibility of the concept, which helped justify the competitive procurement to develop a commercially viable vehicular thermoelectric generator that would improve fuel economy by 10+ percent. Three teams were selected: two for spark ignition gasoline engine powered autos and one for diesel engine powered heavy duty trucks.

BSST, teamed with Visteon, BMW and Marlow, is scheduled to introduce its thermoelectric generator, producing 750 watts, integrated into BMW's model year 2011 Series 5 cars. BSST's parent company, Amerigon, has supplied over 4 million thermoelectric climate control seats to GM, Ford, Toyota, Nissan, Hyundai and several other OEMs, and is the largest supplier of thermoelectric devices in the world.

The General Motors (GM) team includes General Electric (GE), NASA's Jet Propulsion Lab and the Oak Ridge National Lab. GM is planning to introduce a production car at the 2010 Detroit Auto Show with a 350 watt thermoelectric generator. Michigan State University is heading up a team with the Cummins Engine Company, NASA's Jet Propulsion Lab, and Tellurex to develop a thermoelectric generator for heavy duty trucks that operates on diesel engine exhaust.

Work is under way developing a vehicular thermoelectric air conditioning/heater system.

Nextreme is a company that makes confetti size thermoelectric devices

Physical size comparison of eTEC centered on standard thermoelectric cooler

A company called Power Chips believes it can achieve 70-80% of Carnot efficiency. This compares very favorably with a Rankine cycle generator, such as a gas turbine, which has a typical Carnot efficiency of about 30%, while a diesel or gasoline generator is only about 10-15% efficient in Carnot terms.
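As a sketch of what 70-80% of Carnot would mean for exhaust heat recovery, assuming a 600 F exhaust (the diesel stack temperature cited elsewhere in this post) and an assumed 80 F ambient as the cold side:

```python
# Carnot-limit sketch for exhaust heat recovery.
# The 600 F exhaust figure is from the text; the 80 F ambient is an
# illustrative assumption for the cold-side temperature.
def f_to_k(f):
    """Convert degrees Fahrenheit to kelvin."""
    return (f - 32) * 5 / 9 + 273.15

t_hot = f_to_k(600)    # ~588.7 K exhaust
t_cold = f_to_k(80)    # ~299.8 K ambient

carnot = 1 - t_cold / t_hot
print(f"Carnot limit: {carnot:.1%}")
# Power Chips' claimed 70-80% of Carnot would then convert:
for frac in (0.70, 0.80):
    print(f"  at {frac:.0%} of Carnot: {frac * carnot:.1%} of exhaust heat to electricity")
```

Under these assumptions the Carnot limit is about 49%, so 70-80% of Carnot would mean converting roughly 34-39% of the exhaust heat to electricity, which is why the claim, if borne out, would be such a large jump over the 7-8% of today's thermoelectric devices.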

Since cars currently waste up to a third of the energy in gasoline as exhaust heat, using Power Chips to recover a significant fraction of that power could provide ample electricity for new systems, and could act as a replacement for existing alternators.

All of the presentations on capturing waste heat at the recent DEER conference on diesel engine efficiency

TG Daily has some coverage of thermoelectrics for cars and trucks.

The exhaust on diesel truck stacks can reach temperatures above 600F. This common, easily accessible portal provides a significant thermal differential which could be tapped and reclaimed as electrical energy. Such energy would be fed back into the truck in some manner (via a large electrical-assist motor?), thereby increasing fuel efficiency. Dr. Tritt believes such technology could save an estimated 1.5 billion dollars annually at only 7% to 8% efficiency.

Smelters and associated furnaces within the aluminum industry generate very large amounts of waste heat: 21 trillion Btu per year. This waste heat could be converted to electricity in many industrial processes by using new, high-temperature, quantum well thermoelectric materials. The technology should also be directly applicable to the recovery of energy losses in the petroleum refining, chemicals, forest products, iron and steel, food and beverage, cement, fabricated metals, transportation, textiles, mining, plastics, aluminum, and glass industries.

The 16-page agenda of the 13th annual Diesel Engine Efficiency and Emissions Research (DEER) conference

All of the conference presentations

Oak Ridge National Laboratory research in 2007 for greater efficiency in cars and trucks includes thermoelectrics. The near term goal is to capture 10% of the waste heat from a truck.

They are also working on overall engine efficiency:
In 2006, 150-horsepower diesel engines for passenger cars had a 41.5% thermal efficiency whereas 400-horsepower diesel engines for trucks had a 45% thermal efficiency. The DOE thermal efficiency goals are 45% by 2010 for diesel cars and 55% by 2012 for trucks. With widespread implementation of new improvements, we could realize a fuel reduction of 20%.
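The fuel savings implied by those thermal efficiency goals follow from a simple relation: for the same work output, fuel burned scales inversely with thermal efficiency. A minimal sketch:

```python
# Fuel saved when thermal efficiency improves: for the same work output,
# fuel burned scales as 1/efficiency, so savings = 1 - eta_old / eta_new.
def fuel_reduction(eta_old, eta_new):
    return 1 - eta_old / eta_new

print(f"Cars,   41.5% -> 45%: {fuel_reduction(0.415, 0.45):.1%} less fuel")
print(f"Trucks, 45%   -> 55%: {fuel_reduction(0.45, 0.55):.1%} less fuel")
```

The truck goal alone yields about an 18% fuel reduction, so the ~20% figure in the text requires the engine improvements to be combined with other measures such as waste heat recovery.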

Directory of thermoelectric generators from PESwiki

Nanowerk reviewed thermoelectrics using custom nanostructured material

Cars that are more fuel efficient than a Toyota Prius now, and ones that will be available soon

October 02, 2007

Keeping Moore's law going: Intel High-K solution

A six-page feature at IEEE Spectrum describes the technical achievement of high-k insulation for computer chips.

IBM on track to make room temperature graphene field-effect transistor devices

Using conventional e-beam lithography, IBM has successfully fabricated graphene field-effect transistors (FETs) with very narrow nanoribbon channels. So far, the bandgaps opened have been relatively small compared with the excellent properties of nanotubes, which IBM attributes to the imperfections in its method of cutting the nanoribbons. However, by supercooling the device the researchers were able to prove the concept. Next, they plan to further narrow the nanoribbons to achieve room temperature operation.
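Why narrower ribbons help: the nanoribbon bandgap scales roughly inversely with ribbon width. A sketch using an approximate empirical coefficient of ~0.2 eV·nm reported in early graphene nanoribbon measurements; the coefficient is an outside figure, not from the article, and the relation is only a rough fit:

```python
# Rough bandgap-vs-width scaling for graphene nanoribbons: Eg ~ alpha / w.
ALPHA_EV_NM = 0.2  # approximate empirical coefficient, eV*nm (assumed)

def bandgap_ev(width_nm):
    return ALPHA_EV_NM / width_nm

# Room-temperature operation wants a gap comfortably above the thermal
# energy kT (~0.025 eV at 300 K); narrowing the ribbon grows the gap:
for w in (20, 10, 5, 2):
    print(f"width {w:2d} nm -> Eg ~ {bandgap_ev(w):.3f} eV")
```

Under this scaling, ribbons around 10 nm give gaps near the thermal energy, which is consistent with needing supercooling at today's widths and much narrower ribbons for room temperature operation.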

The first applications of the nanoribbon FETs will be for RF devices in Darpa's Carbon Electronics for RF Applications (CERA) program. The high electron mobility of graphene makes it an excellent candidate for analog ultra-high-frequency oscillators and switches.

"We think analog graphene devices will have wide applicability in communications, radar and other areas requiring very-high-frequency operation," said Avouris.

IBM used the mechanical exfoliation method to place graphene atop a silicon wafer for its current device; in the future, they plan to also pursue growing graphene on silicon-carbide wafers. By heating a silicon-carbide wafer in a high vacuum to evaporate the silicon atoms from the top layer, it is possible to leave behind a monolayer of pure carbon in graphene's crystalline lattice.

The graphene FET's performance is not quite as good as carbon nanotubes, but graphene's electron mobility is at least an order of magnitude (10X) greater than silicon's.

Graphene FETs could achieve ten times the speed of ordinary CMOS transistors. If graphene FETs are made smaller, with nanometer features, then they could have correspondingly better speed than CMOS of the same dimensions.

A discussion of the challenges and promise of converting to graphene electronics

Future graphene-chip technologies, meanwhile, could borrow many of the methods already used for creating silicon chips. Many scientists are seeking ways of chiseling narrow strips, called nanoribbons, out of graphene sheets. Graphene electronics is far from proved as a viable candidate for the postsilicon era. As yet, graphene transistors are slower than silicon ones and much slower than transistors made with competing materials such as carbon nanotubes.

No one is ready to make promises, especially in light of the experience with carbon nanotubes. "Carbon nanotubes promised so much and so far [have] delivered so little, and we should naturally be cautious about promising too much for graphene," Geim says.

Cees Dekker of Delft University of Technology in the Netherlands, who a decade ago created the first nanotube transistor (SN: 5/9/98, p. 294), says that scientists' excitement about graphene gives him a feeling of déjà vu. "Sometimes, people are enthusiastically rediscovering the properties of graphene which were already heavily discussed 10 years ago in conjunction to nanotubes," he says.

MOSFETs at Wikipedia

Chronic disease costs over $1 trillion per year to the US economy

The Milken Institute has calculated the cost of the seven major chronic diseases, broken it down by state, and projected those costs by year to 2023 and also to 2030 and 2050.

The seven major diseases are (reported cases, percent of US population):
Cancers: 10.6 million (3.6%)
Diabetes: 13.7 million (4.7%)
Heart Disease: 19.1 million (6.6%)
Hypertension: 36.8 million (12.6%)
Stroke: 2.4 million (0.8%)
Mental Disorders: 30.3 million (10.4%)
Pulmonary Conditions: 49.2 million (16.9%)

Total Reported Cases: 162.2 million (55.8%)

United States Economic Impact 2003 (Annual Costs in billions)
Treatment Expenditures: $277.0B
Lost Productivity: $1,046.7B
Total Costs: $1,323.7B
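A quick cross-check of the cost figures, which also shows that lost productivity, not treatment spending, dominates the total:

```python
# Cross-check of the Milken Institute 2003 cost figures quoted above (billions of dollars).
treatment = 277.0
lost_productivity = 1046.7
total = treatment + lost_productivity

print(f"Total: ${total:,.1f}B")
print(f"Lost productivity share: {lost_productivity / total:.0%}")
```

Lost productivity is about 79% of the total, which is why reducing disease incidence (rather than only treating it more cheaply) carries such a large economic payoff.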

Air quality, which is made worse by coal and oil use for energy and transportation, is a contributing risk factor

The projected impact on GDP by state from 2004 to 2050

The projected life expectancy of a 65-year-old person by year and by state.

If we are able to cure those diseases or greatly reduce their occurrence with infrastructure changes (getting rid of coal and oil usage), then we are looking at massive economic benefits in the United States and the world.

Trillion-dollar mistakes from bad societal choices.

Better economy and budgets by getting rid of coal

October 01, 2007

Catalysts to stamp nanopatterns with 1 nanometer precision

Using enzymes from E. coli bacteria, Duke University chemists and engineers have introduced a hundred-fold improvement in the precision of features imprinted to create microdevices such as labs-on-a-chip.

Their inkless microcontact printing technique can imprint details measuring close to 1 nanometer (a billionth of a meter), the Duke team reported in the Sept. 24, 2007 issue of the Journal of Organic Chemistry.

In traditional microcontact printing -- also called soft lithography or microstamping -- an elastic stamp's end is cast from a mold created via photolithography, a technique used to generate microscopic patterns with light. Those patterns are then transferred to a surface by employing various biomolecules as inks, rather like a rubber stamp.

Microcontact printing was first reported by Ralph Nuzzo and Dave Allara at Pennsylvania State University, and developed extensively in the laboratory of George Whitesides at Harvard.

A shortcoming of traditional microcontact printing is that pattern transfer relies on the diffusion of ink from the stamp to the surface. This same diffusion spreads out beyond the limits of the pattern as the stamp touches the surface, degrading resolution and blurring the feature edges, Clark and Toone said.

Because of this mini-blurring, the practical limit to defect-free patterning is “in excess of 100 nanometers,” said the report, whose first author, Phillip Snyder, is a former Toone graduate student now working as a postdoctoral researcher in Whitesides’ group.

A 100 nanometer limit of resolution is about 1,000 times tinier than a human hair’s width. While that seems very precise, the Duke team now reports it can boost accuracy limits to less than 2 nanometers by entirely eliminating inking.
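The scale comparisons above are easy to sanity-check. A minimal sketch, assuming a human hair width of roughly 100 micrometers (a common ballpark figure, not stated in the article):

```python
# Rough scale check of the resolution figures quoted above.
hair_width_m = 100e-6   # ~100 micrometers, assumed ballpark for a human hair
old_limit_m = 100e-9    # ~100 nm practical limit of inked microstamping
new_limit_m = 2e-9      # <2 nm reported for the inkless technique

print(hair_width_m / old_limit_m)   # about 1,000x smaller than a hair width
print(old_limit_m / new_limit_m)    # about a 50x improvement in the limit
```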

Clark and graduate student Matthew Johannes crafted a microstamp out of a gel-like material called polyacrylamide, which compresses more uniformly than the silicone material known as PDMS which is normally used in microstamping.

In lieu of ink, Snyder, Toone and graduate student Briana Vogen suspended a biological catalyst on the stamp with a molecular “tether” of amino acids. For this proof-of-principle demonstration, Toone’s team chose as a catalyst the biological enzyme exonuclease I, derived from the bacterium E. coli.

In one set of experiments, the polyacrylamide stamp pattern bearing the tethered enzymes was then pressed on a surface of gold that had been covered with a uniform coating of single-stranded DNA molecules. The DNA molecules had also been linked to fluorescent dye molecules to make the coating visible under a microscope.

Wherever the enzyme met the DNA, the end of the DNA chain and its attached dye were broken off and removed. That created a dye-less pattern of dots on the DNA coating, each dot measuring about 10 millionths of a meter in diameter.

In follow-up research, Clark and Toone are now evaluating more durable microstamping materials attached to longer lasting catalysts that are non-enzymatic.

By using different catalysts in succession, future versions of the inkless technique could be used to build complex nanoscale devices with unprecedented precision, the two predicted.

Nanopantography is another recent method, which uses billions of tiny ion beams to create repeated nanoscale patterns.

IBM nanogravure printing now achieves 100,000 dots per inch (dpi) and 60 nanometer resolution. IBM scientists believe this method will enable them to place particles as small as 2 nm in diameter to fabricate atomic-scale nanowires, ultra-tiny lenses for optics and biosensors for healthcare.

Thermochemical nanolithography (which heats an AFM tip and writes at millimeters per second, over 10,000 times faster than dip-pen lithography, with feature widths down to 12 nanometers) and fracture-induced structuring (60 nm lines) are other methods for nanopatterning at near-nanometer scales.

Gunships and precision artillery

There is need for more responsive and effective close air support for US troops in Iraq.

C-27J transport plane

War Is Boring discusses converting $25-30 million C-27J transport planes into slightly scaled-down versions of the AC-130 gunship.

AC-130 gunship

There has been a proposal to use AC-130s in a roving mode over Iraq. This would keep an AC-130 within 20 minutes of any situation for which ground troops need to call them in.

Killing insurgents with M-16s and F-16s is tough, dangerous, complicated and expensive. Kill them instead with an ammo-laden transport aircraft that can loiter over the Sunni triangle for 10+ hours every night, shooting bullets that cost pennies compared to other means of killing insurgents, and we've got a chance of winning the war without bankrupting our country.

I will also say that Arab culture respects strength. If the gunships were unleashed, the only defense would be to stop attacking U.S. forces. Again, you'd get away with some attacks, but it would only be a matter of time before a gunship or another air asset caught you, or a US soldier called quickly enough to get the gunship in place.

How many hours-long battles have you read about in the paper? Why? Other air assets respond, but only the gunship has the situational awareness and the ability to shoot a single 40-mm round at a time to efficiently kill insurgents without causing collateral damage. The gunship is the only air asset I know of that shows up on scene and quickly has more situational awareness than the ground forces. Too many times I've told ground forces that personnel were sneaking up on them and that we were 10 seconds away from a round on target the second they gave the command. No other asset compares (in a low-threat environment like Iraq). There are CAS aircraft, and then there is the AC-130.

How many times have you read about other air assets making low passes and dispensing flares to scare away the enemy after they've attacked our forces? Why are we asking our pilots to fly hundreds of feet from the ground to dispense flares? Our pilots' bravery is unquestioned, but there has to be a better way. When enemy forces are attacking your forces, they need to be killed, not scared away. I'll say it again: the Arab respects strength.

Another weapon is the Army's Guided Multiple Launch Rocket System (GMLRS), which provides close fire support.

U.S. Army commanders and troops have come to view the Army's Guided Multiple Launch Rocket System (GMLRS) as their "70-kilometer sniper."

Its shots cover 70 kilometers within 82 seconds. It is an army rocket launcher that hits within 5 meters of the target; tests back in 1999 got within 2.1 meters. Getting closer means smaller explosives are needed, which allows for close combat support. For example, in an urban firefight you can call in close-support artillery to take out the guys a few rooms over, or across the street, if you can pin them down for a minute but cannot take them out yourself.
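The quoted range and flight-time figures imply an average speed, which a quick back-of-envelope calculation recovers (the sea-level speed of sound value is an approximation I am assuming, not a figure from the article):

```python
# Back-of-envelope check of the quoted GMLRS figures:
# 70 km covered in 82 seconds implies the average speed below.
range_m = 70_000
flight_time_s = 82

avg_speed = range_m / flight_time_s   # meters per second
mach = avg_speed / 340.0              # vs. ~340 m/s sea-level speed of sound
print(f"{avg_speed:.0f} m/s, roughly Mach {mach:.1f}")   # ~854 m/s, ~Mach 2.5
```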

Hypersonic progress for engines and planes

Successful recent ground tests of jet-fueled, ramjet/scramjet demonstrator engines by Pratt & Whitney Rocketdyne and Aerojet represent important progress toward flight-testing of three separate hypersonic-vehicle programs.

Using JP-7 jet fuel, PWR ran the combustor successfully at a variety of Mach numbers from Mach 2.5 to Mach 6.0, demonstrating "desired operability and performance" at each speed, the company said.

PWR's approach is to use a closed-loop or "heat sink" system, whereby the fuel is pumped as a coolant throughout the engine casing to remove heat and pressure from the combustor. This 3,000-degree heat also prepares the jet fuel for combustion by cracking it into smaller molecules that burn very quickly when they enter the combustor.

A full-sized version of PWR's combustor will form the heart of the FaCET program, sponsored by the U.S. Defense Advanced Research Projects Agency (DARPA) and the U.S. Air Force. Lockheed Martin is FaCET prime contractor.

FaCET aims to develop a hypersonic test vehicle -- which could fly in 2012 -- that would take off and land by itself, use an advanced turbojet to get up to a speed of at least Mach 4 and then use a liquid hydrogen-powered scramjet to get to Mach 10 and beyond. Jet fuel can't be used as a scramjet fuel at speeds as high as Mach 10.

FaCET isn't linked to the DARPA/U.S. Air Force/NASA X-51A hypersonic aircraft that is due to fly in 2009. But PWR, which is making the JP-7-powered X-1 scramjet engine for the Boeing-built X-51A, uses what it learns from each program to improve both engines.

"The engines are not the same shape or configuration but, technology-wise, the FaCET engine incorporates much of what we've learned through the X-51 engine," said McKeon. "The flip side is that we also have learned stuff with this (FaCET) engine regarding different configurations that could also be used in future X-51 activity."

There are also tests planned for the airframe to make sure it can survive the speeds and temperatures. The Falcon was to fly in September 2007 but has been delayed.

Hypersonic air-breathing engines for space launches could have a specific impulse (Isp, a measure of rocket fuel efficiency) of about 2,800 seconds with good designs, which is over 6 times better than the best chemical rockets (about 450 seconds).
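The Tsiolkovsky rocket equation shows why the Isp difference matters so much. A minimal sketch, where the mass ratio of 5 is an arbitrary illustrative assumption, not a figure from the article:

```python
import math

# Tsiolkovsky rocket equation: delta_v = Isp * g0 * ln(m0 / mf).
g0 = 9.81          # standard gravity, m/s^2
mass_ratio = 5.0   # initial mass / final mass (assumed for illustration)

for isp in (450, 2800):
    delta_v = isp * g0 * math.log(mass_ratio)
    print(f"Isp {isp} s -> delta-v {delta_v / 1000:.1f} km/s")
```

For the same mass ratio, the achievable delta-v scales linearly with Isp, so the higher figure yields over six times the delta-v (roughly 44 km/s versus 7 km/s in this sketch).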

I think there are better technologies that we can develop for far cheaper access to space, and any manned hypersonic plane system will take 15-30 years. However, if they can get this working, that would be great.

Skylon space plane concept

$100 human genome project

BioNanomatrix Inc. and Complete Genomics Inc. said Thursday they have formed a joint venture that has received an $8.8 million government grant to develop a system capable of sequencing the entire human genome in eight hours at a cost of less than $100. Today, the cost of sequencing the roughly 3 billion base pairs in the human genome is more than $100,000.
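The stated targets imply both a throughput figure and a cost-reduction factor, which follow directly from the numbers above:

```python
# Implied throughput and cost reduction from the stated targets.
base_pairs = 3_000_000_000   # ~3 billion base pairs in the human genome
target_hours = 8
current_cost = 100_000       # dollars, today's sequencing cost (per article)
target_cost = 100

bp_per_second = base_pairs / (target_hours * 3600)
print(f"{bp_per_second:,.0f} base pairs/second")             # ~104,000 bp/s
print(f"{current_cost / target_cost:.0f}x cost reduction")   # 1000x
```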

The grant for the five-year project to BioNanomatrix of Philadelphia and Complete Genomics of Sunnyvale, Calif., was awarded through the National Institute of Standards and Technology Advanced Technology Program. The venture will combine Complete Genomics' novel sequencing chemistry with BioNanomatrix's advanced nanofluidic platform, which allows single molecules of DNA, RNA or proteins to be separated out of laboratory samples for imaging and analysis.

BioNanomatrix and Complete Genomics have proposed adapting a novel DNA sequencing chemistry combined with nanoscale DNA imaging to create a system that can "read" very long DNA sequences of greater than 100,000 bases at high speed and with accuracy exceeding the current industry standard.

The total project cost is expected to be approximately $17.8 million, including both the grant award from NIST-ATP and the matching funds that will be provided by the joint venture partners.

Another funded genome sequencing project: UC Irvine plans to combine AFM nanotechnology with a Nobel Prize-winning DNA sequencing method developed in 1975 by Frederick Sanger. The process will employ a novel DNA separation method using the atomic force microscope (AFM), a Wickramasinghe invention. Researchers will then decode the DNA sequence with the help of light concentrated at a probe that is about 50 atoms wide at its tip. Sorting, analyzing and mapping DNA will take substantially less time with this technique, since it operates on a much smaller scale than the conventional Sanger method. The new process could produce accurate results 10,000 times faster and at lower cost, since many of the expenses of current sequencing methods are tied to the time required and the large amount of chemicals used.

Personal genome project

The $1000 and $100,000 genome grants awarded Aug 1, 2007 (NHGRI's Revolutionary Genome Sequencing Technologies grants)

Links to the detailed project proposal are here at, genome technology program

There is also the large scale genome sequencing programs

Using Terahertz Radiation to Control Material Properties

Ultrafast pulses of terahertz radiation have been used to change a manganite crystal from an electrical insulator into a conductor.

This work uses terahertz radiation to change the electrical conductivity of a solid crystal material by a factor of 100,000 at extremely high speeds. It is better and more useful than the ancient dreams of alchemy.

The ability to induce dramatic phase-changes in solid materials through select vibrations holds great promise for future exploitation of prized technological phenomena such as superconductivity and magnetoresistance. The methods present a new way of studying electron correlation effects and the coupling between crystal structure and the conduction properties of strongly correlated electrons.

Rini, working under Schoenlein and with a group of collaborators that included Ra’anan Tobey, Nicky Dean, Jiro Itatani, Yasuhide Tomioka, Yoshinori Tokura and Andrea Cavalleri, flashed single crystals of the strongly correlated manganite with femtosecond pulses of terahertz (trillion-cycles-per-second) radiation. Terahertz (abbreviated THz) radiation is the frequency of molecular vibrations; the femtosecond (millionths of a billionth of a second) timescale is the measure of atoms in motion.

Rini, Schoenlein and their colleagues found that a frequency of about 17 THz set off vibrations in the manganite crystal which resulted in a stretching of the electronic bonds that connect its principal constituent atoms - manganese and oxygen. This mild distortion of the crystal’s geometry caused a profound change in its electronic properties.
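The 17 THz excitation frequency also explains why femtosecond pulses are the right tool here: one vibrational cycle at that frequency lasts well under 100 femtoseconds, as a quick calculation shows:

```python
# Period of one vibrational cycle at the 17 THz excitation frequency.
freq_hz = 17e12
period_s = 1 / freq_hz
print(f"{period_s * 1e15:.0f} fs per vibrational cycle")   # ~59 fs
```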

“By selectively exciting an individual vibrational mode of the insulating manganite, we increased the crystal’s electrical conductivity by five orders of magnitude,” said Rini. “What we observed was that the excitation of the manganese-oxide molecule’s vibrational mode promptly induced an ultrafast transition of the molecule to a metallic phase.”

This marks the first experimental demonstration that the selective excitation of a single vibrational mode can be used to induce phase changes in a crystal. It also demonstrates that the dynamics of a phase change in a solid can be observed when the solid resides in the electronic ground state - the electronic state in which most chemical reactions and phase transitions take place.

In the future, Rini said the Schoenlein group would like to use longer wavelength radiation to selectively excite other vibrational modes, and femtosecond x-ray beams to explore other aspects of vibrationally induced phase transitions. For now, their experimental technique is already shedding new light on the physics behind CMR, which should prove valuable for the future use of this phenomenon in magnetic data storage devices. The technique might also be used to address the unresolved physics behind the phenomenon of high-temperature superconductivity – copper-oxide (cuprate) materials that lose all electrical resistance at temperatures much higher than conventional superconductors.

“The complex and remarkable behavior of strongly correlated electron systems poses among the most intriguing questions in condensed matter physics,” said Rini. “Our vibrational excitation approach enables time-resolved measurements under the unique conditions created by the localization of energy in specific vibrational modes, and helps elucidate the coupling between particular vibrations and related electronic and magnetic properties. We believe our technique will find extensive application in other complex solids.”

Progress towards 1 nanometer x-ray resolution

At Brookhaven's National Synchrotron Light Source (NSLS), the scientists exceeded a limit on the ability to focus "hard," or high-energy, x-rays known as the "critical angle."

The researchers implemented their idea by creating a compound lens from a series of four kinoform lenses placed one after the other. Using this setup at NSLS beamline X13B, they showed that the critical angle can be surpassed with hard x-rays, while still focusing like a single lens.

"Without exceeding the critical angle, the refractive lens resolution would be limited to 24 nanometers or more," Ablett said. "Even though in this experiment we just barely exceeded this limit, we've shown that it can be done. This is just the first step."

This is an important step for the National Synchrotron Light Source II (NSLS-II), a state-of-the-art synchrotron facility that will produce x-rays up to 10,000 times brighter than those generated by the current NSLS and could lead to advances such as alternative-energy technologies and new drugs for fighting disease. One of the major goals of the facility is to probe materials and molecules with just one-nanometer resolution - a capability needed to study the intricate mechanisms of chemical and biological systems.

September 30, 2007

Japan aims for 10 gigabit per second network by 2015

Japan's National Institute of Information and Communications Technology and private companies aim to develop and commercialise, around 2015, a network that can transfer data at 10 gigabits per second, 10 times faster than the 1 Gbps next-generation network due to be launched in Japan this year.

The group will be joined by such companies as Nippon Telegraph and Telephone Corp., Fujitsu Ltd., KDDI Corp., Hitachi Ltd., Toshiba Corp. and NEC Corp. They will spend some 30 billion yen (260 million dollars) on the research project over the next five years. The optical network would allow as many as 100 billion devices to access it simultaneously and still enjoy extremely fast data-transfer speeds.

In July 2007, Ciena Corporation (NASDAQ:CIEN), the network specialist, announced that JANET(UK) has successfully delivered its first 40 Gbps service in a production environment across JANET, the UK's national research and education network, using the new 40 Gbps capabilities of Ciena's CoreStream® Agility Optical Transport System.

In June 2007, Level 3 Communications' Business Markets Group announced that the Internet2 nationwide, dedicated network services backbone was operational. This new 100 Gigabits per second (Gbps) network delivers an immediate increase in bandwidth and the capability for future scalability to enable emerging applications for the Internet2 research and education community. The network delivers the underlying optical technology required for Internet2 to dynamically provision multiple 10 Gbps wavelengths. The milestone enables Internet2 to remain on schedule to complete member migration from the existing network to this new backbone by later in 2007.
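To put these line rates in perspective, here is a quick comparison of raw transfer times for a DVD-sized file at the speeds mentioned in this section (protocol overhead is ignored, so real transfers would be somewhat slower):

```python
# Illustrative raw transfer times for a 4.7 GB (DVD-sized) file
# at the line rates mentioned above, ignoring protocol overhead.
file_bits = 4.7e9 * 8   # 4.7 gigabytes in bits

for gbps in (1, 10, 40, 100):
    seconds = file_bits / (gbps * 1e9)
    print(f"{gbps:>3} Gbps: {seconds:.2f} s")
```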