
January 30, 2010

VASIMR Plasma Rocket for a Lunar Tug


NASA pre-solicitation of a lunar tug using the Variable Specific Impulse Magneto-plasma Rocket (VASIMR)

Studies will be conducted to evaluate a Lunar Tug concept utilizing Variable Specific Impulse Magneto-plasma Rocket (VASIMR) engine capabilities from Low Earth Orbit to Lunar Orbit and libration points. NASA/JSC intends to purchase these services from Ad Astra Rocket Company. Adequate cooling of the High Temperature Superconducting (HTSC) magnets requires the conversion of the original HTSC magnet facility to utilize space-relevant cryo-cooler technology that has not yet been used under conditions relevant to VASIMR operation in space.



5 page VASIMR business plan summary

VASIMR has advantages over other competing plasma rockets due to its electrode-less design and its use of inexpensive and abundant propellants such as argon, neon and hydrogen. Other systems tend to suffer from wear and erosion of electrodes immersed in the hot plasma; also, the use of Xenon propellant in these systems tends to make them much more expensive to operate. The current price of commercial Xenon is about $2000/kg vs. Argon at about $40/kg.
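The cost gap between the propellants is easy to quantify. A minimal sketch, using the per-kilogram prices quoted above; the 10,000 kg propellant load is a hypothetical mission size, not a figure from the article:

```python
# Rough cost comparison of filling a tank with xenon vs. argon,
# using the article's quoted prices ($2000/kg Xe, $40/kg Ar).
# The 10,000 kg load is a hypothetical figure for illustration.

XENON_PER_KG = 2000   # USD/kg, commercial price quoted above
ARGON_PER_KG = 40     # USD/kg

def propellant_cost(mass_kg, price_per_kg):
    """Total propellant cost for a given load."""
    return mass_kg * price_per_kg

load = 10_000  # kg, assumed mission propellant load
xe = propellant_cost(load, XENON_PER_KG)   # $20,000,000
ar = propellant_cost(load, ARGON_PER_KG)   # $400,000
print(f"Xenon: ${xe:,}  Argon: ${ar:,}  ratio: {xe // ar}x")
```

At these prices the xenon load costs 50 times as much, before accounting for xenon's limited annual production.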

In 2009, Ad Astra demonstrated the full power operation of the 200kW VX-200, the first VASIMR flight-like prototype. This test will pave the way for the design and construction of the VF-200, the first flight unit, expected to be launched into space in late 2013.

Major milestones in this testing program have been achieved, including:
1. First plasma May 2008
2. Full (30kW) first stage power demo Oct. 2008
3. Second stage integration Jan. 2009
4. VX-200 full (2Tesla) magnetic field July 2009
5. VX-200 at full (200kW) rated power Oct. 2009

For robotic resupply missions to future human lunar outposts, Ad Astra is designing a 2 MW solar-powered VASIMR lunar tug capable of delivering more than twice the payload to the Moon (~34 MT) compared to the all-chemical stage presently envisioned (~16 MT). Such enhancements in payload capability could result in savings of up to $400M/year over the present lunar resupply architecture.




















VASIMR Lunar Tug





Conceptual Mission Using Three VASIMR Rockets to go to Mars






Intel, Micron Make 25-nm NAND Chip as Moore's Law Marches On

EE Times reports that Intel Corp. and Micron Technology Inc. have regained the process technology lead in NAND flash by rolling out the first in a family of 25-nm devices.

The first 25-nm NAND device is a multi-level-cell (MLC) 8-GB device, which is said to reduce IC count by 50 percent over previous products. With the device, measuring 167 mm2, the Intel-Micron duo will retake the NAND process lead from the SanDisk-Toshiba duo and Samsung Electronics Co. Ltd., which have recently announced 32-nm and 30-nm products, respectively. Another player, Hynix Semiconductor Inc., has a 26-nm device waiting in the wings.

The 25-nm device is made at IM Flash Technologies LLC, a joint NAND fab venture between Intel (Santa Clara) and Micron (Boise, Ida.). Intel and Micron will initially ramp the 25-nm NAND device at IM Flash, followed by production within Micron's fab in Manassas, Va. Still to be seen, however, is when IM Flash will restart its delayed NAND fab in Singapore. Some analysts say that fab will ramp in 2011.




In theory, today's 193-nm immersion scanners hit the wall around 35-nm. Observers speculate that IM Flash has been able to devise 25-nm NAND chips with today's 193-nm immersion lithography combined with self-aligned double-patterning (SADP) techniques, and that it is using scanners from ASML Holding NV.
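The "wall" and the double-patterning workaround follow from the textbook Rayleigh resolution criterion. A quick sketch; the NA = 1.35 and k1 = 0.27 values are typical published figures for immersion scanners, not numbers from this article:

```python
# Why 193-nm immersion lithography "hits the wall" near 35-40 nm,
# and how double patterning gets past it, via the Rayleigh criterion:
#   half_pitch = k1 * wavelength / NA
# NA = 1.35 and k1 = 0.27 are typical published immersion-scanner
# values, assumed here for illustration.

def min_half_pitch(wavelength_nm, na, k1):
    """Smallest printable half-pitch for a single exposure (Rayleigh)."""
    return k1 * wavelength_nm / na

single = min_half_pitch(193, 1.35, 0.27)   # ~38.6 nm single exposure
# Self-aligned double patterning (SADP) doubles the line density, so
# the achievable half-pitch is roughly half the single-exposure value.
sadp = single / 2                           # ~19.3 nm
print(f"single exposure ~{single:.1f} nm, with SADP ~{sadp:.1f} nm")
```

Under these assumptions a single exposure bottoms out near 39 nm, while SADP reaches below 20 nm, which is why 25-nm NAND is printable without new scanners.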


IM Flash may also be using a form of phase-shift mask technology. ''With the chip industry staying on Moore's Law and lithography stuck at the 193-nm wavelength, chipmakers are looking to double-patterning to drive linewidth shrinks,'' according to a recent report from Barclays Capital.

''SADP is the technology of choice in NAND, with all players adopting SADP at the 32-nm node. In our view, SADP was really the only choice due to (i) inadequate overlay and line edge roughness capabilities of the then existing litho tools, (ii) the simple nature of NAND 1-D structure, and (iii) availability of excess etch and CVD tool capacity,'' according to the report.

''Looking to the 22-nm node, our checks suggest that SADP is the preferred option for all the major NAND manufacturers as development is already underway and litho tools by themselves alone are not yet ready to satisfy the requirements at 22-nm,'' according to the report.






Stem Cell News - Bone Marrow for Stem Cell Tissue Transplants and Skin Cells Easily Turned into Brain Cells

MIT Technology Review - Skin Cells Turned into Brain Cells

Skin cells called fibroblasts can be transformed into neurons quickly and efficiently with just a few genetic tweaks, according to new research. The surprisingly simple conversion, which doesn't require the cells to be returned to an embryonic state, suggests that differentiated adult cells are much more flexible than previously thought.

If the research, published in the journal Nature yesterday, can be repeated in human cells, it would provide an easier method for generating replacement neurons from individual patients. Brain cells derived from a skin graft would be genetically identical to the patient and therefore remove the risk of immune rejection. Such an approach might one day be used to treat Parkinson's or other neurodegenerative diseases.

"It's almost scary to see how flexible these cell fates are," says Marius Wernig, a biologist at the Institute for Stem Cell Biology and Regenerative Medicine at Stanford, who led the research. "You just need a few factors, and within four to five days you see signs of neuronal properties in these cells."



Nature - Direct conversion of fibroblasts to functional neurons by defined factors

Cellular differentiation and lineage commitment are considered to be robust and irreversible processes during development. Recent work has shown that mouse and human fibroblasts can be reprogrammed to a pluripotent state with a combination of four transcription factors. This raised the question of whether transcription factors could directly induce other defined somatic cell fates, and not only an undifferentiated state. We hypothesized that combinatorial expression of neural-lineage-specific transcription factors could directly convert fibroblasts into neurons. Starting from a pool of nineteen candidate genes, we identified a combination of only three factors, Ascl1, Brn2 (also called Pou3f2) and Myt1l, that suffice to rapidly and efficiently convert mouse embryonic and postnatal fibroblasts into functional neurons in vitro. These induced neuronal (iN) cells express multiple neuron-specific proteins, generate action potentials and form functional synapses. Generation of iN cells from non-neural lineages could have important implications for studies of neural development, neurological disease modelling and regenerative medicine.

2. Physorg report - Using cells from mice, scientists from Iowa and Iran have discovered a new strategy for making embryonic stem cell transplants less likely to be rejected by a recipient's immune system. This strategy, described in a new research report appearing in the February 2010 print issue of The FASEB Journal, involves fusing bone marrow cells to embryonic stem cells. Once fused, the hybrid cells have DNA from both the donor and recipient, raising hopes that immune rejection of embryonic stem cell therapies can be avoided without drugs.

FASEB Journal - Cell fusion of bone marrow cells and somatic cell reprogramming by embryonic stem cells

Bone marrow transplantation is a curative treatment for many diseases, including leukemia, autoimmune diseases, and a number of immunodeficiencies. Recently, it was claimed that bone marrow cells transdifferentiate, a much desired property as bone marrow cells are abundant and therefore could be used in regenerative medicine to treat incurable chronic diseases. Using a Cre/loxP system, we studied cell fusion after bone marrow transplantation. Fused cells were chiefly Gr-1+, a myeloid cell marker, and found predominantly in the bone marrow. Among parenchymal tissues, fused cells were surprisingly most abundant in the kidney, Peyer’s patches, and cardiac tissue. In contrast, after cell fusion with embryonic stem cells, bone marrow cells were reprogrammed into new tetraploid pluripotent stem cells that successfully differentiated into beating cardiomyocytes. Together, these data suggest that cell fusion is ubiquitous after cellular transplants and that the subsequent sharing of genetic material between the fusion partners affects cellular survival and function. Fusion between tumor cells and bone marrow cells could have consequences for tumor malignancy.—Bonde, S., Pedram, M., Stultz, R., Zavazava, N. Cell fusion of bone marrow cells and somatic cell reprogramming by embryonic stem cells.




Trihydrides Appear to be a More Promising Path to Superconducting Metallic Hydrogen

High-pressure researchers, including Carnegie’s Ho-kwang (Dave) Mao, have now modeled three hydrogen-dense metal alloys and found there are pressure and temperature trends associated with the superconducting state—a huge boost in the understanding of how this abundant material could be harnessed. Computer modeling indicates that superconductivity for trihydrides (hydrogen compounds with three hydrogens) sets in at pressures between roughly 100,000 and 200,000 times atmospheric pressure at sea level (10 to 20 GPa), which is an order of magnitude lower (ten times less) than the pressures for related compounds that bind with four hydrogens instead of three. The hope has been that metallic superconducting hydrogen could be a room temperature superconductor.

Physicists have long wondered whether hydrogen, the most abundant element in the universe, could be transformed into a metal and possibly even a superconductor—the elusive state in which electrons can flow without resistance. They have speculated that under certain pressure and temperature conditions hydrogen could be squeezed into a metal and possibly even a superconductor, but proving it experimentally has been difficult.

Scientists have found that in addition to chemical manipulation to raise the transition temperature, superconductivity can also be induced by high pressure. Theoretical modeling is very helpful in defining the characteristics and pressures that can lead to high transition temperatures. In this study, the scientists modeled basic properties from first principles—the study of behavior at the atomic level—of three metal hydrides under specific temperature, pressure, and composition scenarios. Metal hydrides are compounds in which metals bind to an abundance of hydrogen in a lattice structure. The compounds were scandium trihydride (ScH3), yttrium trihydride (YH3) and lanthanum trihydride (LaH3).

“We found that superconductivity set in at pressures between roughly 100,000 to 200,000 times atmospheric pressure at sea level (10 to 20 GPa), which is an order of magnitude lower than the pressures for related compounds that bind with four hydrogens instead of three,” remarked Mao, of Carnegie’s Geophysical Laboratory. Lanthanum trihydride stabilized at about 100,000 atmospheres and a transition temperature of -423°F (20 Kelvin), while the other two stabilized at about 200,000 atmospheres and temperatures of -427°F (18 K) and -387°F (40 K) for ScH3 and YH3 respectively.
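The Fahrenheit figures above follow from the standard Kelvin-to-Fahrenheit conversion; a quick check of the three quoted transition temperatures:

```python
# Check the Kelvin-to-Fahrenheit conversions quoted for the three
# trihydride transition temperatures (20 K, 18 K, 40 K).

def kelvin_to_fahrenheit(k):
    """Standard conversion: F = K * 9/5 - 459.67."""
    return k * 9 / 5 - 459.67

for name, tc_k in [("LaH3", 20), ("ScH3", 18), ("YH3", 40)]:
    print(f"{name}: {tc_k} K = {kelvin_to_fahrenheit(tc_k):.2f} F")
```

The exact values are -423.67°F, -427.27°F and -387.67°F, which the article rounds to -423°F, -427°F and -387°F.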



PNAS- General trend for pressurized superconducting hydrogen-dense materials

The long-standing prediction that hydrogen can assume a metallic state under high pressure, combined with arguments put forward more recently that this state might even be superconducting up to high temperatures, continues to spur tremendous research activities toward the experimental realization of metallic hydrogen. These efforts have however so far been impeded by the enormous challenges associated with the exceedingly large required pressure. Hydrogen-dense materials, of the MH4 form (where M can be, e.g., Si, Ge, or Sn) or of the MH3 form (with M being, e.g., Al, Sc, Y, or La), allow for the rather exciting opportunity to carry out a proxy study of metallic hydrogen and associated high-temperature superconductivity at pressures within the reach of current techniques. At least one experimental report indicates that a superconducting state might have been observed already in SiH4, and several theoretical studies have predicted superconductivity in pressurized hydrogen-rich materials; however, no systematic dependence on the applied pressure has yet been identified so far. In the present work, we have used first-principles methods in an attempt to predict the superconducting critical temperature (Tc) as a function of pressure (P) for three metal-hydride systems of the MH3 form, namely ScH3, YH3, and LaH3. By comparing the obtained results, we are able to point out a general trend in the Tc-dependence on P. These gained insights presented here are likely to stimulate further theoretical studies of metallic phases of hydrogen-dense materials and should lead to new experimental investigations of their superconducting properties.

The researchers also found that two of the compounds, LaH3 and YH3, had more similar distributions of vibrational energy to each other than to ScH3 at the superconducting threshold and that the transition temperature was highest at the point when a structural transformation occurred in all three. This result suggests that the superconducting state comes from the interaction of electrons with vibrational energy through the lattice. At pressures higher than 350,000 atmospheres (35 GPa) superconductivity disappeared and all three compounds became normal metals. In yttrium trihydride, the superconductivity state reappeared at about 500,000 atmospheres, but not in the others. The scientists attributed that effect to its different mass.

“The fact that the models predicted distinctive trends in the behavior for these three related compounds at similar temperatures and pressures is very exciting for the field,” commented Mao. “Previous to this study, the focus has been on compounds with four hydrogens. The fact that superconductivity is induced at lower pressures in the trihydrides makes them potentially more promising materials with which to work. The temperature and pressures ranges are easily attainable in the lab and we hope to see a flurry of experiments to bear out these results.” The team at Carnegie has embarked on their own experiments on this class of trihydrides to test these models.

OTHER NEW SUPERCONDUCTING RESEARCH

1. PNAS - High-pressure crystal structures and superconductivity of Stannane (SnH4)

There is great interest in the exploration of hydrogen-rich compounds upon strong compression where they can become superconductors. Stannane (SnH4) has been proposed to be a potential high-temperature superconductor under pressure, but its high-pressure crystal structures, fundamental for the understanding of superconductivity, remain unsolved. Using an ab initio evolutionary algorithm for crystal structure prediction, we propose the existence of two unique high-pressure metallic phases having space groups Ama2 and P63/mmc, which both contain hexagonal layers of Sn atoms and semimolecular (perhydride) H2 units. Enthalpy calculations reveal that the Ama2 and P63/mmc structures are stable at 96–180 GPa and above 180 GPa, respectively, while below 96 GPa SnH4 is unstable with respect to elemental decomposition. The application of the Allen-Dynes modified McMillan equation reveals high superconducting temperatures of 15–22 K for the Ama2 phase at 120 GPa and 52–62 K for the P63/mmc phase at 200 GPa.

2. PNAS - Percolative theories of strongly disordered ceramic high-temperature superconductors

Optimally doped ceramic superconductors (cuprates, pnictides, etc.) exhibit transition temperatures Tc much larger than strongly coupled metallic superconductors like Pb (Tc = 7.2 K, Eg/kTc = 4.5) and exhibit many universal features that appear to contradict the Bardeen, Cooper, and Schrieffer theory of superconductivity based on attractive electron-phonon pairing interactions. These complex materials are strongly disordered and contain several competing nanophases that cannot be described effectively by parameterized Hamiltonian models, yet their phase diagrams also exhibit many universal features in both the normal and superconductive states. Here we review the rapidly growing body of experimental results that suggest that these anomalously universal features are the result of marginal stabilities of the ceramic electronic and lattice structures. These dual marginal stabilities favor both electronic percolation of a dopant network and rigidity percolation of the deformed lattice network. This “double percolation” model has previously explained many features of the normal-state transport properties of these materials and is the only theory that has successfully predicted strict lowest upper bounds for Tc in the cuprate and pnictide families. Here it is extended to include Coulomb correlations and percolative band narrowing, as well as an angular energy gap equation, which rationalizes angularly averaged gap/Tc ratios, and shows that these are similar to those of conventional strongly coupled superconductors.






January 29, 2010

World Nuclear Power for 2009 and 2010 and up to 2014

The following new reactors are scheduled:

2010 9 new reactors, 6.2 GWe (the two Canadian reactors shifted to 2011)
2011 11 new reactors, 9.3 GWe
2012 10 new reactors, 9.92 GWe
2013 12 new reactors, 13.08 GWe
2014 14 new reactors, 13.63 GWe

France, Germany and Sweden had significantly lower nuclear power generation in 2009. The United States and South Korea also slightly underperformed. There were varying degrees of operational issues in each country.

There was growth in generation in Japan, Russia and the UK. This was due to some earthquake damaged reactors being brought back online in Japan and overall capacity factor improvements in all three countries.

The Russian Rostov 2 (950 MWe) reactor listed in the 2010 table actually started commercial operation on Dec 19, 2009.

India's larger reactor startup schedule

Kudankulam 1 Sept 2010
Kudankulam 2 March 2011

Kaiga 4 (220 MWe) March 2010

Rawatbhata 5 (starting Jan 2010)
Rawatbhata 6 (starting Feb 2010)


Canada's Bruce A1 and A2 reactors will not start until the latter half of 2011

The Argentine reactor Atucha II is scheduled for an October 2010 start.



Actual and Projected Nuclear Power Generation

OECD data run through October, with OECD projections for November and December. Actual year-end numbers are used for the USA, Russia and Japan. Estimates for 2010 are based on 2008 and 2009 performance and on new reactors coming online in 2010.

Pending nuclear uprates in the US

US nuclear uprate applications (2010-2014)

Power Generation and Uranium Production Bets

Regarding the series of nuclear power generation and uranium production bets that I have with Dittmar: my expectation is that I lost the first of the nuclear generation bets and won the first of the uranium production bets. I expect to win the remaining years of both the nuclear generation and uranium production bets.

From the table above, with a country-by-country projection:
2009 2568 TWhe (not final number)
2010 2703 TWhe (scheduled completions, uprates, India fuel supply no longer a problem - 2009 agreements, Japan capacity similar to Dec 2009 when 26 TWhe was generated, France labor and operational issues resolved)
2011 2803 TWhe (Ukraine, Russia, Japan, France have room for operational improvements. They are investing in this effort. Only factored in as offsetting any random production problems)
2012 2885 TWhe
2013 2964 TWhe
2014 3120 TWhe
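The projection above implies modest year-over-year growth. A quick sketch of the implied growth rates, using the TWhe figures as listed:

```python
# Implied year-over-year growth in the nuclear generation projection
# above (figures in TWhe, taken directly from the list).

projection = {
    2009: 2568, 2010: 2703, 2011: 2803,
    2012: 2885, 2013: 2964, 2014: 3120,
}

years = sorted(projection)
for prev, cur in zip(years, years[1:]):
    growth = (projection[cur] / projection[prev] - 1) * 100
    print(f"{prev}->{cur}: {growth:+.1f}%")
```

Growth runs roughly 3-5% per year, front- and back-loaded by the larger reactor completion batches in 2010 and 2014.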

The Dittmar bets on power generation and uranium production











The data to be used in determining this bet are the World Nuclear Association figures for the year, compared to the midpoint of the range. The amount of production for each year is expected to be published the following year. If production is above the midpoint, Brian Wang (advancednano) is right and the winner for that year; below the midpoint, Dittmar is right and the winner. The figure is the TWh level of generation from commercial nuclear fission or nuclear fusion.



Dittmar's table of Uranium production and power generation

















Brian Wang Uranium Production Projection


I accepted a bet for 2009 at 47,383 tons as the over/under (the actual midpoint). More uranium production means I win; less means Dittmar wins.

Dittmar offered a bet on 2010 uranium production not being higher than 47,000 tons for world production. I accepted the bet at 54,000 tons on Physorg (comment section). The standard midpoint would have been 50,500 tons for 2010.




Your Old Purchases of Windows and Microsoft Office Will Buy $10 Billion in Vaccines and Save 8 Million Lives

Bill and Melinda Gates said on Friday they would spend $10 billion over the next decade to develop and deliver vaccines.

Over the past 10 years, the Microsoft co-founder's charity has committed $4.5 billion to vaccines and has been instrumental in establishing the GAVI alliance, a public-private partnership that channels money for vaccines in poor countries.

By increasing immunization coverage in developing countries to 90 percent, it should be possible to prevent the deaths of 7.6 million children under five between 2010 and 2019, Gates told reporters at the World Economic Forum.

More cash is now needed to make the most of new vaccines becoming available, including ones against severe diarrhea and pneumococcal disease from GlaxoSmithKline, Merck and Pfizer.

"We can take immunization to the next level, with the expanded uptake of new vaccines against major killers such as pneumonia and rotavirus diarrhea," WHO Director-General Margaret Chan said in a statement.

She said an extra two million deaths in children under five could be prevented by 2015 by widespread use of new vaccines and a 10 percent increase in global immunization coverage.

Further off, Glaxo is also in the final phase of testing a vaccine against malaria that Gates said could slash deaths from the mosquito-borne disease.

This project could reduce the death rate worldwide by 1.5%. There are about 57 million deaths each year, so about 570 million deaths would be expected from 2010-2019 (actually a bit more: as the population ages, the death rate will increase unless medicine continues to improve).
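A quick sanity check on the scale of the claim, using the figures above (8 million lives saved over the decade against ~57 million deaths per year):

```python
# Share of expected decade deaths averted, from the article's figures:
# ~57 million deaths/year worldwide and ~8 million lives saved 2010-2019.

annual_deaths = 57e6
decade_deaths = annual_deaths * 10   # ~570 million, as stated above
lives_saved = 8e6                    # the program's estimate
share = lives_saved / decade_deaths * 100
print(f"~{share:.1f}% of expected deaths over the decade")
```

This works out to roughly 1.4%, in line with the ~1.5% figure quoted above.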



Diarrhea
More than 80 percent of child deaths due to diarrhea occur in Africa and South Asia, and just 15 countries account for almost three quarters of all deaths from diarrhea among children under five each year. India has the highest number of annual deaths at 386,600. Some 1.5 million children die each year from diarrhea -- more than from AIDS, malaria, and measles combined. Diarrhea causes one in five child deaths across the world, but getting important vaccines to Africa and Asia could help save many lives.

So the Bill Gates Vaccine program estimate of 8 million lives saved does not include widespread effective diarrhea vaccines which could save 2 million lives per year. It would take time to deploy the diarrhea vaccines.

Pneumococcal vaccine information from the World Health Organization

Immunity following pneumococcal disease is directed primarily against the capsular serotype involved. The currently licensed pneumococcal vaccine is based on the 23 most common serotypes, against which the vaccine has an overall protective efficacy of about 60%–70%.

Acute respiratory infections kill an estimated 2.6 million children under five years of age annually. The pneumococcus causes over 1 million of these deaths, most of which occur in developing countries, where the pneumococcus is probably the most important pathogen of early infancy.

90% coverage with existing Pneumococcal vaccine could save about 500,000 lives per year. Better pneumococcal vaccines (say with 95% effectiveness against all types of pneumococcus) could save 950,000 lives each year.
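The estimate above can be sketched with a simple multiplicative model (deaths averted ≈ annual deaths × coverage × efficacy); the model itself is an assumption for illustration, not from the WHO text:

```python
# Rough estimate behind the lives-saved figures: deaths averted is
# approximated as annual deaths x vaccine coverage x vaccine efficacy.
# The multiplicative model is an assumption for illustration.

annual_deaths = 1_000_000      # pneumococcal deaths in under-fives (WHO)
coverage = 0.90                # target immunization coverage
for efficacy in (0.60, 0.70):  # WHO-quoted efficacy range
    saved = annual_deaths * coverage * efficacy
    print(f"efficacy {efficacy:.0%}: ~{saved:,.0f} lives/year")
```

With 60-70% efficacy this brackets 540,000-630,000 lives per year, of the same order as the ~500,000 figure above; a 95% efficacy vaccine would push it toward the ~950,000 figure.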

Infectious Disease

Infectious diseases kill more than 14 million people per year—around a quarter of deaths worldwide. For many of these diseases, cost-effective drugs and vaccines do not currently exist.

A UCLA summary of diseases: effectively vaccinating against the top 5 diseases could save up to 10 million lives each year, or 100 million lives over a decade.

ACUTE RESPIRATORY INFECTIONS (2005)
4.0 million deaths from acute respiratory infections, children under 5 years. 55% occur in first year of life

Leading agents
Streptococcus pneumoniae
Influenza A and B
Haemophilus influenzae
Parainfluenza
Respiratory Syncytial Virus
Measles
Adenoviruses

Diseases
Pneumonia
Acute bronchitis
Bronchiolitis
Acute obstructive laryngitis

HIV/AIDS (2007)
2.0 million HIV/AIDS-related deaths
30 - 36 million living with infection
2.7 million new infections
Largest cause of death from a single pathogen
Sub-Saharan Africa accounts for 75% of AIDS deaths
Surveillance figures are far from complete
Millions to billions of viral genotypes
Antiviral therapies expensive / resistance increasing
Vaccine trials in progress / promise?

DIARRHEAL DISEASES (2000)
2.5 million deaths from diarrheal diseases, children under 5 years.

Leading pathogens
Rotavirus (children under 2 years of age in developing countries)
Vibrio cholerae

Other pathogens
Escherichia coli O157:H7

TUBERCULOSIS (2007)
1.8 million deaths from Mycobacterium tuberculosis in the world.
9.3 million new cases of tuberculosis.
1.7 billion people are or have been infected with Mycobacterium tuberculosis.

MALARIA (2006)
1.0 - 3.0 million deaths worldwide per year
Vast majority of malaria deaths occur in Africa.
40% of the world's population is exposed to malaria

HEPATITIS B (2006)
0.6 - 1.0 million carriers die each year
2 billion people alive today have been infected with hepatitis B virus




Obama Will Request Tripling Loan Guarantees for Nuclear Reactors

Bloomberg reports - President Barack Obama, acting on a pledge to support nuclear power, will propose tripling loan guarantees for new reactors to more than $54 billion, two people familiar with the plan said. This is expected to be in the 2011 budget. If passed, and the loans are made promptly, this should ensure that ten reactors are built in the USA by 2020. Those first ten new reactor builds will remove most of the uncertainty about the actual build cost for new reactors in the USA.

Idaho Samizdat has coverage






January 28, 2010

Young Blood Can Reverse Circulatory System Age in Older Mice

According to new research from Harvard University, an unspecified factor in the blood of young mice can reverse signs of aging in the circulatory system of older ones. It's not yet clear how these changes affect the animals' overall health or longevity. But the research provides hope that some aspects of aging, such as the age-related decline in the ability to fight infection, might be avoidable.

Identifying those factors could lead to new strategies to boost resistance to infection, and perhaps a decrease in some cancers, the researchers said. The findings, published today in the journal Nature, which follow similar results with muscle stem cells, also suggest that the regenerative capacity of stem cells is highly influenced by their environment, which could have both positive and negative implications for regenerative medicine.


One theory for aging is that our stem cells eventually wear out, thanks to intrinsic changes within the cells. While previous research supports this idea, findings from Wagers and others show that the age-related decline in stem cells is also influenced by external forces. For example, exposing skeletal muscle to blood-borne factors from young mice can restore the regenerative capacity of muscle stem cells.

The regenerative power of young blood appears to be mediated by osteoblasts--bone-forming stem cells previously shown to play a role in regulating blood-forming stem cells. Researchers found that osteoblasts from old animals can make blood-forming stem cells from young mice act old. And conversely, surgically exposing old mice to young blood rejuvenates aged osteoblasts, restoring their capacity to properly regulate blood-forming stem cells.

Nature - Systemic signals regulate ageing and rejuvenation of blood stem cell niches

Ageing in multicellular organisms typically involves a progressive decline in cell replacement and repair processes, resulting in several physiological deficiencies, including inefficient muscle repair, reduced bone mass, and dysregulation of blood formation (haematopoiesis). Although defects in tissue-resident stem cells clearly contribute to these phenotypes, it is unclear to what extent they reflect stem cell intrinsic alterations or age-related changes in the stem cell supportive microenvironment, or niche. Here, using complementary in vivo and in vitro heterochronic models, we show that age-associated changes in stem cell supportive niche cells deregulate normal haematopoiesis by causing haematopoietic stem cell dysfunction. Furthermore, we find that age-dependent defects in niche cells are systemically regulated and can be reversed by exposure to a young circulation or by neutralization of the conserved longevity regulator, insulin-like growth factor-1, in the marrow microenvironment. Together, these results show a new and critical role for local and systemic factors in signalling age-related haematopoietic decline, and highlight a new model in which blood-borne factors in aged animals act through local niche cells to induce age-dependent disruption of stem cell function.

9 page pdf with supplemental information



Inertial Electrostatic Confinement for Cheap Energy and Space Propulsion

George Miley has been researching nuclear fusion propulsion and has designs for inertial electrostatic space ships.

This article previously discussed the funding of an organization that is investigating nuclear fusion for space propulsion. This information is being directly verified and correct information will be posted as soon as possible.

The information in this article on George Miley's earlier work is correct.




- Inertial Electrostatic Confinement (IEC) fusion power using either D-He3 or P-B11 fuels can provide a high-power density fusion propulsion system for deep space missions, but a large multi-GW thruster is required, which is a long-term development program.

- As a first step, we propose a progression of near-term IEC thrusters, starting with a 1-10 kWe electrically-driven IEC jet thruster for satellites, followed by a small 50-100 kW IEC fusion thruster module for next generation large deep space spacecraft.

- The initial electrically-powered unit will be a novel multi-jet plasma thruster based on a spherical IEC technology using electrical input power from a solar panel. It would offer major advances in system power density and eliminate use of increasingly scarce fuels like Xe.


IEC is ideally suited for burning advanced fuels (D-He3 and p-B11) due to beam-like energy and low electron temperature. Energetic fusion products escape the well and can undergo direct conversion to electricity.

Star Mode IEC is simple and lightweight. Beams focus through large openings, which minimizes interception of grid wires and gives good focus despite deviations from spheroid shapes.

- The preliminary design of a small 100-kWe p-B11 space power unit is available along with the possible extension to a thruster.

- Difficulties to be addressed include scaling up to the higher powers needed for more aggressive future missions. The objective would be to use a fusion-powered IEC for next-generation power units in the 100-kWe range to replace current HCTs.

- This multi-jet plasma thruster offers other advantages for the next step. The technology underlying the electrically driven IEC jet unit underpins the development of a next-generation fusion-driven unit for larger deep space spacecraft.

• The modular approach takes advantage of the inherently small size of IEC units
• In a Magnetically Channeled Spherical Array (MCSA), linked units have improved confinement and guide fusion products out
• A null-field region created within each pair of Helmholtz coils confines the plasma (confinement by the peripheral magnetic field, with no grid structure)
• The cusp field configuration provides fluid stability
• Radial leakage is recirculated back into the confinement region
• Axial leakage is retrapped in neighboring cells



George Miley Background

Professor Miley is internationally recognized for his innovative work on advanced-fuel ICF target physics and for contributions to innovative plasma devices, including the first direct electron-beam-pumped laser, the first visible nuclear-pumped laser, a flowing plasma focus, and the STAR mode IEC as a fusion neutron source for neutron activation analysis (NAA). He and his students performed one of the first series of target compression experiments at the University of Rochester Laser Laboratory. Professor Miley has also made important research contributions to the field of nuclear engineering, ranging from fission reactor kinetics to direct radiation energy conversion and fusion technology. As a result of his seminal book, “Fusion Energy Conversion” (1976), he is known as the "father" of advanced fuel fusion. His pioneering work on nuclear-pumped lasers opened up that field in the 1980s and gained international recognition.

Professor Miley is one of the most prolific researchers in the University of Illinois College of Engineering. His published works include six books, over 230 journal articles, and another 550 articles in conference proceedings. As Director of NPRE's Fusion Studies Laboratory, Professor Miley's interests have ranged from fusion science and technology to direct radiation energy conversion. He is considered a pioneer in nuclear-pumped laser research and is widely recognized for innovative research in fusion. Professor Miley holds 19 patents.

Professor Miley is the author of over 190 refereed technical papers and is the editor or co-editor of a dozen books and proceedings. He is a Guggenheim Fellow, a Fellow of four professional societies (ANS, IEEE, AIAA and APS), and holds the prestigious Preparata and Edward Teller Medals.

• A pioneering book on Direct Conversion of Nuclear Radiation Energy, a work that initiated the field of nuclear batteries.

• The first electron beam diode pumped laser (1969).

• The first visible Nuclear Pumped Laser (1976).

• A Seminal book, Fusion Energy Conversion (1976), that initiated serious research on advanced fuel fusion.


Philo Farnsworth, inventor of electronic TV, first proposed what we call Inertial Electrostatic Confinement (IEC) fusion in the 1960s. Early theory and experiments were supported by DOE, but then abandoned as increased effort went into magnetic confinement systems like Tokamak.

However, interest in IECs was revived by R. W. Bussard's concept for a magnetic-assisted IEC in the 1990s, followed by Miley's development of a small gridded device operating in the “Star” mode. The Star mode IEC was used commercially by an automobile company as a D-D neutron source for neutron activation analysis in industrial quality control.

Recent work by Miley and several other researchers has focused on fusion power applications, with aneutronic fusion burning p-B11 as the ultimate goal. The IEC power plant potentially offers several very important advantages: a simple mechanical structure, a very high power-to-weight ratio, velocity-space confinement scaling that allows modest-size units, and a non-Maxwellian distribution enabling aneutronic fusion.
Successful scale-up of present experimental devices to energy breakeven faces crucial physics issues, including stability of the electrostatic well structure, prevention of space-charge build-up effects, and improved ion confinement time.

• The first comprehensive theory for solid-state gamma battery (1980).

• Development of the concept and detailed physics for a spark ignited inertial confinement fusion target using burn propagation into deuterium (1990).

• Discovery of Star Mode operation for inertial electrostatic confinement devices, opening the way to small lab scale neutron sources and industrial applications (1994).

• Theory and experiments in low energy nuclear reactions created in multi-layer thin-film electrodes (1997).

• Theory and experiments on a unique phonon-driven solid state x-ray laser (2002).

• Concept of an inertial electrostatic confinement neutron source driven sub-critical fission reactor for use in student laboratories (2003).

• Co-inventor of a regenerative fuel cell that employs hydrogen peroxide and offers major advantages for space power applications (2004).

Video of George Miley talking about a fusion torch to break down unwanted material






Optomec Can Now Print Touch Screen Displays


Optomec has a new Aerosol Jet Display Lab system

Optomec is an additive manufacturing leader for high performance applications.

* The Aerosol Jet Solar Print Engine in their back-end manufacturing line is capable of processing 2,400 wafers per hour (solar cells with 20% efficiency)

* Researchers have developed the ability to fully print thin-film transistors (TFTs) with operating frequencies exceeding 5 GHz, built from single-walled carbon nanotube (SWNT) ink from Brewer Science and printed using Optomec’s Aerosol Jet system

* Optomec systems can help make 3D interconnects

The system is being used today by these companies to develop applications such as bridge/jumper circuits for bus lines on ITO/glass, edge circuits for handheld displays, and fully printed thin film transistors. The benefits of the patented Aerosol Jet Direct Write technology are its multi-material, fine-line (<10 µm) printing capability, which eliminates many process steps and costs associated with current photolithographic and vacuum-based display manufacturing processes. Also, the Aerosol Jet Print Engine can be integrated into automation platforms to meet high-volume display production requirements. Multi-nozzle dispensing heads can be configured to meet specific end-user throughput needs. The additive process employed by Aerosol Jet technology reduces environmental impact by minimizing the waste and chemicals that are prevalent in traditional electronics manufacturing processes.



The Aerosol Jet Display Lab system is an ideal platform for developing next generation display products. With an expanded work envelope, the system enables printing on a wide variety of flexible and rigid substrates up to GEN 2 size. The system is equipped with patented Aerosol Jet technology enabling high resolution deposition of a wide variety of materials including conductive nano-particle inks, insulators, dielectrics, polymers, adhesives and other advanced materials used to fabricate display products. The Aerosol Jet process is a breakthrough deposition technology enabling finer feature sizes than traditional ink-jet or screen print processes. The Aerosol Jet process utilizes an innovative Direct-Write, aerodynamic focusing technology that produces high resolution features as small as 10 microns without the need for masks or secondary processing steps. And when it’s time to move into volume production, Aerosol Jet technology is available in standard and custom multi-nozzle dispensing configurations to meet your throughput requirements.




IBM Researchers make Better Graphene FETs with Commercializable Bandgap Features



Graphene computer components hold the possibility of increasing computer speeds into the 1-100 terahertz range. This is a significant advance toward making that and other graphene electronic and computing applications possible.






Nano Letters - Graphene Field-Effect Transistors with High On/Off Current Ratio and Large Transport Band Gap at Room Temperature

Graphene is considered to be a promising candidate for future nanoelectronics due to its exceptional electronic properties. Unfortunately, the graphene field-effect transistors (FETs) cannot be turned off effectively due to the absence of a band gap, leading to an on/off current ratio typically around 5 in top-gated graphene FETs. On the other hand, theoretical investigations and optical measurements suggest that a band gap up to a few hundred millielectronvolts can be created by the perpendicular E-field in bilayer graphenes. Although previous carrier transport measurements in bilayer graphene transistors did indicate a gate-induced insulating state at temperatures below 1 K, the electrical (or transport) band gap was estimated to be a few millielectronvolts, and the room temperature on/off current ratio in bilayer graphene FETs remains similar to those in single-layer graphene FETs. Here, for the first time, we report an on/off current ratio of around 100 and 2000 at room temperature and 20 K, respectively, in our dual-gate bilayer graphene FETs. We also measured an electrical band gap of >130 and 80 meV at average electric displacements of 2.2 and 1.3 V nm−1, respectively. This demonstration reveals the great potential of bilayer graphene in applications such as digital electronics, pseudospintronics, terahertz technology, and infrared nanophotonics.

The University of California at Berkeley announced in mid-2009 that its researchers had created a tunable bandgap in graphene



A room-temperature on/off current ratio of 100 is by no means the upper limit for graphene FETs.

In summary, we demonstrated a bilayer graphene transistor with an on/off current ratio of around 100 at room temperature. The transport measurement indicates a Schottky barrier height of >65 meV at an average displacement field of 2.2 V nm-1, corresponding to an electrical (transport) bandgap of >130 meV. At 20 K, a device on/off current ratio of about 2000 is demonstrated at an average displacement field of 1.3 V nm-1. The demonstration of a large electrical bandgap in bilayer graphene may enable a number of novel nanoelectronic and nanophotonic applications.

The fabrication steps of the dual-gate bilayer graphene field-effect transistor (FET) are as follows:

1. Identification of bilayer graphene flakes using an optical approach and Raman spectroscopy. The bilayer graphene flakes in this experiment were purchased from Graphene Industries, Inc.

2. First e-beam lithography and source/drain metallization (Ti/Pd/Au/Ti: 0.5/20/20/5 nm).

3. Second e-beam lithography and patterning of the bilayer graphene channel.

4. Spin coating of the organic seed layer made from a derivative of polyhydroxystyrene (the polymer NFC 1400-3CP manufactured by JSR Micro, Inc.) for atomic layer deposition (ALD). The layer thickness can be adjusted by spin speed. The dielectric constant of this material is about 2.5.

5. Atomic layer deposition of the top gate oxide (HfO2) at T < 200 °C.

6. Third e-beam lithography and top gate metallization (Ti/Au: 5/25 nm).

Poly(methyl methacrylate) (PMMA) was used as the e-beam resist in all the processing steps mentioned above. Removal of PMMA was realized using acetone, usually followed by an isopropanol rinse. No specific surface cleaning steps were involved in the processing.


FURTHER READING
A list of recent Mesoscale and Nanoscale Physics papers in arxiv







Catastrophe Model of Haiti Earthquake Casualties


The Haitian earthquake caused an estimated 250,000 fatalities—and disease, starvation and lack of medical care could push the death toll higher, a catastrophe modeling firm said.

Risk Management Solutions, of Newark, Calif., said the quake, which destroyed more than 4,000 buildings in Port-au-Prince alone, had limited impact on the insurance industry, but raised questions about earthquake risk across the Caribbean, the potential for an earthquake on nearby faults, and what lessons can be drawn from an event such as this.

12 page pdf - RMS FAQ: 2010 Haiti Earthquake and Caribbean Earthquake Risk

RMS estimates approximately 250,000 fatalities as a result of the 2010 Haiti Earthquake. This is a best estimate based on the limited data available within 36 hours of the occurrence of the event and the immediate impacts of the earthquake—primarily building collapse. This preliminary estimate of 250,000 casualties could potentially increase over the coming weeks due to compounding factors, such as the spread of infectious diseases, lack of food and water, and limited access to medical care. On January 20, 2010, a representative of the aid group Partners in Health estimated that up to 20,000 people were dying each day due to the lack of medical care (specifically medical operations).

• Over 90 percent of the walls of Haiti’s buildings are constructed using either concrete/blocks, earthen materials, woven wood mats, or bricks and rocks. These heavy materials used to construct the walls, often with no reinforcement, caused numerous building collapses, resulting in extensive property damage and loss of life.

• While the primary damage from an earthquake is due to ground shaking, secondary hazards are phenomena that can cause additional loss to people and property at risk. The most relevant secondary hazards are liquefaction and landslide—both of which played a role in increasing Haiti’s damage and loss.



• According to a stress analysis, the latest quake has loaded pressure onto adjacent fault lines to the west of Port-au-Prince along the Enriquillo fault, which will be enough to trigger earthquakes on the adjacent segments, “particularly if those fault segments were close to failure prior to the January 12 earthquake,” RMS said.

• Stress calculations indicating a clustering of aftershocks at the western end of the rupture are reasonable since no significant earthquake has occurred along the adjacent segments in the last 150 years, the firm said.

• Of particular concern following the Haiti Earthquake is the damage to informal housing in shanty towns on the outskirts of Port-au-Prince. The firm noted that as rural poor migrate to the major cities, they often take up residence in shanty towns built with substandard construction that cannot stand up to the natural hazards that are present across so many capital cities—from hurricanes to landslides and earthquakes.

RMS said it is currently carrying out a new type of collaborative model development effort designed to quantify the economic and humanitarian impacts of future earthquakes on capital cities in developing countries, with South America as an initial test case.

According to the Institut Haïtien de Statistique et d'Informatique (IHSI, or Haitian Institute of Statistics and Informatics) (IHSI, 2010), over 70% of the country’s building stock is low rise (i.e., one story in height). L’ajoupas or cottages (translated “country homes”) represent over 15% of the country’s construction, with much higher concentrations in rural regions (e.g., 92.5% in rural and 7.5% in urban regions). Multi-story buildings represent less than 10% of Haiti’s property at risk and are concentrated in urban regions.

Over 90% of the walls of buildings are constructed using one of four material(s): concrete/blocks, earthen materials, clisse (translated “woven wood mats”), or bricks/rocks; with all materials, there is often no reinforcement (e.g., steel rebar). In rural regions, earthen materials are most common; in urban regions, concrete/blocks are utilized for close to 80% of the built walls. Similar patterns are seen in flooring materials, with hard-packed earth in rural regions and concrete in urban regions. Close to 70% of roofs are constructed using light metal (i.e., tin).




China Yuan and US Dollar News

Wall Street Journal coverage of the G7 meeting and currency discussions

Alan Ruskin, who heads foreign exchange strategy at RBS Securities, notes there is more evidence of a strengthening U.S. dollar than of a weakening in the currency. "The dollar's rallied quite sharply. There's a sense that as the U.S. economy is recovering it's hard to conceive of a dollar collapse," he said.

Ruskin said he sees China's recent moves to scale back bank lending as a sign the country could move toward using exchange policies as another tool to tighten monetary policy and fight inflation.

Vice-Minister for Commerce Zhong Shan said in a statement on the ministry's website -

International pressure for the yuan to rise is growing; there are strong expectations for yuan appreciation. Zhong said expectations for a stronger yuan were one of the factors that could weigh on China's exports in 2010, in addition to uncertainties about global economic recovery and trade disputes.

These semi-official comments from a Chinese minister could indicate that China will move earlier than this summer to begin re-appreciating the yuan versus the US dollar. They could also signal an increased likelihood of a one-off appreciation of something in the range of 10%.



WSJ discusses a one-off appreciation versus gradual appreciation

The 23% surge in China's foreign reserves last year to $2.4 trillion is a much clearer signal that its currency is undervalued.

China rightly worries that a new program of gradual appreciation would only encourage more hot money inflows.

So the alternative is to do it all in one bang. To be sure, a one-off sharp revaluation would strike a major blow at China's export base and induce a significant slowdown in the economy. But since China's competitors would enjoy an exchange rate windfall, contagion would be limited.

Either way, doing nothing and so indefinitely postponing a more violent, involuntary bursting of the bubble is the worst option.




For and Against Apple iPad and Anticipating the Google Response

Gizmodo has complaints about the Apple iPad.

* the large bezel
* no built-in cameras (there is the camera connection system)
* on-screen keyboard (there is a docking station and keyboard accessory)

* as previously noted at this site - no flash, no HDMI out, no multi-tasking

NPD Group complaint via Forbes
* no changes yet to the purchase model of TV shows or movies through the iTunes store

Not a Total Kindle (e-reader) Killer

IEEE Spectrum indicates that the iPad is not a Kindle Killer and feels that the iPad display will prevent it from dominating the e-reader category.

However, the iPad is probably a good enough e-reader. Being able to read for a couple of hours on the iPad, while also having all of its other uses available, will probably limit the market potential of the dedicated e-reader. As the price of color e-ink or OLED comes down, the dedicated e-reader will be left with a smaller and smaller niche.

Apple’s choice to go with LCD technology isn’t particularly surprising; the iPad will be used to display photos and videos, and to do that needs a full-color, full-motion display. So e-ink and its monochrome brethren are out. OLED technology, right now, is just too expensive. And Pixel Qi is a compromise; it gives up a bit in color saturation to pick up that visibility in sunlight. Steve Jobs isn’t one to compromise.

But the choice of LCD technology means that, in spite of the library of e-books that will be available for the iPad, this device is no e-book reader.

Over at the Fuji Xerox Palo Alto Laboratory, a group of scientists looking at how best to read and navigate electronic documents on portable devices is also encouraged by the iPad. While the current reading applications don’t go beyond the state of the art, says researcher Scott Carter, “the form factor coupled with the screen capabilities should facilitate new media-rich reader applications as well as interactive collection browsing apps” that will make all our lives easier.

It’s not a printed-book killer or a Kindle killer. But, to be fair, it doesn’t have to be to succeed; it’s a sweet computer, certainly more appealing than a netbook.


Apple iPad Kool-aid from the Designer and Executives who Developed it

Ten minute video-


Google and the iPad

Even though there are complaints about the iPad, it does open up the tablet category to far more competition and innovation.

Eric Schmidt has talked openly of creating a powerful and cheap netbook computer by late 2010. Judging from his words, the Google netbook (or, given the way fashions are trending, perhaps now a tablet) will be priced far below Apple's range of $499 to $829.

Apple, long a seller of hardware, is thinking in terms of something cheaper than a laptop, or better than a netbook. Google sees its device as a means of accessing its main businesses of Internet search and, increasingly, Internet-based office applications like word processing. The Google machine might even be subsidized like a cellphone, thrown in at a deep discount for a subscriber to Google Apps.




January 27, 2010

Carnival of Space 138

The Carnival of Space 138 is up at Nancy Atkinson's blog. Nancy is the Senior Editor for Universe Today, producer for Astronomy Cast, and project manager for 365 Days of Astronomy podcast. Also, she is a NASA/JPL Solar System Ambassador.

I supplied my article which reviewed the Quicklaunch, sea based hydrogen launch gun system. I note that Quicklaunch provides evidence that my nuclear cannon version of Project Orion has projectile configurations that would work.

Centauri Dreams talks about nuclear fusion powered spacecraft and the new Icarus interstellar spacecraft design project



Spacewriter ramblings discusses alien worlds in science fiction movies

Universe Today had an article on the NASA Puffin electric aircraft

We are all in the Gutter blog looks at weird objects that have been spotted by the Kepler Space Telescope.

Check out the Carnival of Space 138 at Nancy Atkinson's blog for more.




What Would it Take for a Yottawatt Civilization

There is a new website, Yottawatts from Thorium, which prompted this post on how, in general, to get yottawatts from fission, fusion or solar power.

Here are the units prefix definitions from wikipedia.

10^3 W kW kilowatt 
10^6 W MW megawatt 

10^9 W GW gigawatt  (current large hydro plants have 1 to 20 gigawatts; nuclear and coal plants can have 1-2 gigawatts)

10^12 W TW terawatt (The total power used by humans worldwide (about 16 TW in 2006) is commonly measured in this unit.)

10^15 W PW petawatt (the total energy flow of sunlight striking Earth's atmosphere is estimated at 174 PW)

10^18 W EW exawatt 

10^21 W ZW zettawatt 

10^24 W YW yottawatt (the Sun outputs approximately 386 Yottawatts)

A yottawatt civilization would be roughly equivalent to a Kardashev 1.75 civilization. A Kardashev Level One civilization is able to access and use all of the energy of the Sun that strikes the Earth; a Kardashev Level Two civilization is able to use all of the energy output of the Sun.
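As a quick sanity check, here is a sketch using Carl Sagan's commonly cited interpolation formula for intermediate Kardashev levels, K = (log10 P − 6)/10, together with the 386 yottawatt solar output from the prefix list above:

```python
import math

def kardashev_level(power_watts):
    """Sagan's interpolation formula: K = (log10(P) - 6) / 10."""
    return (math.log10(power_watts) - 6) / 10

# 1 yottawatt = 1e24 W lands at K = 1.8, in line with the rough
# "Kardashev 1.75" figure above
yottawatt_level = kardashev_level(1e24)

# fraction of the Sun's ~386 YW output needed to supply 1 YW
solar_fraction = 1e24 / 386e24

print(yottawatt_level, solar_fraction)
```

The ~0.26% solar fraction matches the "capturing 0.3% of the Sun's output" figure used later in this post.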



Power Sources: Super Solar, Fusion and Deep Burn Fission

Previously this site has discussed deep burn nuclear fission and extracting uranium from the ocean. Deep burn would mean using nuclear fuel about 100 times more efficiently than we do now, burning the uranium, plutonium or thorium completely. The only waste left would be material with half-lives of less than 30 years.

With deep burn and maximum mining of uranium and thorium, we could get 1 million times more nuclear fission power than we do now: 100 times more efficiency and 10,000 times more material, about 600 million tons per year.

So we would be scaling up from about 400 gigawatts to 400 petawatts. This rate of usage would go through the uranium in the ocean in about 6 years and would go through the uranium and thorium in the crust at a very good clip.

There is an estimated 40 trillion tons of Uranium and 120 trillion tons of thorium in the Earth's crust.

So it would take about 267,000 years to go through all of the uranium and thorium at the 400 petawatt rate.

There is an estimated 600 trillion tons of uranium in the solar system outside the Sun, about 15 times the amount in the Earth's crust.

So going up to 1 zettawatt would use up the fissionable nuclear material in about 1,000 years; going up to 1 yottawatt would use it up in about one year. If the thorium-to-uranium ratio holds for the solar system as it does for the crust, there would be about 4 years of yottawatt power using all of the thorium and uranium outside the Sun.
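The crustal resource arithmetic above can be reproduced in a few lines. This is a sketch using only the figures quoted in this post (400 GW today, 100x efficiency, 10,000x material, 600 million tons per year, 40 + 120 trillion tons in the crust):

```python
# Figures quoted in the post
current_fission_w = 400e9    # ~400 GW of nuclear fission power today
efficiency_gain   = 100      # deep burn: ~100x more efficient fuel use
material_gain     = 10_000   # maximum mining: ~10,000x more material
burn_rate_tpy     = 600e6    # tons of uranium/thorium burned per year
crust_uranium_t   = 40e12    # tons of uranium in Earth's crust
crust_thorium_t   = 120e12   # tons of thorium in Earth's crust

# 400 GW x 1,000,000 = 400 petawatts
deep_burn_power_w = current_fission_w * efficiency_gain * material_gain

# ~267,000 years to burn through the crustal uranium and thorium
years_crust = (crust_uranium_t + crust_thorium_t) / burn_rate_tpy

print(f"{deep_burn_power_w:g} W")
print(f"{years_crust:,.0f} years")
```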

Nuclear Fusion

Nuclear fusion appears to be on the verge of a breakthrough

About 1 in 6,500 hydrogen atoms in seawater is deuterium. Deuterium abundance on Jupiter is about 2.25×10^−5 (roughly 22 atoms per million, or 15% of the terrestrial deuterium-to-hydrogen ratio).

There is enough deuterium in ocean water to provide energy for 3 X 10^11 years at the current rate of energy consumption. The availability of lithium on land is sufficient for at least 1,000, if not 30,000, years, and the cost per kWh would be even smaller than that of deuterium. If the oceans are included, it is estimated that there is enough fuel for 3 X 10^7 years.

16 Terawatts X 3 X 10^11 years OR
16 Petawatts X 3 X 10^8 years OR
16 Exawatts X 3 X 10^5 years OR
16 Zettawatts X 300 years OR 1 Zettawatt for 4800 years OR
1 Yottawatt for 4.8 years
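All of those equivalences follow from a single energy budget, 16 TW sustained for 3 X 10^11 years, divided by the chosen power level. A minimal check:

```python
# Ocean-deuterium fusion energy budget from the text: 16 TW for 3e11 years
budget_watt_years = 16e12 * 3e11

def years_at(power_watts):
    """Years the fixed energy budget lasts at a constant power draw."""
    return budget_watt_years / power_watts

for label, p in [("16 PW", 16e15), ("16 EW", 16e18), ("16 ZW", 16e21),
                 ("1 ZW", 1e21), ("1 YW", 1e24)]:
    print(f"{label}: {years_at(p):g} years")
```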

This is only using the ocean. Using Jupiter for nuclear fusion fuel would enable billions of years of yottawatt energy.

UPDATE -
A reader made the detailed calculations below; my earlier rough calculation relied on some assumptions that were not correct.

Jupiter's upper atmosphere is composed of about 88–92% hydrogen. The interior contains denser materials such that the distribution is roughly 71% hydrogen. Mass 1.8986×10^27 kg, 317.8 Earths or 1/1047 of the Sun.

1.9 X 10^27 kg X 22 X 10^-6 = 4.18 X 10^22 kg = 4.18 X 10^19 tons; multiplying by the 0.9 hydrogen fraction gives about 3.8 X 10^19 tons of deuterium.
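The reader's arithmetic, as a sketch; the 0.9 hydrogen mass fraction and the treatment of the 22 ppm number ratio as a mass fraction are the calculation's simplifying assumptions:

```python
jupiter_mass_kg = 1.9e27   # mass of Jupiter
d_abundance     = 22e-6    # ~22 ppm deuterium (treated as a mass fraction)
h_fraction      = 0.9      # ~90% hydrogen by mass (upper-atmosphere value)

# kg of deuterium, converted to metric tons
deuterium_tons = jupiter_mass_kg * d_abundance * h_fraction / 1000
print(f"{deuterium_tons:.2g} tons of deuterium")
```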

Deuterium ratios across the solar system: Uranus and Neptune have a higher deuterium ratio than Jupiter and Saturn.

Gaseous planets (deuterium, parts per million; final figure is the ratio relative to Jupiter):
21 ± 8 H2 Jupiter (spectroscopic) = 1
26 ± 7 H2 Jupiter (MS in situ) = 1
15–35 H2 Saturn (spectroscopic) = 1
65 (+2.5/–1.5) H2 Neptune (spectroscopic) = 2.6
55 (+35/–15) H2 Uranus (spectroscopic) = 2.2



Hydrosphere of Earth: 1.5 X 10^18 short tons
Hydrogen: 1.67 X 10^17 tons
Deuterium: 2.5 X 10^13 tons

Here is a posting on another site with an attempt to calculate sustainability of Kardashev civilizations

Europa's oceans hold 3 X 10^18 m^3 of water, slightly more than twice the volume of Earth's oceans.

Uranus (14.5 earth masses X 2.2 deuterium ratio), Neptune (17 earth masses X 2.6 deuterium ratio) and Saturn (95 earth masses) add up to about 171 earth-mass equivalents at Jupiter's deuterium ratio. So maybe 10 million years of deuterium fusion.
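A quick check of the 171 earth-mass figure; this sketch simply weights each planet's mass (in earth masses) by its deuterium ratio relative to Jupiter, per the list above:

```python
# (earth masses, D/H ratio relative to Jupiter)
giants = {"Uranus": (14.5, 2.2), "Neptune": (17, 2.6), "Saturn": (95, 1.0)}

jupiter_equivalents = sum(mass * ratio for mass, ratio in giants.values())
print(round(jupiter_equivalents))  # ~171 earth-mass equivalents
```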

The advanced civilization would have to figure out how to make proton-proton fusion (regular hydrogen) work if they wanted to make non-solar fusion last. Proton-proton fusion does occur in the Sun.

Proton-proton fusion is not one of the reactions considered a candidate for planetary-based nuclear fusion. However, if you are not planetary-based and have enough technology, then you probably can find a way.

Super Solar Energy

As noted by the definitions, capturing and using 0.3% of the energy from the sun enables a yottawatt civilization.

Leaked Apple Tablet Details and Official Unveiling of the iPad


UPDATE: Steve Jobs is on stage now (10 AM PST, 1 PM EST).
Author comment/review: Based on the feature demo, specs and $499 starting price, the iPad does look like a monster winner, and one that lives up to the pre-release hype. Do you get in line to try to get this one now? Or wait for next year when they have the iPad 4G?

PC World notes several disappointments in the iPad features

* No multitasking (also noted in comments here)
* No handwriting recognition
* insufficient interface innovation to fully leverage screen size
* lack of flash support
* missing camera and inadequate video support
* and a list of other items

The official Apple iPad site with more pictures and video

We want to kick off 2010 by introducing a truly magical product today.

* 250 million iPods sold since 2001
* over 140,000 applications in the App Store
* a user downloaded the 3 billionth app from the App Store
* Wall Street Journal quote- "The last time there was this much excitement about a tablet there were some commandments written on them"
* a device between laptops and smartphones
* a device for web. Email. Photos. Videos. Music. Games. eBooks
* netbooks are not better than anything. Just cheap laptops
* it is called iPad
* there are interface enhancements that go beyond just supersizing an iPhone.
* it is primarily a touch screen interface
* there are map modes where many pictures are presented for selection or applications etc... depending upon the task that is active
* iPad seems able to do everything that an iPod or iPhone can do while taking advantage of the larger screen for high definition and superior interface/interaction

The iPad also comes with a connectable keyboard, docking station and leather carrying case that converts into a stand.

Enterprise Mobile Today reports that the price of the iPad starts at $499

* 9.7-inch screen with pixel-doubling technology to play back high-definition video
* 16 to 64GB of flash storage.
* Networking support includes 802.11n, Bluetooth and 3G wireless; 3G will be available on some models, supported via a carrier partnership with AT&T.
* half-inch thick, weighing just 1.5 pounds
* Apple A4 processor running at 1GHz
* 10 hours of battery life and one month of standby power

CNET also had liveblogging of the iPad event

Gizmodo live coverage.

Information Week reports on leaked details from Apple partners about the Apple Tablet

McGraw-Hill CEO Terry McGraw in an early morning interview with business channel CNBC - And the Tablet is going to be based on the iPhone operating system and so it will be transferable. So what you are going to be able to do now, we have a consortium of e-books, and we have 95% of all our materials that are in e-book format on that one. So now with the tablet you're going to open up the higher education market, the professional market. The tablet is going to be just really terrific.

Apple has scheduled a press conference in San Francisco for Wednesday at the Yerba Buena Center for the Arts Theater. The conference is set to get underway at 1 pm Eastern time.



3 minute Youtube of Steve Jobs Demonstrating the iPad (the first 10-15 seconds are a goofy intro)



The CNBC interview leak - but who cares now that the real info is out


FURTHER READING

Earlier rumors about the Apple Tablet

More rumors and mockup pictures at Thisismoney UK




January 26, 2010

Metamaterial Antennas 25 Times Smaller in Each Dimension


This Z antenna tested at the National Institute of Standards and Technology is smaller than a standard antenna with comparable properties. Its high efficiency is derived from the "Z element" inside the square that acts as a metamaterial, greatly boosting the signal sent over the air. The square is 30 millimeters on a side.

NIST engineers are working with scientists from the University of Arizona (Tucson) and Boeing Research & Technology (Seattle, Wash.) to design antennas incorporating metamaterials—materials engineered with novel, often microscopic, structures to produce unusual properties.

The new antennas radiate as much as 95 percent of an input radio signal and yet defy normal design parameters. Standard antennas need to be at least half the size of the signal wavelength to operate efficiently; at 300 MHz, for instance, an antenna would need to be half a meter long. The experimental antennas are as small as one-fiftieth of a wavelength and could shrink further.
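To put those sizes in concrete terms, here is a sketch of the half-wavelength rule of thumb at 300 MHz, alongside the one-fiftieth-wavelength size of the experimental antennas:

```python
c = 299_792_458.0   # speed of light, m/s
f = 300e6           # 300 MHz

wavelength = c / f                   # ~1 meter at 300 MHz
conventional_size = wavelength / 2   # efficient conventional antenna: ~0.5 m
metamaterial_size = wavelength / 50  # 1/50-wavelength antenna: ~2 cm

print(conventional_size, metamaterial_size)
```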

In their latest prototype device,* the research team used a metal wire antenna printed on a small square of copper measuring less than 65 millimeters on a side. The antenna is wired to a signal source. Mounted on the back of the square is a “Z element” that acts as a metamaterial: a Z-shaped strip of copper with an inductor (a device that stores energy magnetically) in the center.




“The purpose of an antenna is to launch energy into free space,” explains NIST engineer Christopher Holloway. “But the problem with antennas that are very small compared to the wavelength is that most of the signal just gets reflected back to the source. The metamaterial makes the antenna behave as if it were much larger than it really is, because the antenna structure stores energy and re-radiates it.” Conventional antenna designs, Holloway says, achieve a similar effect by adding bulky “matching network” components to boost efficiency, but the metamaterial system can be made much smaller. Even more intriguing, Holloway says, “these metamaterials are much more ‘frequency agile.’ It’s possible we could tune them to work at any frequency we want, on the fly,” to a degree not possible with conventional designs.

University of Arizona Metamaterial Antenna page

RELATED RESEARCH

5 page pdf - Design and Experimental Verification of a 3D Magnetic EZ Antenna at 300 MHz


Several variations of a 300-MHz version of the electrically small, coax-fed, three-dimensional (3D) magnetic EZ antenna were designed and tested. The final version of this low-profile antenna was electrically small at its 300.96 MHz operating frequency. Nearly complete matching to the 50-Ω source and high overall efficiency (nearly 100%) were achieved. The measured fractional bandwidth was approximately 1.66%. The numerically predicted and the measured results were in good agreement. Comparisons to similar-sized loop antennas that were matched to the source with both custom-made and commercially available, general-purpose external matching networks confirm the performance enhancements achieved with this metamaterial-inspired, near-field resonant parasitic antenna.