May 12, 2012

Carbon nanotubes increase heat transfer of nanofluid coolants by 15%

NanoHex is the world’s largest collaborative project for the research and development of nanofluid coolants, comprising a consortium of 12 leading European companies and research centres. The €8.3 million project is funded by a Seventh Framework Programme grant.

Led by Siemens AG, the NanoHex project will apply nanofluids to the cooling system of Insulated Gate Bipolar Transistors (IGBTs), the power electronics modules used for the traction of high speed trains.

Power electronics control the flow of power, shaping the supplied voltage by means of semiconductor devices such as IGBTs. Power electronics can increase the energy efficiency of equipment and processes that use electrical power, so the development boundaries of large electric drives are continuously being pushed to produce more powerful, reliable, durable, smaller, lighter, and less costly products. Current liquid cooling systems are limited in how efficiently they can transfer heat without increasing in size or incorporating active refrigeration.

Data Centres account for 2% of global carbon emissions, a figure equal to that of air travel, with almost half of the energy consumed utilised for cooling.

An efficient cooling system is, however, essential for a data centre: thermal stress can directly impact performance, reducing throughput and reliability and increasing the chance of electrostatic discharge, which may damage the equipment.

The NanoHex nanofluid will be applied to the cooling system of computer servers, racks and/or cabinets. Using a custom built cold plate, the coolant could be circulated through the data centre cabinets, adjacent to the server blades, in order to directly draw heat away from the processing chips. As several companies already use water as a coolant, there is a market for direct replacement, as well as a new cooling system.

The efficient removal of heat from computer servers, racks and cabinets would ease the demand for air conditioning in the server room, decrease the size of chilling units and improve the performance of the system. All of these provide considerable cost savings.


International Journal of Heat and Mass Transfer - Enhanced thermal conductivity of ethylene glycol with single-walled carbon nanotube inclusions

Metformin Review at FightAging

Fight Aging - Metformin is a drug that shows up in discussion here every so often. It is thought to be a calorie restriction mimetic, recapitulating some of the metabolic changes caused by the practice of calorie restriction. Its effects on life span in laboratory animals are up for debate and further accumulation of evidence - the results are on balance more promising than the generally dismal situation for resveratrol, but far less evidently beneficial than rapamycin. Like rapamycin, metformin isn't something you'd want to take as though it were candy, even if the regulators stood back to make that possible, as the side effects are not pleasant and potentially serious.

Even if the completely beneficial mechanism of action is split out from the drug's actions - as seems to be underway for rapamycin - the end results will still only be a very modest slowing of aging. You could do better by exercising, or practicing calorie restriction.

Metformin, an oral anti-diabetic drug, is being considered increasingly for treatment and prevention of cancer, obesity as well as for the extension of healthy lifespan. Gradually accumulating discrepancies about its effect on cancer and obesity can be explained by the shortage of randomized clinical trials, differences between control groups (reference points), gender- and age-associated effects and pharmacogenetic factors. Studies of the potential antiaging effects of antidiabetic biguanides, such as metformin, are still experimental for obvious reasons and their results are currently ambiguous.

...

The wave of interest, with periodical decays and increasing surges, was associated with the attempts to use antidiabetic biguanides [such as metformin] to control body weight and tumor growth.

Open Access Impact of Aging - Metformin in obesity, cancer and aging: addressing controversies

China presses ahead with domestic high speed rail and exports of high speed rail equipment

1. China will continue with research and development into its new generation high-speed trains despite the industry's tarnished image due to a spate of operation faults last year, according to a plan for the country's rail traffic equipment manufacturing industry.

The new generation trains will run at speeds of more than 300 km an hour, according to the five-year plan for the industry for the 2011-2015 period, which was released by the Ministry of Industry and Information Technology on Monday.

On July 23 last year, a high-speed train slammed into a stalled train near the eastern city of Wenzhou, leaving 40 people dead and 172 injured. The incident was blamed on faulty signaling equipment.

Construction of high-speed trains and railways cooled sharply after the State Council, or China's cabinet, ordered slower operational speeds in the wake of the crash.

Trains with a maximum speed of 350 km per hour (kph) were ordered to run no faster than 300 kph, while those with a maximum speed of 250 kph had to run at no more than 200 kph.

Some analysts then predicted the accident would hamper the nation's exports of high-speed train technologies.

But contrary to these concerns, China has continued to export a wide range of equipment, including electric multiple units, urban rail vehicles, steam locomotives and large road maintenance equipment, to countries such as Russia, Australia, Brazil, India, Argentina, Turkey, Iran and Malaysia.

May 11, 2012

Lighter, stronger, better graphene reinforced polymers

A group of researchers at the College of William & Mary have made important advances in technology combining polymers—the material of the present—with graphene—the material of the future. Jaeton Glover, a post-doctoral chemist at William & Mary, explains that the group incorporates graphene oxide into polymers, a process that opens the door for a range of enhanced plastics that are super-strong as well as super-versatile.

“Polymers are something that’s all around us,” explained Schniepp. “Half of the stuff we have is polymers or plastics. The idea is that if we just add something to the polymer—like these small particles of graphene—we can add additional functionality.”

Structural strength is just one example of the functionality Schniepp mentioned. Graphene is a hundred times stronger than steel; in fact, it is the strongest material ever tested. Graphene oxide-reinforced polymers could open up a new range of strong yet light material possibilities.

Macromolecules - In Situ Reduction of Graphene Oxide in Polymers



US Manufacturing and China Innovation

1. An MIT conference (The Future of Manufacturing in the U.S.) explores the complex state of the US manufacturing industry, which is showing signs of revival.

The United States added about 50,000 manufacturing jobs this January alone, the largest monthly gain since 1998. Companies such as Ford Motor Co. have moved overseas plants back to the United States. And high energy costs (which make global shipping more expensive), along with rising foreign wages in some industries, have given companies reasons to consider relocating their factories to America.

In this regard, “the situation is different than it was in the 1990s,” when the flow of jobs seemed only to move away from the United States, said David Simchi-Levi, professor of civil and environmental engineering and engineering systems and co-director of MIT’s Leaders for Global Operations (LGO) program, which hosted the event. About 43 percent of firms in one survey, he noted, would consider moving their factories back to the United States.

To be sure, manufacturing has seen major job reductions in the United States: from 18 million jobs in 2001 to 12 million today. Even so, the sector still accounts for 70 percent of private-sector R&D spending in America and 90 percent of U.S. patents issued today.

In addition to government incentives, Jimenez said, U.S. manufacturers must “build leadership in manufacturing innovation” themselves. As he detailed, Novartis, in collaboration with MIT researchers, is working to develop a new system of “continuous manufacturing” that would dramatically reduce the time it takes to produce commercial drugs.

It remains to be seen precisely which areas of technological research will provide the biggest platforms for economic growth. Olivier de Weck, an associate professor of aeronautics and astronautics and engineering systems at MIT and executive director of the Institute’s ongoing study of Production in the Innovation Economy (PIE), listed a series of promising research topics, including lightweight materials, flexible electronics, pharmaceuticals, rapid prototyping — such as 3-D printing — and the use of recycled materials for manufacturing.

2. EETimes - Despite its emphasis on “indigenous innovation,” China’s real competitive edge remains in what China watchers call “second generation innovation” that combines existing technologies and products with growing manufacturing prowess.

Why Space? Everyone wants more. What abundance really means and looks like

Gregor MacDonald claims to be an oil analyst and wrote at Zero Hedge a couple of days ago. I had already corrected this "oil analyst": crude oil production did not reach a ceiling in 2005 and has since increased by 3-4 million barrels per day, while total oil liquids have increased by almost 6 million barrels per day.

Gregor asked the question - why is Diamandis thinking about mineral mining in space, when resources here on Earth -- in his view -- are so abundant?

Summary

Oil in place in the continental US totals about 3 trillion to 5 trillion barrels, not including the 4.5 trillion barrels of oil shale.

There is enough, and the technology will be here to get it and use it affordably. However, the technology to clean up pollution has historically lagged, and that would be a big problem.

Everyone in the developing world wants to at least catch up to the US and Europe. By 2050 that means about 10 billion people at $100,000 per person, so a $1,000 trillion global economy. We have demonstrations here where the 99% want to get to the 1% level of income.

For the US, the bottom of the current top 1% is about $506,500 per year in income.

Everyone getting to that level would be roughly a $5,000 trillion economy. The world is currently an $82 trillion economy.
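The arithmetic behind these figures can be checked directly. A minimal sketch, using only the population, income, and 1% threshold quoted above:

```python
# Back-of-envelope check of the global economy figures quoted above.
people_2050 = 10e9            # projected world population in 2050
catch_up_income = 100_000     # USD per person, "catch up to US/Europe" level
top1_threshold = 506_500      # USD per year, bottom of the current US top 1%

catch_up_economy = people_2050 * catch_up_income / 1e12   # in trillions of USD
top1_economy = people_2050 * top1_threshold / 1e12

print(f"Catch-up economy: ${catch_up_economy:,.0f} trillion")    # ~$1,000 trillion
print(f"Everyone-at-1% economy: ${top1_economy:,.0f} trillion")  # ~$5,065 trillion
```

Both stated totals follow directly from multiplying people by income per person.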

It is not just cornucopians who want more; clearly everyone wants more. The developing world wants more and the 99% want more. However, most people have not thought it through or done the calculations. There are knee-jerk doomers who simply assume it cannot be done and who consider even the current level too much. Many of the doomers hold some kind of implicit or explicit mass-death-and-poverty solution.

Technology and long term planning and execution of those plans can achieve true abundance for everyone.


There are plenty of Earth-based resources, but can we get them and use them here?

Oil in place in the continental US totals about 3 trillion to 5 trillion barrels, not including the 4.5 trillion barrels of oil shale.

1.53 trillion barrels Piceance Basin of Colorado (USGS, June 2011 oil shale)
1.44 trillion barrels Green River formation (USGS, June 2011 oil shale)
1.32 trillion barrels for the Uinta Basin of Utah and Colorado. (USGS, June 2011 oil shale)
260-500 billion barrels Monterey Formation (tight oil)
271-503 billion barrels Bakken Formation (tight oil)
etc...

Aggressive use of new fracking technology, combined with fire flooding and water flooding, could enable 20-30% recovery rates. Large amounts of the oil shale are likely recoverable with fire flooding. So 7.5 trillion to 9.5 trillion barrels of oil in place, at 20-30% recovery rates, is about 1.5 to 2.85 trillion barrels of oil. Oil shale like that in the Green River Formation cannot be recovered with horizontal drilling; it will require fire flooding or some other in situ method.
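The recoverable-oil range can be sketched by summing the stated in-place figures and applying the assumed recovery rates. A rough sketch; the 20-30% recovery rates are the speculative assumption above, not established values:

```python
# Range calculation for recoverable US oil, per the in-place figures above.
conventional_low, conventional_high = 3.0, 5.0   # trillion barrels in place
oil_shale = 4.5                                  # trillion barrels of oil shale
recovery_low, recovery_high = 0.20, 0.30         # assumed recovery rates

total_low = conventional_low + oil_shale         # 7.5 trillion barrels
total_high = conventional_high + oil_shale       # 9.5 trillion barrels
recoverable_low = total_low * recovery_low
recoverable_high = total_high * recovery_high

print(f"In place: {total_low}-{total_high} trillion barrels")
print(f"Recoverable at 20-30%: {recoverable_low:.2f}-{recoverable_high:.2f} trillion barrels")
```

Pairing the low recovery rate with the low in-place estimate (and high with high) gives the widest bracket, about 1.5 to 2.85 trillion barrels.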

Technology should make horizontal drilling three times cheaper.

Fire Flooding and other technology can make it very affordable to get oil from the oilsands.

Technology for affordably extracting 4 billion tons of uranium from seawater is close.

I did not have room in this article, but I have other articles that detail increasing agricultural production, handling water issues, near-term energy, and whatever other "problems" doomers raise.

There is enough, and the technology will be here to get it and use it affordably. However, the technology to clean up pollution has historically lagged, and that would be a big problem.

Yet Clearly China and India and everyone else wants their share

Everyone in the developing world wants to at least catch up to the US and Europe. By 2050 that means about 10 billion people at $100,000 per person, so a $1,000 trillion global economy. We have demonstrations here where the 99% want to get to the 1% level of income.



Teleporting independent qubits 97 kilometers opens the way to satellite based quantum communication

Technology Review - The ability to teleport photons through 100 kilometres of free space opens the way for satellite-based quantum communications, say researchers.

Juan Yin at the University of Science and Technology of China in Shanghai, and a bunch of mates say they have teleported entangled photons over a distance of 97 kilometres across a lake in China.

That's an impressive feat for several reasons. The trick these guys have perfected is using a 1.3-watt laser and some fancy optics to beam the light and receive it.

Inevitably photons get lost and entanglement is destroyed in such a process. Imperfections in the optics and air turbulence account for some of these losses but the biggest problem is beam widening (they did the experiment at an altitude of about 4000 metres). Since the beam spreads out as it travels, many of the photons simply miss the target altogether.

So the most important advance these guys have made is to develop a steering mechanism using a guide laser that keeps the beam precisely on target. As a result, they were able to teleport more than 1100 photons in 4 hours over a distance of 97 kilometres.

That's interesting because it's the same channel attenuation that you'd have to cope with when beaming photons to a satellite with, say, 20 centimetre optics orbiting at about 500 kilometres. "The successful quantum teleportation over such channel losses in combination with our high-frequency and high-accuracy [aiming] technique show the feasibility of satellite-based ultra-long-distance quantum teleportation," say Juan and co.
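The achieved rate can be worked out from the quoted figures. A quick sketch, using only the photon count and duration from the text, which illustrates how severely channel loss limits such links:

```python
# Average teleportation rate from the quoted experiment numbers:
# more than 1100 teleported photons in 4 hours over 97 km.
events = 1100            # teleported photons (per the text)
duration_s = 4 * 3600    # 4 hours in seconds

rate_hz = events / duration_s
print(f"Average rate: {rate_hz:.3f} events/s")  # ~0.076 events/s
```

Well under one event per second, which is why the precise beam steering that keeps photons on target matters so much.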


Bird's-eye view and schematic diagram for free-space quantum teleportation. a, Entanglement generation and distribution on Charlie's side. A near infrared pulse (788 nm) is focused on an LBO crystal to create an ultraviolet laser pulse, which is then focused with two cylindrical lenses (CL) and passed through a 2 mm nonlinear BBO crystal. By an SPDC process, an entangled photon pair is created. An interferometric Bell-state synthesizer is utilized to disentangle the temporal from the polarization information. While photon 2 is then directly sent to Alice for BSM, photon 3 is guided to a refractor telescope through a fiber and sent to Bob. A HWP sandwiched between two QWPs constitutes the fiber polarization compensation. Coaxial with the telescope, there is a green laser (532 nm) for system tracking and a red laser (1064 nm) for synchronization. The green arrows indicate the fine tracking system, which consists of a four-quadrant detector and a fast steering mirror driven by piezo ceramics (PI). The blue arrows indicate the coarse tracking system, which consists of a wide-angle camera and a two-dimensional rotatable platform. b, Initial state preparation and BSM on Alice's side.

Arxiv - Teleporting independent qubits through a 97 km free-space channel

Japan uses Nuclear Accelerator to Mutate Rice for Salt Tolerance

Economist Magazine - Those who turn their noses up at “genetically modified” food seldom seem to consider that all crops are genetically modified. The difference between a wild plant and one that serves some human end is a lot of selective breeding—the picking and combining over the years of mutations that result in bigger seeds, tastier fruit or whatever else is required.

Nor, these days, are those mutations there by accident. They are, rather, deliberately induced, usually by exposing seeds to radiation. And that is exactly what Tomoko Abe and her colleagues at the Riken Nishina Centre for Accelerator-Based Science in Saitama, outside Tokyo, are doing with rice. The difference is that Dr Abe is not using namby-pamby X-rays and gamma rays to mutate her crop, as is the way in most other countries. Instead she is sticking them in a particle accelerator and bombarding them with heavy ions—large atoms that have been stripped down to their nuclei by the removal of their electrons. This produces between ten and 100 times as many mutations as the traditional method, and thus increases the chances of blundering across some useful ones.

About a third of the world’s rice paddies have salt problems, and yields in such briny fields may be half what they would be if the water in them were fresh.

To induce the mutations, Dr Abe bombarded germinating seeds with carbon ions for 30 seconds. She then planted them in fields in Miyagi. Of 600 seeds that have undergone this treatment, 250 thrived and themselves produced healthy seeds.

Having fully salt tolerant rice would increase the world yield of rice by about 16%.
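The yield figures above are roughly consistent with each other. A consistency-check sketch, assuming (per the article) exactly a third of paddies at half their fresh-water yield:

```python
# If a third of paddies yield half of their fresh-water potential,
# current world output is 2/3 + (1/3)(1/2) = 5/6 of potential.
current_output = 2/3 * 1.0 + 1/3 * 0.5           # fraction of potential yield
shortfall = 1.0 - current_output                 # fraction lost to salinity
gain_over_current = shortfall / current_output   # increase if fully recovered

print(f"Shortfall: {shortfall:.1%} of potential yield")      # ~16.7%
print(f"Gain over current output: {gain_over_current:.1%}")  # ~20%
```

The ~16% figure matches the salinity shortfall as a share of potential yield; expressed relative to current output, full recovery would be about a 20% increase.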

Penn Astrophysicists Zero In on Gravity Theory

By innovatively analyzing a well-studied class of stars in nearby galaxies, Jain and his colleagues — Vinu Vikram, Anna Cabre and Joseph Clampitt at Penn and Jeremy Sakstein at the University of Cambridge — have produced new findings that narrow down the possibilities of what this force could be. Their findings, published on the Arxiv, are a vindication of Einstein’s theory of gravity. Having survived a century of tests in the solar system, it has passed this new test in galaxies beyond our own as well.

Astrophysicists have been pursuing tests of gravity in the cosmos for many years, but conventional tests require data on millions of galaxies. Future observations are expected to provide such enormous datasets in the coming years. But Jain and his colleagues were able to bypass the conventional approach.

“We’ve been able to perform a powerful test using just 25 nearby galaxies that is more than a hundred times more stringent than standard cosmological tests,” Jain said.

The nearby galaxies are important because they contain stars called cepheids that are bright enough to be seen individually. Moreover, cepheids have been used for decades as a kind of interstellar yardstick because their brightness oscillates in a precise and predictable way.

“Now that we understand a little bit more about what makes the cepheids pulsate — a balance of gravity and pressure — we can use them to learn about gravity, not just distance,” Jain said. “If the fifth force enhances gravity even a little bit, it will make them pulsate faster.”

Because of their usefulness, there was already more than a decade of data on cepheids based on the Hubble Space Telescope and other large telescopes in Chile and Hawaii. Using that data, Jain and his colleagues compared nearly a thousand stars in 25 galaxies. This allowed them to make comparisons between galaxies that are theoretically “screened” or protected from the effects of the hypothetical fifth force and those that are not.
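The "interstellar yardstick" idea can be illustrated with a toy distance calculation. A sketch only: the period-luminosity coefficients below are representative published values for V-band cepheids, not numbers from this study, and the example star is hypothetical:

```python
import math

# Toy cepheid distance estimate via a period-luminosity relation.
# M_V = -2.43 * (log10(P_days) - 1) - 4.05 is a commonly quoted V-band
# relation; the coefficients are illustrative, not from the paper above.
def cepheid_distance_pc(period_days: float, apparent_mag: float) -> float:
    abs_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus: m - M = 5 * log10(d / 10 pc)
    return 10.0 ** ((apparent_mag - abs_mag + 5.0) / 5.0)

# Hypothetical example: a 10-day cepheid observed at apparent magnitude 21
d = cepheid_distance_pc(10.0, 21.0)
print(f"Distance: {d / 1e6:.1f} Mpc")  # ~1.0 Mpc
```

Because the pulsation period fixes the intrinsic brightness, the observed brightness then fixes the distance; any fifth force that altered the pulsation would bias such distances, which is what the comparison between screened and unscreened galaxies probes.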

Jain and his colleagues ultimately did not see variation between their control sample of screened galaxies and their test sample of unscreened ones. Their results line up exactly with the prediction of Einstein’s general relativity. This means that the potential range and strength of the fifth force is severely constrained.

“We find consistency with Einstein’s theory of gravity and we sharply narrow the space available to these other theories. Many of these theories are now ruled out by the data,” Jain said.

With better data on nearby galaxies in the coming years, Jain expects that an entire class of gravity theories could essentially be eliminated. But there remains the exciting possibility that better data may reveal small deviations from Einstein’s gravity, one of the most famous scientific theories of all time.

Arxiv - Astrophysical Tests of Modified Gravity: Constraints from Distance Indicators in the Nearby Universe

Could an analogue computer simulate the human brain?

Below is an interview with Dr. Hava Siegelmann conducted by Sander Olson. Dr. Siegelmann is the head of the Biologically Inspired Neurological and Dynamical Systems (BINDS) lab at the University of Massachusetts. More information on her research can be found at the BINDS lab.


Question: You oversee the Biologically Inspired Neural and Dynamical Systems Lab (BINDS) lab at the University of Massachusetts. What is the goal of that lab?

Answer:

I head the BINDS lab here in Amherst; we have two primary goals: The first is to obtain a computational understanding of natural systems, memory and learning in health and disease. The second is to create computational paradigms similar to the brain’s, to produce increasingly functional simulations of intelligence.

Spacex and Bigelow Aerospace join forces

Space Exploration Technologies (SpaceX) and Bigelow Aerospace (BA) have agreed to conduct a joint marketing effort focused on international customers. The two companies will offer rides on SpaceX’s Dragon spacecraft, using the Falcon launch vehicle to carry passengers to Bigelow habitats orbiting the earth. According to Bigelow Aerospace’s President and Founder, Robert T. Bigelow, “We’re very excited to be working with our colleagues at SpaceX to present the unique services that our two companies can offer to international clientele. We’re eager to join them overseas to discuss the substantial benefits that BA 330 leasing can offer in combination with SpaceX transportation capabilities”.

Nextbigfuture noted that the BA 330 could be launched on the SpaceX Falcon Heavy. The Falcon Heavy could almost launch a 2,100 cubic meter Bigelow inflatable space station.



The BA 330 is a habitat that will provide roughly 330 cubic meters of usable volume and can support a crew of up to six. Bigelow Aerospace plans to connect two or more BA 330s in orbit to provide national space agencies, companies, and universities with unparalleled access to the microgravity environment.



Bigelow 2100 cubic meter inflatable space station model

Spacex Falcon Heavy


IEA reports global oil supply rose 600,000 bpd in April 2012 to a record 91 million bpd

Reuters - The International Energy Agency (IEA) said global oil supply rose 600,000 barrels per day (bpd) to 91 million bpd in April and was now 3.9 million bpd above year-ago levels, with 90 percent of the increase coming from OPEC.


Saudi Arabia has said it pumped 10.1 million bpd last month, its highest for more than 30 years, in a bid to meet growing demand and curb oil prices, which hit a three-and-a-half-year high in March.

May 10, 2012

Zero Hedge Clueless about Planetary Resources, Space and World Oil

There are various groups who in the past claimed that world crude oil production peaked in 2005 at about 72.75 million barrels per day.

Zero Hedge made that claim yesterday, in an article written under the pseudonym Tyler Durden by Gregor Macdonald, who proudly proclaimed his cluelessness about Planetary Resources and how they will succeed.

Gregor MacDonald claims to be an oil analyst and energy sector investor, who also focuses on the coming transition to alternatives.

But first just correcting facts.

Crude oil production has been trapped below a ceiling since 2005.

The Energy Information Administration indicated that crude oil and lease condensate production for January 2012 reached a new peak of 75,581,112 barrels per day. Total oil liquids is very near a peak at 88,813,089 barrels per day; only December 2011 was slightly higher. The International Energy Agency's oil market monthly has world oil production (all liquids) at over 90 million barrels per day, and the IEA reports that April 2012 world oil (all liquids) was at a record 91 million barrels per day.

Oil is a critical part of the world economy, and Zero Hedge is supposed to be a site where you can get deeper insight into how the "real" world economy works. This is a big and fundamental mistake about world energy and the direction it is going. World oil supply is going up. It is not a rapid rise, but it is going up.

Planetary Resources is going for Space because they have the advantage in Space

Zero Hedge did not understand why Planetary Resources would go for Space minerals if there are plenty of minerals here on Earth.

Planetary Resources is making a space telescope that is about 100 times cheaper. They do not have that kind of cost advantage here on Earth. Space Adventures, the space tourism company of two of the founders, has sold $500 million in trips to space. Why didn't they compete with Southwest Airlines? Because they had the unique advantage in space tourism.

Planetary Resources plans to put mass-produced 20 kilogram space telescopes into low earth orbit starting at the end of 2013. Some have criticized Planetary Resources as something that will lose a lot of money. I contend that it will be highly profitable even before they mine anything.

It will use laser communication to transmit information back. The lens is about 9 inches (22 centimeters) in diameter.

It uses star cameras for orientation.

It uses reaction wheels to point itself, enhancing that basic stability to sub-arcsecond pointing.

They can point to the earth and get 2 meter resolution of the ground.

Planetary Resources can choose to make themselves very profitable before any material is mined. Satellite imaging, space telescopes and space data sales are markets that will work.

Planetary Resources will be able to use passive and active arrays of telescopes to increase resolution and capture more of the Earth observation imaging market.

They will be able to provide Google Earth with higher resolution images and frequent updates to generate more ad revenue for Google. Google founders are backing Planetary Resources.

Planetary Resources would be able to provide a lot more fairly good resolution and frequently updated satellite imagery. This could be advertising supported based on the traffic.

Planetary Resources will also be able to provide updated and fairly good resolution Google Moon, Google Space and eventually Google Asteroid.

Nanocomp Technologies will be supplying carbon nanotube yarn to replace copper in airplanes in 2014

Nanocomp Technologies' (NTI) lightweight wiring, shielding, heating and composite structures enhance or replace heavier, more fatigue-prone metal and composite elements to save hundreds of millions in fuel, while increasing structural, electrical and thermal performance. In Inc. Magazine, Nanocomp Technologies indicates that they will be selling their carbon nanotube yarn (CTex) to airplane manufacturers in 2014 to replace copper wiring.

Nanocomp’s EMSHIELD sheet material was incorporated into the Juno spacecraft, launched on August 5, 2011, to provide protection against electrostatic discharge (ESD) as the spacecraft makes its way through space to Jupiter and is only one example of many anticipated program insertions for Nanocomp Technologies’ CNT materials.

In a recent Presidential Determination, Nanocomp’s CNT sheet and yarn material has been uniquely named to satisfy this critical gap, and the Company entered into a long-term lease on a 100,000 square foot, high-volume manufacturing facility in Merrimack, N.H., to meet projected production demand.

The U.S. Dept. of Defense recognizes that CNT materials are vital to several of its next generation platforms and components, including lightweight body and vehicle armor with superior strength, improved structural components for satellites and aircraft, enhanced shielding on a broad array of military systems from electromagnetic interference (EMI) and directed energy, and lightweight cable and wiring. The Company’s CTex™ CNT yarns and tapes, for example, can reduce the weight of aircraft wire and cable harnesses by as much as 50 percent, resulting in considerable operational cost savings, as well as provide other valuable attributes such as flame resistance and improved reliability.

Nanocomp Technologies, Inc., a developer of performance materials and component products from carbon nanotubes (CNTs), in 2011 announced they had been selected by the United States Government, under the Defense Production Act Title III program (“DPA Title III”), to supply CNT yarn and sheet material for the program needs of the Department of Defense, as well as to create a path toward commercialization for civilian industrial use. Nanocomp’s CNT yarn and sheet materials are currently featured within the advanced design programs of several critical DoD and NASA applications.
Pure carbon wires carry data and electricity, yarns provide strength and stability

NTI converts its CNT flow into pure carbon, lightweight wires and yarns with properties that rival copper in data conductivity, with reduced weight, increased strength and no corrosion. NTI's wire and yarn products are presently being used both for data conduction and for structural wraps. For contrast: NTI's CNT yarns were tested against copper for fatigue; where copper broke after nearly 14,000 bends, NTI's CNT yarns lasted almost 2.5 million cycles, roughly 180 times as many, demonstrating far greater fatigue resistance.


NTI's CNT yarns can be used in an array of applications including: copper wire replacement for aerospace, aviation and automotive; structural yarns, reinforcing matrix for structural composites; antennas; and motor windings.



Free-Floating Planets in the Milky Way Outnumber Stars by Factors of Thousands: Life-Bearing Planets May Exist in Vast Numbers

Researchers say a few hundred trillion free-floating, life-bearing, Earth-sized planets may exist in the space between stars in the Milky Way. So argues an international team of scientists led by Professor Chandra Wickramasinghe, Director of the Buckingham Centre for Astrobiology at the University of Buckingham, UK.

The scientists have proposed that these life-bearing planets originated in the early Universe within a few million years of the Big Bang, and that they make up most of the so-called "missing mass" of galaxies. The scientists calculate that such a planetary body would cross the inner solar system every 25 million years on average, and during each transit zodiacal dust, including a component of the solar system's living cells, becomes implanted at its surface. The free-floating planets would then have the added property of mixing the products of local biological evolution on a galaxy-wide scale.

Astrophysics and Space Science - Life-bearing primordial planets in the solar vicinity

The space density of life-bearing primordial planets in the solar vicinity may amount to ∼8.1×10^4 per cubic parsec, giving a total of ∼10^14 throughout the entire galactic disk. Initially dominated by H2, these planets are stripped of their hydrogen mantles when the ambient radiation temperature exceeds 3 K as they fall from the galactic halo to the mid-plane of the galaxy. The zodiacal cloud in our solar system encounters a primordial planet once every 26 My (on our estimate), thus intercepting an average mass of 10^3 tonnes of interplanetary dust on each occasion. If the dust included microbial material that originated on Earth and was scattered via impacts or cometary sublimation into the zodiacal cloud, this process offers a way by which evolved genes from Earth life could become dispersed through the galaxy.

Commercialization of multi-kilowatt femtosecond lasers

Fraunhofer-Gesellschaft - Ultra-short laser pulses of outstandingly high average power are opening the doors to new applications in high-throughput materials processing. Thanks to the short pulse duration, thermal damage of the material being processed is minimized.

For years, ultra-short laser pulses have been used for the extremely precise and gentle processing of highly sensitive materials. Until now, though, they have often lacked power. The newly developed laser platform solves this problem with the INNOSLAB amplifier at its core. Four mirrors surround a laser crystal plate, the slab. Pump radiation enters at the two opposite faces of the slab. Ultra-short laser pulses are repeatedly reflected by these mirrors and pass through the slab several times. Energy is transferred from the pump radiation to the laser pulse until the required power is achieved.

To develop new markets for laser systems with ultra-short pulses, the team of developers had to increase the mean laser output of ultra-short pulse beam sources to several hundred watts. Higher power makes high-volume production in industry possible, and shortens measuring times in scientific experiments.

Edgewave - EdgeWave started the development of industrial-grade femtosecond lasers with over 1,000 watts of average power on March 1, 2012. The project is being supported by the BMBF as part of its funding initiative „Ultrakurzlaser für hochpräzise Bearbeitung“ (ultra-short lasers for high-precision machining) and as part of the collaborative project „Femtosekundenlaser höchster Leistung (FOKUS)“ (highest-power femtosecond lasers). The collaborative project is coordinated by EdgeWave, while the controlling agency is the VDI-TZ from Düsseldorf.

The focus of the project is the implementation of a reliable, compact, cost-effective and industrial-suited fs-laser with an average power of over 1,000 watts and a pulse duration in the range of 200 fs to 1 ps. For the processing of materials – e.g. fiber-reinforced plastics for lightweight designs – beam sources in the targeted power class will enable a significant decrease in processing time.

Lately, ultra-short-pulsed laser technology has become more and more economical, not only in micro machining but also in macro machining of materials, while in the medical sector ultra-short-pulsed lasers enable entirely new therapeutic approaches, such as highly precise cuts of the eye with minimal damage. The fundamental feature of these laser beams is their extremely high peak intensity at low pulse energy, enabling highly precise ablation and the processing of temperature-sensitive materials without thermal damage. In the production of LEDs or computer chips, the yield per wafer will increase; and in one of the most commonly performed surgical operations, the therapy of cataracts, significantly more efficient and cheaper methods will become implementable. New therapeutic methods for age-related longsightedness will rival classical reading glasses in the near future.
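The combination of high average power and short pulse duration is what yields the extreme peak intensities. A back-of-the-envelope sketch (the repetition rate is an assumed value; the article does not specify one):

```python
# Illustrative numbers: 1 kW average power (the FOKUS target)
# at an assumed 1 MHz repetition rate, mid-range 500 fs pulses.
avg_power_w = 1000.0
rep_rate_hz = 1e6            # assumption, not from the article
pulse_duration_s = 500e-15   # within the quoted 200 fs - 1 ps range

pulse_energy_j = avg_power_w / rep_rate_hz         # energy per pulse
peak_power_w = pulse_energy_j / pulse_duration_s   # peak power per pulse

print(f"pulse energy: {pulse_energy_j * 1e3:.1f} mJ")  # 1.0 mJ
print(f"peak power:  {peak_power_w / 1e9:.1f} GW")     # 2.0 GW
```

A kilowatt-average-power source can thus deliver gigawatt-scale peak powers, which is why thermal damage stays low while ablation stays efficient.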

Ultra short pulse lasers

EdgeWave’s ultra short pulse lasers are diode-pumped, mode-locked solid-state oscillators and amplifiers. The amplifiers are based on the unique INNOSLAB laser technology. Through an optimal combination of crystal shape, cooling and resonator design, ultra short pulse lasers with INNOSLAB amplifiers possess special qualities:

* Compact setup
* High efficiency and high amplification factor
* High beam quality
* Scalability for multi kW

Higher-power, high-repetition-rate ultra short pulse lasers will help enable commercial nuclear fusion and high-performance space propulsion.

In 2011, John Chapman of NASA proposed a pulsed laser system for megawatt class fusion propulsion.

In Chapman’s aneutronic fusion reactor scheme, a commercially available benchtop laser starts the reaction. A beam with an intensity on the order of 2 x 10^18 watts per square centimeter, pulse frequencies up to 75 megahertz, and wavelengths between 1 and 10 micrometers is aimed at a two-layer, 20-centimeter-diameter target.

The momentum of the energetic alpha particles provides clean, vectored thrust with a specific impulse (Isp) of roughly 900,000 seconds; p-11B thus offers a clean fuel with well-understood reaction kinematics.



Foxconn builds Shanghai headquarters as operations and research hub

Focus Taiwan - Foxconn Technology Group, the main manufacturer for Apple Inc., is determined to grasp opportunities in the China market by setting up a one-stop business service for customers. Foxconn, also known as Hon Hai Group in Taiwan, broke ground Thursday on its China headquarters in Shanghai, as part of the group's bid to set up an operating hub and cutting-edge research and development center.

The headquarters, whose construction is scheduled to be completed in 2015, are expected to act as an e-commerce center in the Yangtze River Delta region for the group.

"With the new facilities, Foxconn is trying to integrate its services such as supply chain manufacturing, receiving and filing orders from customers and market expansion," Simon Yang, vice president of Topology Research Institute, told CNA.

Foxconn chairman Terry Gou said he is confident that Foxconn's revenue will increase 10 percent in 2012 from last year.

According to Yang, the output value of China's domestic consumption market totaled 18.5 trillion Chinese yuan (US$2.9 trillion) last year, up 18 percent from 15.7 trillion Chinese yuan in 2010.

China domestic consumption, projected at roughly 15% annual growth
2011    US$2.9 trillion
2012    US$3.4 trillion
2013    US$3.9 trillion
2014    US$4.5 trillion
2015    US$5.2 trillion
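The table is roughly a compounding of the 2011 figure at the stated growth rate. A quick sketch (note the table's rounded entries run slightly above a strict 15% compounding, closer to ~16% in some years):

```python
base_2011 = 2.9  # US$ trillion, 2011 figure from the article
growth = 0.15    # stated annual growth rate

projections = {2011 + n: round(base_2011 * (1 + growth) ** n, 1) for n in range(5)}
print(projections)
# 15% compounding gives {2011: 2.9, 2012: 3.3, 2013: 3.8, 2014: 4.4, 2015: 5.1},
# just under the table's rounded values.
```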

US Navy wants ultraviolet stealth for jets

The US Navy wants a bidder to design, develop and demonstrate a UV obscuring material that may be dispersed from an aircraft. One concept might include a device that very rapidly generates an extended, dense cloud of material that absorbs in the UV region. Other concepts might include, but are not limited to, quantum dots or metamaterials that absorb in the UV and emit in the mid-IR. Final prototypes should be compatible with existing Navy aircraft expendable dispensing systems.

Wired Danger Room has coverage

The system should be compatible with the Navy’s existing counter-measures dispensers, which are currently tailored for releasing infrared flares and radar-foiling chaff to help warplanes dodge enemy missiles.

A UV cloak would complement the Navy’s other stealth initiatives. The F-35 Joint Strike Fighter, the product of history’s most expensive weapons program, is designed to scatter and absorb radar waves while also sinking its engine heat into its fuel load in order to make the plane less visible to infrared sensors. The Navy plans to purchase hundreds of carrier-compatible F-35s at more than $100 million a pop.

But the F-35's design apparently does not protect against ultraviolet sensors — that we know of. The Navy’s older Hornet fighters are probably equally vulnerable. The UV cloak seems to be a response to a particular type of “dual-band” missile seeker that zeroes in on infrared radiation at first, then switches to a UV sensor in the final moments before striking the target. The UV sensor works by looking for non-reflective shadows against the bright UV glare of the sky — like silhouettes against a lightboard.

An obscurant could blot out a plane’s UV silhouette in a shapeless mass of ultraviolet shadow. “One concept might include a device that very rapidly generates an extended, dense cloud of material that absorbs in the UV region,” the Navy solicitation reads. The solicitation also lists “quantum dots” (tiny radiation-emitting crystals) and man-made “metamaterials” as obscurant options.

The obscurant would probably work on helicopters, too.

Silicon Nanospheres Could Be Building Blocks Of Optical Invisibility Cloaks

Arxiv - Magnetic Light (24 pages)

Spherical silicon nanoparticles with sizes of a few hundred nanometers represent a unique optical system. According to theoretical predictions based on Mie theory, they can exhibit strong magnetic resonances in the visible spectral range. The basic mechanism of excitation of such modes inside the nanoparticles is very similar to that of split-ring resonators, but with one important difference: silicon nanoparticles have much smaller losses and are able to shift the magnetic resonance wavelength down to visible frequencies. We experimentally demonstrate for the first time that these nanoparticles have a strong magnetic dipole resonance, which can be continuously tuned throughout the whole visible spectrum by varying particle size, and visually observed by means of dark-field optical microscopy. These optical systems open up new perspectives for fabrication of low-loss optical metamaterials and nanophotonic devices.

We experimentally demonstrate for the first time that spherical silicon nanoparticles with sizes in the range from 100 nm to 200 nm have strong magnetic dipole response in the visible spectral range. The scattered “magnetic” light by these nanoparticles is so strong that it can be easily seen under a dark-field optical microscope. The wavelength of this magnetic resonance can be tuned throughout the whole visible spectral range from violet to red by just changing the nanoparticle size.

Technology Review - Invisibility cloaks that work for microwaves are easy to make using simple building blocks. Now engineers have created the equivalent building blocks for visible light.

Gamma-Ray Bending Opens New Door for Optics

Science Now - making a lens for highly energetic light known as gamma rays had been thought impossible. Now, physicists have created such a lens, and they believe it will open up a new field of gamma-ray optics for medical imaging, detecting illicit nuclear material, and getting rid of nuclear waste.



Bending the rules. Gamma ray lenses, which theory had suggested were impossible, could be made from heavy elements such as gold. (Credit: Institut Laue–Langevin)

For X-rays the real part of the refractive index, dominated by Rayleigh scattering, is negative and converges to zero at higher energies. For γ rays a positive component, related to scattering, increases with energy and becomes dominant. The deflection of a monochromatic γ beam due to refraction was measured by placing a Si wedge into a flat double-crystal spectrometer. Data were obtained in an energy range from 0.18 to 2 MeV. The data are compared to theory, taking into account elastic and inelastic scattering as well as recent results on the energy dependence of the pair creation cross section. A new field of γ-ray optics, with many new applications, probably opens up.
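The measured deflections are tiny because the refractive index of matter for gamma rays differs from 1 by only a minuscule amount. In the thin-prism approximation the deflection is roughly (n − 1)·tan(α). A sketch with purely illustrative values (neither the index decrement nor the wedge angle is taken from the paper):

```python
import math

# Thin-prism approximation: deflection ~ (n - 1) * tan(alpha)
delta_n = 1e-9          # assumed magnitude of n - 1 for gamma rays (illustrative)
wedge_angle_deg = 80.0  # assumed silicon wedge angle (illustrative)

deflection_rad = delta_n * math.tan(math.radians(wedge_angle_deg))
micro_arcsec = deflection_rad * (180 / math.pi) * 3600 * 1e6

print(f"deflection ~ {deflection_rad:.2e} rad ({micro_arcsec:.0f} micro-arcseconds)")
```

Nanoradian-scale bending is why a high-resolution double-crystal spectrometer was needed to see the effect at all.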

A megajoule-class gamma ray laser could help enable nuclear fusion.

Toward Regenerative Medicine Against Atherosclerotic Cardiovascular Disease

SENS Foundation-funded research shows that expression of a modified microbial enzyme protects human cells against 7-ketocholesterol toxicity, advancing research toward remediation of the foam cell and rejuvenation of the atherosclerotic artery.

Atherosclerotic cardiovascular disease is the principal cause of ischaemic heart disease, cerebrovascular disease, and peripheral vascular disease, making it the root of the leading cause of morbidity and mortality worldwide. Atherosclerosis begins with the entrapment and oxidation of low-density lipoprotein (LDL) cholesterol in the arterial endothelium. As a protective response, the endothelium recruits blood monocytes into the arterial wall, which differentiate and mature into active macrophages and engulf toxic oxidized cholesterol products (oxysterols) such as 7-ketocholesterol (7-KC). Although initially protective, this response ultimately leads to atherosclerotic plaque: oxidized cholesterol products accumulate in the macrophage lysosome, and impair the processing and trafficking of native cholesterol and other materials, leading macrophages to become dysfunctional and immobilized in the arterial intima. With ongoing entrapment of oxidized LDL in the tunica intima, more and more of these disabled "foam cells" progressively accumulate in the arterial wall, generating the fatty streaks that form the basis of the atherosclerotic lesion.

The results, although preliminary, are clearly promising for the potential of lysosomally-targeted DS1 cholesterol oxidase to remediate the oxysterol-intoxicated foam cell, and thus to potentially reverse the root cause of atherosclerosis. They also include some surprises that will need to be probed with further research. Why, for instance, was cell viability improved by DS1 ChOx/LAMP1 fusion protein even in cells that had not been treated with 7-KC (Fig. 1)? One possibility is that in addition to exogenous 7-KC, the enzyme degrades some basal level of oxysterol metabolites that are present even in untreated cells, or that it detoxifies other toxic lysosomal substrates. Another possibility, suggested in personal communication by J Mathieu, is that some metabolite of DS1 ChOx activity might actually provide a benefit to the cell. If such effects also emerge in vivo in any ultimate therapeutic use of suitably-modified DS1 ChOx, they might ultimately provide unexpected additional benefits to the functioning of the aging macrophage lysosome.

School of Dentistry Invents Dental Fillings That Kill Bacteria and Remineralize the Tooth

Scientists using nanotechology at the University of Maryland School of Dentistry have created the first cavity-filling composite that kills harmful bacteria and regenerates tooth structure lost to bacterial decay.

Rather than just limiting decay with conventional fillings, the new composite is a revolutionary dental weapon to control harmful bacteria, which co-exist in the natural colony of microorganisms in the mouth, says professor Huakun (Hockin) Xu, PhD, MS.

IEC Bussard Fusion Project gets two more years of funding

The Navy is funding EMC2 an additional $5.3 million over the next two years to work on the problem of pumping electrons into the Polywell. A big new pulsed power supply will support the electron guns (100+ A, 10 kV). WB-8 has been operating at 0.8 Tesla (a magnetic field 8 times stronger than any previous version).

There was a review done of the work and the recommendations were to continue and expand the effort.

(H/T To Talk Polywell)


Back in late 2010 and early 2011 they had indicated the WB-8 work would be done by now. Below are the old quotes; they show the two-year slippage to work on the electron injection problems.

* The WB-8 isn’t intended to be a net power machine. What it is intended to do is thoroughly test the scaling properties of the Polywell design to see if a net power reactor is practicable.

* EMC2 will only be testing D+D fusion in the WB-8. However, if the results are promising enough, they have an option to extend their contract to test p+B11 fusion. If they do, the new device (WB-8.1) will be due by Oct. 31, 2011, and the final report will be due a year later.

* If either the WB-8 or the WB-8.1 produces promising results, EMC2 will go on to build the WB-D, a 100 MW demonstration reactor.

So WB-8 is promising but has some issues, which they are hoping to work out in 18-24 months.

Pocket-sized fuel cell charges phones for two weeks

CNET - Fuel cell maker Lilliputian Systems today announced that Brookstone will be the first retailer to carry its portable USB power source, which will be sold under Brookstone's brand. The fuel cell device is about the size of a thick smartphone, and the lighter fluid-filled cartridges are about the same size as a cigarette lighter.

Brookstone will be the first retail launch partner for Lilliputian Systems Inc.'s (LSI's) portable charging system. Brookstone will be responsible for the marketing, promotion, distribution and sale of the product through its various distribution channels, such as the catalog, Brookstone.com, and retail stores, including airport and mall locations. Lilliputian will be responsible for the product design, development, and manufacturing. The product will be branded and sold under the Brookstone® brand.

Lilliputian’s patented Silicon Power Cell™ technology, originally developed at the world-renowned Massachusetts Institute of Technology (“MIT”) Microsystems Technology Laboratory (“MTL”), includes a chip-based power generator and is fueled by recyclable high-energy fuel cartridges. The technology is reliable, safe (approved for use on aircraft) and environmentally friendly (6x more efficient/lower carbon footprint than using a wall charger). When compared to Lithium-Ion battery alternatives, Lilliputian’s solution provides a 5-10x improvement in volumetric energy density (energy density by volume) and a 20-40x improvement in gravimetric energy density (energy density by weight) at a fraction of the cost.

Juice in a box. Lilliputian's portable fuel cell can deliver between 10 to 14 full charges for an iPhone with one replaceable cartridge. (Credit: Lilliputian Systems)
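The quoted charge count implies a rough delivered-energy figure per cartridge. A sketch (the iPhone battery capacity is an assumed value for 2012-era models, not from the announcement):

```python
iphone_battery_wh = 5.3   # assumed ~2012 iPhone battery capacity, in watt-hours
charges_low, charges_high = 10, 14  # quoted charges per cartridge

energy_low = charges_low * iphone_battery_wh
energy_high = charges_high * iphone_battery_wh
print(f"delivered energy per cartridge: ~{energy_low:.0f}-{energy_high:.0f} Wh")
# roughly 53-74 Wh delivered to the phone, before charging losses
```

That is several times the energy a lithium-ion pack of similar size could store, which is the whole appeal of fuel cartridges.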


May 09, 2012

Companies, Universities and Government Departments completely ignore Room Temperature Superconductivity Work

Joe Eck at Superconductors.org has performed what appears to be detailed work to identify a room temperature superconducting transition.

The hallmark of superconductivity is a sudden resistance drop to zero ohms and strong diamagnetism (the Meissner effect) near the same temperature. In numerous tests a small amount of the compound (Tl5Pb2)Ba2Mg2Cu9O17+ consistently produced sharp resistive transitions near 28.5 Celsius (see above graphics), and diamagnetic transitions also near 28.5 C (see below). The transitions were unambiguous, repeatable, and at ambient pressure, making this the first claimed observation of true room-temperature superconductivity in a copper-oxide. Unfortunately, like the 18 C superconductor discovered in March 2011, these transitions occurred in a noisy environment, suggesting the volume fraction is very low. As such, any plans for immediate commercialization will have to wait for a refinement method to be developed.
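A resistive transition like the one described is usually identified as the largest jump in a resistance-vs-temperature curve. A minimal sketch of that detection on synthetic data (the data below are fabricated for illustration and are not Eck's measurements):

```python
# Synthetic R(T) data with a sharp drop to zero resistance below 28.5 C
temps = [t * 0.5 for t in range(50, 70)]  # 25.0 ... 34.5 C in 0.5 C steps
resistance = [0.0 if t < 28.5 else 1.0 + 0.01 * (t - 28.5) for t in temps]

# Locate the transition as the largest jump between adjacent points
jumps = [(resistance[i + 1] - resistance[i], temps[i + 1])
         for i in range(len(temps) - 1)]
biggest_jump, tc = max(jumps)
print(f"transition near {tc} C")  # 28.5 C for this synthetic curve
```

Real data would be noisier, which is why a low volume fraction (a small superconducting portion of the sample) makes the transition hard to resolve.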

Since the volume fraction of the 28C compound was low, the first step in commercialization was to find a governmental or industrial partner to help develop a refining and manufacturing technology. However, in the past 5 months a concerted effort to find a partner - or even a university to vet the discovery - has yielded nothing but apathy.

Since scientists have been searching for room temperature superconductivity for most of the last 100 years, this lack of interest is inexplicable. Such pervasive apathy is totally inconsistent with the U.S.' role as a technology leader. This may be the first time in American history that a discovery of profound potential has been completely stifled by equally profound indifference.

There have been journal articles covering Joe Eck's earlier work, but they did not credit him for it.

Eagle Ford producing more than 500,000 barrels per day of oil in April 2012

1. Petroleum News - The number of new wells drilled in Texas’ Eagle Ford shale more than doubled during the first three months of 2012, compared with the same period a year ago, according to Bentek Energy analysis. The increased activity ratcheted up production of oil and other liquids from 182,000 barrels per day in April 2011 to more than 500,000 bpd in April 2012, according to Bentek’s analysis. The Eagle Ford also produces about 2 billion cubic feet of natural gas per day.

Texas added 50,000 barrels per day of production from January to February 2012.

There are projections for Texas oil production to exceed 2 million barrels per day (based on Texas Railroad Commission definitions). The Texas commissioner in charge of the oil industry says that 4 million barrels per day of oil production in Texas is feasible by 2016.

2. In May 2011, pipeline companies committed more than $1 billion to add 940,000 barrels per day (bpd) of pipeline capacity (to support increase in Eagle Ford oil production in Texas) by the end of 2012.

Where did the future go? The Strategy of Technology and the Space Race, McNamara and LBJ, and the Lost Future of 2001



Warning: This is a longer (6100 word) article giving detailed Cold War history.
==========================================
Joseph Friedlander here in a guest article for Next Big Future.

 


UPDATE - Jerry Pournelle linked to this article from his Chaos Manor site. He said:
"There is a long bit on McNamara and the Strategy of Technology http://nextbigfuture.com/2012/05/where-did-future-go-strategy-of.html which will be worth the attention of those interested in those subjects."

Here is my (Brian Wang) summary of this article. Robert McNamara killed X-plane experimentation. Technological development needs a lot of trial and error, with rapid build, test and modify iterations. By removing rapid development cycles, the cost of technology has increased and the pace of technology has slowed. This has become a fundamental flaw in many US technology development programs. There are many links and extracts from Freeman Dyson and Jerry Pournelle about the flaws in technology development policy, which trace back to what McNamara did.

Where did the future go? To a young boy born around 1935, seeing the war end with atomic-armed B-29s over Japan, watching the movie Destination Moon in 1950, reading the Colliers articles in 1952, and hearing the beep of Sputnik rebroadcast on the radio as he graduated college in 1957, the future was obvious: a race for the commanding heights of space, possibly with atomic survival as a prize. These were very paranoid times, and men driven by fear will work very hard and spend a lot to develop any weapons they need.

Every rocket nut knew that an upgraded V-2 launched from a future base on the Moon could deliver a (lightweight) atomic bomb to Earth, and many tech races were on simultaneously: to make rockets heavier, warheads lighter, planes faster, nuclear reactors more powerful, submarines deeper-diving. By 1958, instead of 1944's 1-ton chemical explosive warhead or 1952's lightweight 1-ton 20-kiloton atomic bombs, a megaton warhead was possible within the weight constraints of the V-2. Had Scuds, Soviet evolutions of the V-2, been based upon the Moon, they could have delivered such megaton warheads to Earth in a science-fiction-style attack on the USA.
 Lyndon Baines Johnson, later President, said that he for one did not want to go to bed by the light of a Communist moon.

It was an arms race, with a Eurasian continental power (the Soviet Union) pitted against an oceanic/air power (the USA) that truly did not want to get into large-scale land combat in Eurasia. And it had one of the rudest awakenings in so short a time in the history of the great powers. In June 1949, America had an atomic monopoly and was on top of the world. By the autumn of that year, the Russians had an atomic bomb, China had gone communist, and America basically went into shock. By January 1950 development of the hydrogen bomb was authorized; by June 1950 Communist North Korea invaded South Korea, garrisoned by US troops; by October 1950 the US was in effect at war with Communist China; in December 1950 President Truman was making veiled nuclear threats; and by June 1951 the country fully expected atomic war as a real possibility, just two years after the last days of the atomic monopoly. From my readings of history I think that was perhaps an even greater shock than the later Sputnik shock of October 1957, and it possibly explains the reaction to Sputnik: the USA was alarmed about surprises in the level of Soviet weapons-building capability.

China plans to have a 5 megawatt Liquid Fluoride Thorium Reactor in 2015

World Nuclear Association reports on China Nuclear Fuel cycle work

The China Academy of Sciences in January 2011 launched a program of R&D on thorium-breeding molten-salt reactors (Th-MSR or TMSR), otherwise known as the Liquid Fluoride Thorium Reactor (LFTR), claiming to have the world's largest national effort on these and hoping to obtain full intellectual property rights on the technology. A 5 MWe MSR is apparently under construction at the Shanghai Institute of Applied Physics (under the Academy), with operation targeted for 2015.

Also Traveling Wave reactor work
CGNPC and Xiamen University are reported to be cooperating on R&D for the traveling-wave reactor. The Ministry of Science & Technology, with CNNC and SNPTC, are skeptical of it. (This is a fast reactor design using natural or depleted uranium packed inside hundreds of hexagonal pillars. In a “wave” that moves through the core at only one centimetre per year, the U-238 is bred progressively into Pu-239, which is the actual fuel. However, this design has now radically changed.)


Tom Mahood describes working on Mach Effect Propulsion, with a description for laymen

Tom Mahood worked with Jim Woodward for over ten years on Mach Effect Propulsion. Tom describes his graduate studies work and the Mach Effect and working with Jim Woodward. (H/T Talk Polywell)

I eventually ended up working with Jim for quite some time, focusing my graduate work on gravitation and Jim becoming my graduate adviser. Before doing so, I discussed it with the Physics department graduate adviser. I asked him flat out, “Do you think Jim is crazy?” He laughed and said, “No, not at all”. He said that at the very least I’d get an excellent education in experimental technique. But as we talked, it became clear to me the adviser didn’t exactly know what Jim was doing and hadn’t read any of Jim’s papers.

Fun with Inertia

It would probably be a good idea to go over a little of the theory before getting into the guts of what I got myself involved with. It essentially revolves around what is the cause of inertia. Why, when you push on something, does it resist? Toward the end of the 1800s, a physicist by the name of Ernst Mach (of “Mach number” fame) suggested that inertia was caused by the interaction of all the matter in the universe. Einstein later gave this idea the name “Mach’s Principle”. It’s a tantalizing theory, but it’s never been clearly proven.

OK, time for a classic thought experiment! Suppose you take a bucket partially filled with water, and start spinning the water in it. As the water spins around the bucket, it rises up the sides due to centrifugal forces. You see the same thing every time you make a Margarita in a blender. Nothing strange there. Now let’s bring Einstein’s Relativity into play. It says that all motion is relative: you get the same results whether you smash two cars together head-on, each moving at 30 miles per hour, or hold one car stationary and hit it with another car moving at 60 miles per hour. In either case the cars close at 60 mph. Again, nothing strange there, just common sense.

But now let’s go back to the water spinning in the bucket. According to Relativity (which has yet to be disproved), you would get the same results (i.e., water rising up the sides) if you held the bucket still and spun the universe in circles around it. Whoaaaa! Now that’s pretty weird! If there’s no link between all the matter in the universe and the bucket’s water, how could that happen? The only possible way the rest of the universe, spinning madly around the bucket of water, can possibly affect it is through some sort of gravitational interaction.

"Spinning water buckets are one thing, but how does some asteroid around Alpha Centauri affect the inertia of my Toyota?" you might be asking. Fair and interesting question. It turns out that every piece of matter in the universe creates its own little bit of a gravitational field. The value of this field at a distance is called the "gravitational potential". Taken by themselves, these little bits and pieces of matter around us don't amount to a whole lot, gravitationally speaking. Look how much matter you need in one place (i.e., the Earth) before anything interesting happens. And even then, by simply jumping you can temporarily break the Earth's grip. Furthermore, this gravitational potential diminishes with distance, which is why our much larger Sun doesn't pull us off the surface of the Earth to a toasty doom. The much closer (though smaller) Earth wins the tug-of-war.

There is a similar situation with electrical charges worth mentioning. As you may be aware, things can have certain electrical charges. Materials can be positively charged, negatively charged, or have no charge. Things that end up having a positive charge simply have more positively charged "bits" than negatively charged bits. Things that have no charge, or neutral materials, have essentially the same number of positively charged bits as negative bits. The positive charges cancel out the negative charges. And neutral materials make up the vast majority of stuff in our universe, so when we step way back and look at our whole universe, the overall electrical potential is zero.

But this isn't the case with the gravitational potential, because there isn't "positive gravity" or "negative gravity" to cancel out each other. There's just one flavor of gravity and it adds and adds and adds and ..... Well, you get the picture. If every piece of matter in the universe is generating its own little bit of gravitational potential, pretty soon you end up with a huge amount of this gravitational potential everywhere.

But why don't we feel any of this gravitational potential if it's so great? Well, we do and we don't. In a sense, it's like living in a highly pressurized underwater habitat. Even though the occupants of the habitat might be under tremendous pressure, they don't really notice it because they experience the same pressure everywhere around them, even inside them. It's the same with gravitational potential. Even though it may have a very large value, as long as it's pretty much equal everywhere, we don't notice anything out of the ordinary. When we notice gravitational effects (e.g., falling down the stairs), what we're actually noticing are differences in the gravitational potential (the "gradient" in nerd-speak). In the case of falling down the stairs, it's the nearby Earth causing a gradient in the gravitational potential. But when it's the same potential everywhere, you don't feel it.
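The point that only differences in potential are felt can be made concrete: adding a uniform background to a potential leaves its gradient, and hence the force, completely unchanged. A minimal numerical sketch (the potential shape and offset are toy values):

```python
def gradient(phi, x, h=1e-6):
    """Central-difference estimate of d(phi)/dx."""
    return (phi(x + h) - phi(x - h)) / (2 * h)

def local_potential(x):
    return -1.0 / x  # a toy "nearby Earth" potential

OFFSET = 1e3  # a large uniform background (kept modest here to avoid float round-off)

def total_potential(x):
    return local_potential(x) + OFFSET

g_local = gradient(local_potential, 2.0)
g_total = gradient(total_potential, 2.0)
# Both come out ~0.25: the uniform background contributes no gradient, no force.
print(g_local, g_total)
```

This is the "pressurized habitat" analogy in numbers: the huge constant term drops out of every difference.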

But I also said we do feel it. We "feel" it every time we push on something, and that something pushes back. It's the interaction of this universal gravitational potential with matter that causes inertia! The gravitational potential sort of "oozes" through all matter (because you can't shield gravity) and gives it that resistance to being shoved around we call inertia.

After many years of work, Jim had found a quirky mathematical derivation suggesting that by rapidly changing the energy density of an object in a certain way, it might be possible to briefly alter its mass. In most cases the effect would time-average to zero: the mass would briefly increase, then decrease, and it would always just cancel itself out. But it appeared possible, mathematically at least, that one could fool Mother Nature and extract a net force or thrust on the mass-changing object by pushing or pulling it at just the right times.
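The "push at just the right times" idea can be illustrated with a toy model: a mass oscillation paired with a synchronized push time-averages to a nonzero product, even though either signal alone averages to zero over a cycle. This is purely illustrative and is not Woodward's actual derivation:

```python
import math

N = 100_000
samples = [2 * math.pi * i / N for i in range(N)]  # one full oscillation cycle

mass_fluct = [math.sin(t) for t in samples]  # toy delta-m oscillation
push = [math.sin(t) for t in samples]        # push synchronized in phase

avg_mass = sum(mass_fluct) / N               # the fluctuation alone cancels out
avg_product = sum(m * p for m, p in zip(mass_fluct, push)) / N  # rectified average

print(f"mean mass fluctuation: {avg_mass:.3f}")    # ~0.000
print(f"mean of (mass x push): {avg_product:.3f}")  # ~0.500
```

Pushing in phase with the fluctuation rectifies it, like pumping a swing at its natural frequency; push 90 degrees out of phase and the average drops back to zero.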

May 08, 2012

FDA Considers a pill for HIV Prevention which can reduce infection by 75%

Federal drug regulators on Tuesday affirmed landmark study results showing that a popular HIV-fighting pill can also help healthy people avoid contracting the virus that causes AIDS in the first place. While the pill appears safe and effective for prevention, scientists stressed that it only works when taken on a daily basis.

The Food and Drug Administration will hold a meeting Thursday to discuss whether Truvada should be approved for people who are at risk of contracting HIV through sexual intercourse. The agency's positive review posted Tuesday suggests the daily pill will become the first drug approved to prevent HIV infection in high-risk patients.

FDA reviewers conclude that taking Truvada pre-emptively could spare patients "infection with a serious and life-threatening illness that requires lifelong treatment."


Truvada is made by Gilead.

Progress to Practical Spintronics via Graphene Nanoribbons

A team of physicists from the University of South Florida and the University of Kentucky has taken a big step toward the development of practical spintronics devices, a technology that could help create faster, smaller and more versatile electronic devices.

Lisenkov said an important step toward fabrication of the “holy grail” of spintronics is finding a semiconductor that has a net 'spin' at room temperature. The biggest challenge, however, is how to set the spin and in what material.

The USF-Kentucky team showed that a simple combination of metal atoms and a flat, one-atom-thick sheet of pure carbon called graphene can be suitably engineered and used for this purpose.

Graphene is a relatively tangible material that can be made by peeling ordinary graphite (the same material in lead pencils) with common transparent tape. Graphene boasts properties such as a breaking strength 200 times greater than steel. It is of great interest to the semiconductor and data storage industries because electric currents can blaze through it 100 times faster than through silicon.

Spintronic devices are hotly pursued because they promise to be smaller, more versatile, and much faster than today's electronics and use less energy.

Physical Review Letters - Magnetic Anisotropy and Engineering of Magnetic Behavior of the Edges in Co Embedded Graphene Nanoribbons

Deep Brain Stimulation May Hold Promise for Mild Alzheimer's Disease

A study on a handful of people with suspected mild Alzheimer’s disease (AD) suggests that a device that sends continuous electrical impulses to specific “memory” regions of the brain appears to increase neuronal activity. Results of the study using deep brain stimulation, a therapy already used in some patients with Parkinson’s disease and depression, may offer hope for at least some with AD, an intractable disease with no cure.

(H/T Kurzweil AI)

“While our study was designed mainly to establish safety, involved only six people and needs to be replicated on a larger scale, we don’t have another treatment for AD at present that shows such promising effects on brain function,” said the study’s first author, Gwenn Smith, Ph.D., a professor in the Department of Psychiatry and Behavioral Sciences at the Johns Hopkins University School of Medicine. The research, published in the Archives of Neurology, was conducted while Smith was on the faculty at the University of Toronto, and will be continuing at Toronto, Hopkins and other U.S. sites in the future. The study was led by Andres M. Lozano, chairman of the Department of Neurosurgery at the University of Toronto.

One month and one year after implanting a device that allows for continuous electrical impulses to the brain, Smith and her colleagues performed PET scans that detect changes in brain cells’ metabolism of glucose, and found that patients with mild forms of AD showed sustained increases in glucose metabolism, an indicator of neuronal activity. The increases, the researchers say, were larger than those found in patients who have taken the drugs currently marketed to fight AD progression. Other imaging studies have shown that a decrease in glucose metabolism over the course of a year is typical in AD. Alzheimer’s disease cannot be precisely diagnosed by brain biopsies until after death.

The team observed roughly 15 percent to 20 percent increases in glucose metabolism after one year of continuous stimulation.

The researchers — most with the University of Toronto — reported few side effects in the six subjects they tested. Just as importantly, says Smith, was seeing that DBS appeared to reverse the downturn in brain metabolism that typically comes with AD.

AD is a progressive and lethal dementia that mostly strikes the elderly. It affects memory, thinking and behavior. Estimates vary, but experts suggest that as many as 5.1 million Americans may have AD and that, as baby boomers age, prevalence will skyrocket. Smith says decades of research have yet to lead to clear understanding of its causes or to successful treatments that stop progression.

Archives of Neurology - Increased Cerebral Metabolism After 1 Year of Deep Brain Stimulation in Alzheimer Disease

First light from a super-Earth spotted

Scientists on a planetary-heat-seeking mission have detected the first infrared light from a super-Earth — in this case, a planet some 40 light-years away. And according to their calculations, 55 Cancri e, a planet just over twice the size of Earth, is throwing off some serious heat.

At a toasty 3,700 degrees Fahrenheit, the planet is hot enough to liquefy steel. And there’s not much relief from the scorching heat: Researchers at MIT and other institutions say the planet may lack reflective surfaces such as ice caps, instead absorbing most of the heat from its parent star — much as Earth’s dark oceans trap heat from the sun.
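
A quick sanity check of the "hot enough to liquefy steel" claim, converting the reported temperature to Celsius and Kelvin:

```python
# Convert 55 Cancri e's reported dayside temperature and compare it against
# the melting range of common steels (roughly 1370-1540 C).
def fahrenheit_to_celsius(f):
    return (f - 32) * 5.0 / 9.0

t_f = 3700.0                      # reported temperature, degrees Fahrenheit
t_c = fahrenheit_to_celsius(t_f)  # ~2038 C
t_k = t_c + 273.15                # ~2311 K

print(round(t_c))
print(round(t_k))
# At roughly 2,000 C the planet's dayside is indeed well past the point
# of liquefying steel.
```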


Data from the Spitzer Space Telescope reveals that 55 Cancri e is very dark, and that its sun-facing side is blistering hot. Image: NASA/JPL-Caltech

Greece appears unlikely to form a government and is heading toward new elections and policy stalemate

Christian Science Monitor - Tsipras's party came in second Sunday, winning 52 of parliament's 300 seats with 16.8 percent of the vote. He has the presidential mandate to end the political impasse by forming a governing coalition by Thursday.

UPDATE- Forbes - the risk of Greece leaving the Eurozone has risen to 75%

An ECB board member, Fitch Ratings’ CEO, and the head of an important hedge fund have all publicly accepted a Greek exit as possible, while Citi reportedly raised its probability that the Hellenic Republic leaves the Eurozone by 2013 to 75%.

Antonis Samaras, head of the winning conservative party that has 108 seats, gave up on the same task after just a few hours Monday when Tsipras spurned his advances.

Tsipras urged Samaras and third-placed Socialist leader Evangelos Venizelos to renege on their support for the bailout commitments, asking them to "honestly repent for their disastrous choices that tore our society apart."

Greece has promised to pass new austerity measures worth €14.5 billion ($18.9 billion) next month and to implement other swift reforms. These will promptly be reviewed by its creditors, who will then decide whether to release or withhold the next batch of bailout funds.

But Samaras quickly blasted Tsipras' proposal as "unbelievably arrogant," warning it would "drag the country into chaos" and see it expelled from the eurozone.

Analysts suggested that the eurozone and IMF could give Athens a minimal lifeline of credit while Greece sorts out its political impasse or holds new elections.

DARPA plans chip implants to monitor the health of soldiers

Mobiledia - The Defense Advanced Research Projects Agency (DARPA) announced plans to create nanosensors that monitor soldiers' health on the battlefield and keep doctors constantly abreast of potential health problems.

DARPA called the implants "a truly disruptive innovation," highlighting how healthier soldiers would change the state of modern warfare because most medical evacuations occur due to ordinary illnesses and disease, not injuries. If the U.S. can lead the way in this kind of high-tech monitoring, it could give the military another leg up on adversaries still beset by everyday illness.

This first announcement focuses on creating nanoparticles capable of diagnosing diseases, but DARPA expects to launch a second effort focused on treatment in late 2012. Once it gathers proposals from private companies and academic researchers, it can begin moving forward with animal trials that might eventually lead to human clinical trials.

At the 2012 International Solid-State Circuits Conference (ISSCC), Stanford electrical engineer Ada Poon demonstrated a tiny, wirelessly powered, self-propelled medical device capable of controlled motion through a fluid — blood, more specifically. The era of swallow-the-surgeon medical care may no longer be the stuff of science fiction.



Poon is an assistant professor at the Stanford School of Engineering. She is developing a new class of medical devices that can be implanted or injected into the human body and powered wirelessly using electromagnetic radio waves. No batteries to wear out. No cables to provide power.

“Such devices could revolutionize medical technology,” said Poon. “Applications include everything from diagnostics to minimally invasive surgeries.”

Certain of these new devices, like heart probes, chemical and pressure sensors, cochlear implants, pacemakers, and drug pumps, would be stationary within the body. Others, like Poon's most recent creations, could travel through the bloodstream to deliver drugs, perform analyses, and perhaps even zap blood clots or remove plaque from sclerotic arteries.

Google gets a special public roads license for its self-driving car in Nevada

PC Mag - Google has become the first company to receive a license to test its self-driving car on public roads, the Nevada DMV said Monday. The license plate that will be used to officially designate the autonomous vehicle will have a red background, with an infinity symbol on the left-hand side.

The Nevada DMV said that Google and DMV officials had tested the self-driving cars along freeways, state highways and neighborhoods both in Carson City and the busy Las Vegas Strip. The newly-formed Autonomous Review Committee then met to review Google's safety plans, employee training, system functions and accident reporting mechanisms.

Brad Templeton has a list of robotic car projects

New class of metamaterials: Metafluids for Transformation Acoustics

A research team led by Professor Martin Wegener at the Karlsruhe Institute of Technology (KIT) has succeeded in realizing a new material class through the manufacturing of a stable crystalline metafluid, a pentamode metamaterial. Using new nanostructuring methods, these materials can now be realized for the first time with any conceivable mechanical properties.

Eventually, numerous three-dimensional transformation acoustics ideas, for example inaudibility cloaks, acoustic prisms or new loudspeaker concepts, could become reality in the near future.


Pentamode metamaterials almost behave like fluids. Their manufacture opens new possibilities in transformation acoustics. (Source: CFN, KIT)

Applied Physics Letters - On the practicability of pentamode mechanical metamaterials

Conceptually, all conceivable three-dimensional mechanical materials can be built from pentamode materials. Pentamodes also make it possible to implement three-dimensional transformation elastodynamics — the analogue of transformation optics. However, pentamodes have not been realized experimentally. Here, we investigate how closely the pentamode theoretical ideal suggested by Milton and Cherkaev in 1995 can be approximated by a metamaterial with current state-of-the-art lithography. Using numerical calculations calibrated by our fabricated three-dimensional microstructures, we find that the figure of merit, i.e., the ratio of bulk modulus to shear modulus, can realistically be made as large as about 1000.

64 exaflop limit to current computing paradigm

HPCWire - Thomas Sterling, Professor of Informatics & Computing at Indiana University, takes us through some of the most critical developments in high performance computing, explaining why the transition to exascale is going to be very different than the ones in the past and how the United States is losing its leadership in HPC innovation.

Exascale is also different because unlike previous milestones, it is unlikely that we will face yet another one in the future. These words may be thrown back in my face, but I think we will never reach zettaflops, at least not by doing discrete floating point operations. We are reaching the anvil of the technology S-curve and will be approaching an asymptote of single program performance due to a combination of factors including atomic granularity at nanoscale.

A new execution model as an embodiment of a paradigm shift will drive this transition from old systems to new. We have done this before in the case of scalar to vector and SIMD, and again from these to message passing, MPPs, and clusters. We are now simply -- or not so simply -- facing another phase shift in HPC system programming, structure, and operation.

Of course I anticipate something else will be devised that is beyond my imagination, perhaps something akin to quantum computing, metaphoric computing, or biological computing. But whatever it is, it won’t be what we’ve been doing for the last seven decades. That is another unique aspect of the exascale milestone and activity. For a number, I’m guessing about 64 exaflops to be the limit, depending on the amount of pain we are prepared to tolerate.

Thomas Sterling will be delivering the Wednesday keynote at this year's International Supercomputing Conference (ISC'12), which will take place in Hamburg, Germany from June 17-21. His presentation will examine the achievements over the past 12 months in high performance computing.

Some believe that low power onchip photonic communication combined with memristor memory and processing could be enough to get to zettaflop supercomputers.


May 07, 2012

Power Generation Technology Based On Piezoelectric Nanocomposite Materials Developed By KAIST

Nanopatents and innovations - the team of Professor Keon Jae Lee (http://fand.kaist.ac.kr/) from the Department of Materials Science and Engineering, KAIST, has developed new forms of low cost, large-area nanogenerator technology using the piezoelectric ceramic nanoparticles.

Nanogenerator technology based on the piezoelectric effect, which converts nonpolluting energy sources, such as the vibrational and mechanical energy of wind and waves, into electrical energy, is drawing immense interest as a next-generation energy harvesting technology. However, previous nanogenerator technologies have had limitations such as complicated processes, high cost, and size-related restrictions.

Recently, Professor Lee's research team developed a nanocomposite-based nanogenerator that successfully overcomes the critical restrictions of previous nanogenerators, yielding a simple, low-cost, large-scale self-powered energy system. The team produced a piezoelectric nanocomposite by mixing piezoelectric nanoparticles with carbon-based nanomaterials (carbon nanotubes and reduced graphene oxide) in a polydimethylsiloxane (PDMS) matrix, and fabricated the nanocomposite generator by simple spin-casting or bar-coating.

Professor Zhong Lin Wang from Georgia Institute of Technology, who is the inventor of the nanogenerator, said, "This exciting result first introduces a nanocomposite material into the self-powered energy system, and therefore it can expand the feasibility of nanogenerator in consumer electronics, ubiquitous sensor networks, and wearable clothes."

New paper made of graphene and protein fibrils

ETH Zurich - Researchers led by Raffaele Mezzenga, a professor in Food and Soft Materials Science, have created a new nanocomposite made of graphene and protein fibrils: a special paper, which combines the best features of both components.

The circular sheets that Raffaele Mezzenga gently lifts from a petri dish are shiny and black. Looking at this tiny piece of paper, one could hardly imagine that it consists of a novel nanocomposite material, with some unprecedented and unique properties, developed in the laboratory of the ETH professor.

This new "paper" is made of alternating layers of protein and graphene. The two components can be mixed in varying compositions, brought into solution, and dried into thin sheets through a vacuum filter - "similarly as one usually does in the manufacture of normal paper from cellulose" says Mezzenga. "This combination of different materials with uncommon properties produces a novel nanocomposite with some major benefits," says the ETH professor. For example, the material is entirely biodegradable.


The final hybrid nanocomposite paper made of protein fibrils and graphene after vacuum filtration drying, alongside the schematic route used by the researchers to combine graphene and protein fibrils into the new hybrid nanocomposite paper. (Reproduced from Li et al., Nature Nanotechnology, 2012)

Nature Nanotechnology - Biodegradable nanocomposites of amyloid fibrils and graphene with shape-memory and enzyme-sensing properties

Carnival of Space 248

The Carnival of Space 248 is up at Dear Astronomer.


Centauri Dreams looks at the 'Advent of the Belters,' relating the news from Planetary Resources to older dreams of mining the asteroids.

Coronal Mass Ejection Risk - One in Eight Chance of a Repeat of the 1859 Event Within a Decade

LA Times - Coronal mass ejections occur when the magnetic field in the sun's atmosphere gets disrupted and the plasma, the sun's hot ionized gas, erupts and sends charged particles into space. Think of it like a hurricane: is it headed toward us or not? If we're lucky, it misses us. In 1989 one hit Quebec, and the power system went from normal operation to failure in 90 seconds. It affected around 6 million people, and the impact was reckoned at $2 billion Canadian in 1989 prices.

A storm in 1859 ranks as the biggest space weather event on record. We know there were huge impacts on the telegraph, which suggests there would be similarly severe impacts on modern power grids. It is hard to compare it to the 1989 event because of the changes in our technology.

A recent paper [published in February in the journal Space Weather] tried to estimate the chance of a repeat of the 1859 storm and came up with a 12% chance of it happening in the next 10 years. That's quite a high risk.
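
To put the 12%-per-decade figure in perspective, it can be converted to an implied annual rate. The calculation below assumes storms arrive as a constant-rate (Poisson) process — my modeling assumption, not a claim from the Space Weather paper:

```python
import math

# If the decadal probability of a Carrington-class storm is 12%, and we model
# occurrences as a constant-rate (Poisson) process, the implied annual rate
# lambda solves: 1 - exp(-10 * lambda) = 0.12
p_decade = 0.12
lam = -math.log(1 - p_decade) / 10    # events per year
p_year = 1 - math.exp(-lam)           # chance in any single year
p_50yr = 1 - math.exp(-lam * 50)      # chance over a 50-year grid lifetime

print(round(lam, 4))     # ~0.0128 events/year
print(round(p_year, 4))  # ~1.3% in any given year
print(round(p_50yr, 3))  # ~47% over 50 years
```

Under this assumption, an event that sounds unlikely on a one-year horizon becomes close to a coin flip over the lifetime of long-lived grid infrastructure.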

China has an aging population and gender imbalance but not as bad as many fear

The American Enterprise Institute has a working paper - World Population Prospects and the Global Economic Outlook: The Shape of Things to Come (43 pages)

The Census Bureau predicts that China’s population will peak in 2026, just 14 years from now. Its labor force will shrink, and its over-65 population will more than double over the next 20 years, from 115 million to 240 million. It will age very rapidly. Only Japan has aged faster -- and Japan had the great advantage of growing rich before it grew old.
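
The quoted doubling of the over-65 population implies a growth rate that can be backed out directly:

```python
# The Census Bureau projection above -- China's over-65 population more than
# doubling from 115 million to 240 million in 20 years -- implies this
# compound annual growth rate for the cohort.
start, end, years = 115e6, 240e6, 20
cagr = (end / start) ** (1 / years) - 1
print(round(cagr * 100, 2))   # ~3.75% per year, very fast for a demographic cohort
```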

Three million hidden births per year

The Telegraph UK and other sources are reporting that 3 million babies are hidden in China each year. This is according to research by Liang Zhongtang, a demographer and former member of the expert committee of China's National Population and Family Planning Commission.

Since 1978, China’s government has limited each couple to one child in a bid to stem the growth of the world's largest population. To police the law, neighbourhood committees keep a close eye out for any pregnancies, and Family Planning officials have the power to force women to have abortions and sterilisations, as well as to monitor their contraception.

The policy does not apply to everyone. In the countryside, parents are allowed to try for a second child if their first is a girl. Couples who are both single children themselves are also allowed to have two children. A growing number of rich Chinese also pay fines in order to have a second child.

Examining China’s census figures, Mr Liang came across discrepancies that proved the subterfuge. “In 1990, the national census recorded 23 million births. But by the 2000 census, there were 26 million ten-year-old children, an increase of three million,” he said. "Normally, you would expect there to be fewer ten-year-olds than newborns, because of infant mortality," he added.
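
Mr Liang's census discrepancy, spelled out as arithmetic:

```python
# If the 1990 census recorded 23 million births but the 2000 census counted
# 26 million ten-year-olds, the cohort apparently *grew* by 3 million --
# the opposite of what infant and child mortality would predict.
births_1990 = 23e6           # newborns recorded in the 1990 census
ten_year_olds_2000 = 26e6    # the same cohort, counted ten years later

hidden_per_cohort = ten_year_olds_2000 - births_1990
print(int(hidden_per_cohort))        # 3,000,000 unregistered children per birth-year cohort

# Extrapolated over two decades (the figure the article closes with):
print(int(hidden_per_cohort * 20))   # 60,000,000
```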

His findings suggest that the one-child policy may not have the grim consequences that have been widely predicted. According to China’s own figures, the traditional desire among Chinese families to have a boy, coupled with the one-child regime, should produce a surfeit of 30 million men by 2020, with many parents allegedly using ultrasound to guarantee the sex of their child.

Mr Liang said the imbalance was “definitely not as severe as the statistics suggest”. Instead of aborting female foetuses, Mr Liang's research suggests that the families have the girls, but do not declare them. The families wait until they are six or seven and by then, the local governments tend not to care as much.

Sixty million hidden births over 20 years would put China's working-age population into a small gain through 2030.