
September 28, 2013

Roadmap to a Fusion-Driven Rocket with a 90 day trip from Earth to Mars

NASA Future in Space Operations - 90-Day Single-Launch to Mars: A Case Study for The Fusion-Driven Rocket

John Slough and his team are working to experimentally prove out their direct fusion propulsion system. They hope to demonstrate about 1.6 times more power out than power in before 2015 under their NASA NIAC grants. The plan is to scale up to a gain of 200 by 2030.


Lightweighting cars with carbon fiber, aluminum, titanium and magnesium

At the beginning of this year, China completed its first dump truck with a carriage made of carbon fiber composite material. The truck is 8.6 meters long, with a load capacity of 50 tons and a curb weight of 4.8 tons, 29 percent lighter than a truck with a metal carriage. This marked the first successful use of composite material in the carriage of a heavy-duty truck in China, signifying that China has achieved new breakthroughs in using composite materials to reduce the weight of automobiles.

If a car's weight is reduced by 10 percent, its fuel efficiency can increase by 6 to 8 percent; when the overall weight of a car is cut by 100 kg, its fuel consumption per 100 km falls by 0.3 to 0.6 liters; a 1 percent reduction in a car's weight can reduce fuel consumption by 0.7 percent. In addition, a reduction of 100 kg in a car's weight can cut carbon dioxide emissions by about 5 grams per kilometer.
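Those rules of thumb are easy to turn into a quick estimate. A minimal sketch, where the 1,500 kg baseline curb weight and the use of midpoint values are illustrative assumptions:

```python
# Rough fuel and CO2 savings from lightweighting, using the rules of thumb quoted above.
# The 1,500 kg baseline and the use of midpoint values are illustrative assumptions.

def lightweighting_savings(curb_weight_kg=1500, weight_cut_fraction=0.10):
    reduction_kg = weight_cut_fraction * curb_weight_kg
    fuel_saved_l_per_100km = (reduction_kg / 100.0) * 0.45   # midpoint of 0.3-0.6 L per 100 kg
    co2_saved_g_per_km = (reduction_kg / 100.0) * 5.0        # about 5 g CO2/km per 100 kg
    return reduction_kg, fuel_saved_l_per_100km, co2_saved_g_per_km

kg, fuel, co2 = lightweighting_savings()
print(f"Removing {kg:.0f} kg saves roughly {fuel:.2f} L/100 km and {co2:.1f} g CO2/km")
# Example: removing 150 kg saves roughly 0.68 L/100 km and 7.5 g CO2/km
```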

World carbon fiber production is at about 60,000 tons now and could be 118,000 tons in 2017.

DARPA heads to Maker Faire to push low cost Brain Scanners to launch brain control of all devices

A working prototype of a low-cost electroencephalography device funded by the US Defense Advanced Research Projects Agency (DARPA) made its debut in New York this weekend, the first step in the agency’s effort to jumpstart a do-it-yourself revolution in neuroscience.

Dr. Lindsay Allen was sitting in a booth at Maker Faire, the annual gathering where 70,000-some hackers and tinkerers display their projects, wearing a blue cap stuffed with electrodes. Her brain’s electrical signals were feeding into a green chip the size of a baseball card mounted atop an inexpensive Arduino microcontroller, which was outputting through some software built for the demonstration. Beside her, a laptop streamed a chart of her brainwaves, which spiked and settled as she opened and closed her eyes.

Dr. Allen is an engineer for Creare, one of the companies DARPA has contracted with, and she was hooked up to OpenBCI, an open source device built to capture signals from eight electrodes at a time.
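The eyes-open versus eyes-closed spikes in that demo come from the alpha rhythm around 10 Hz. Here is a minimal, hedged sketch of how such a change can be detected in software; the 250 Hz sample rate and the synthetic signals are assumptions, not details of the OpenBCI or Creare hardware:

```python
import numpy as np

# Minimal sketch: estimate alpha-band (8-12 Hz) power in one EEG channel.
# The 250 Hz sample rate and the synthetic signals are assumptions for illustration only.
FS = 250  # samples per second (assumed)

def alpha_power(samples, fs=FS):
    """Mean spectral power in the 8-12 Hz alpha band of a block of samples."""
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

# Fake one second of data: closed eyes typically show a stronger ~10 Hz alpha rhythm.
t = np.arange(FS) / FS
eyes_open = np.random.randn(FS)
eyes_closed = np.random.randn(FS) + 3 * np.sin(2 * np.pi * 10 * t)
print("alpha power, eyes open:  ", round(alpha_power(eyes_open), 1))
print("alpha power, eyes closed:", round(alpha_power(eyes_closed), 1))
```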

Ron Paul Defines Libertarianism as non-intervention in all things

Charlie Rose: "Define what libertarianism means to you" - Ron Paul: "The word I best describe it is something not a lot of people use. I call it non-intervention."

Charlie Rose: "Non-intervention in personal life, non-intervention in foreign policy, non-intervention in..."

Ron Paul: "There it is. Because it's sort of tells you what a conservative, a libertarian, constitutionalist, and liberal, classical liberalism has been used it's closely aligned with libertarianism, but non-intervention as you say. I don't want to interfere in your personal life. The one rule is you can't hurt another person, that's when government's necessary."

Ron Paul wants regulation by the marketplace; fraud and many other harms would still not be allowed. Government regulations, he argues, become politicized.

We do not use prior restraint for speech; that would be censorship.

Prior restraint is to be avoided. In popular culture there is the fictional pre-crime in the Minority Report movie.

We can quickly stop the bad actors.

Allow corrections to happen. It is the only way to fix imbalances.



100 Year Starships

Centauri Dreams covered the 2013 100 Year Starship Symposium.

At the first track session for “Factors in Time and Distance Solutions,” Terry Kammash (University of Michigan) ran through the basics of the rocket equation to show why chemical rockets were inadequate for deep space travel. Kammash is interested in a fusion hybrid reactor whose neutron flux induces fission, a system that could eventually enable interstellar missions. It is based on Gas Dynamic Mirror (GDM) methods that surround a plasma-bearing vacuum chamber with a long, slender, current-carrying coil of wire. The plasma is trapped within magnetic fields that control its instability. Here it’s worth mentioning that a Gas Dynamic Mirror propulsion experiment in 1998 produced plasma during a NASA test of the plasma injector system, injecting a gas into the GDM and heating it with microwaves in a method called Electron Cyclotron Resonance Heating.

The Gas Dynamic Mirror (GDM) fusion propulsion system is built around a simple engine: a long, slender, current-carrying coil of wire that acts as a magnet surrounds a vacuum chamber containing plasma. The plasma is trapped within the magnetic fields created in the central section of the system. At each end of the engine are mirror magnets that prevent the plasma from escaping out the ends too quickly.

In 1998, the GDM Fusion Propulsion Experiment at NASA produced plasma during a test of the plasma injector system, which works similarly to the forward cell of the VASIMR. It injects a gas into the GDM and heats it with Electron Cyclotron Resonance Heating (ECRH) induced by a microwave antenna operating at 2.45 gigahertz. Researchers have continued experiments and theoretical work.
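To see why the end mirror magnets matter, here is the textbook magnetic-mirror calculation of the loss cone and the fraction of particles that escape out the ends; the mirror ratios used are illustrative values, not GDM design numbers:

```python
import math

# Textbook magnetic-mirror relations, only to illustrate why the end mirror magnets matter.
# The mirror ratios below are illustrative, not GDM design values.

def loss_cone(mirror_ratio):
    """Loss-cone half-angle (degrees) and the fraction of an isotropic plasma
    that escapes out the ends, for mirror ratio R = B_max / B_central."""
    angle_deg = math.degrees(math.asin(math.sqrt(1.0 / mirror_ratio)))
    escaping_fraction = 1.0 - math.sqrt(1.0 - 1.0 / mirror_ratio)
    return angle_deg, escaping_fraction

for R in (2, 10, 50):
    angle, lost = loss_cone(R)
    print(f"mirror ratio {R:2d}: loss cone ~{angle:4.1f} deg, ~{lost:.1%} of isotropic particles escape")
# Higher mirror ratios shrink the loss cone, which is why strong end magnets improve confinement.
```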



2010 lecture - Meeting the World's Energy Needs with the Fusion Hybrid Reactor, by Terry Kammash



Dissertation related to Gas Dynamic Fusion Propulsion

Here is a 2011 dissertation (119 pages) - A Computational Magnetohydrodynamic Model of a Gas Dynamic Fusion Space Propulsion System

The goal of this body of work was to advance our understanding of gas dynamic mirror (GDM) fusion propulsion systems. Kammash’s analytical model suggested that deuterium−tritium (D−T) and deuterium−3 helium (D−3He) GDMs were feasible, but they were large at 250 to 100,000 metric tons with up to 75% of the mass accounted for by radiators rather than confinement magnets. Starting from that point, this effort has explored alternate GDM concepts, identified the challenges for modeling GDMs using computational MHD approaches, and found solutions to a number of those challenges.

Off topic Video Break

Joseph Gordon-Levitt is best known for his roles in The Dark Knight Rises, Inception and the TV show 3rd Rock from the Sun. Here he shows off his dancing talent.



Accelerator on a nanostructured chip accelerates electrons at a rate ten times higher than conventional technology

In an advance that could dramatically shrink particle accelerators for science and medicine, researchers used a laser to accelerate electrons at a rate 10 times higher than conventional technology in a nanostructured glass chip smaller than a grain of rice.

“We still have a number of challenges before this technology becomes practical for real-world use, but eventually it would substantially reduce the size and cost of future high-energy particle colliders for exploring the world of fundamental particles and forces,” said Joel England, the SLAC physicist who led the experiments. “It could also help enable compact accelerators and X-ray devices for security scanning, medical therapy and imaging, and research in biology and materials science.”

Because it employs commercial lasers and low-cost, mass-production techniques, the researchers believe it will set the stage for new generations of "tabletop" accelerators.

At its full potential, the new “accelerator on a chip” could match the accelerating power of SLAC’s 2-mile-long linear accelerator in just 100 feet, and deliver a million more electron pulses per second.

This initial demonstration achieved an acceleration gradient, or amount of energy gained per length, of 300 million electronvolts per meter. That's roughly 10 times the acceleration provided by the current SLAC linear accelerator.

“Our ultimate goal for this structure is 1 billion electronvolts per meter, and we’re already one-third of the way in our first experiment,” said Stanford Professor Robert Byer, the principal investigator for this research.
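For scale, the energy a linear accelerator delivers is simply its gradient times its length. A quick sketch, assuming roughly 50 GeV for the full 2-mile SLAC linac (an assumption used for comparison, not a figure from the article):

```python
# Energy gained in a linear accelerator is gradient times length.
# Gradients are from the article; the ~50 GeV SLAC beam energy is an assumption used for scale.
slac_energy_gev = 50.0
gradients_gev_per_m = {"demonstrated chip gradient": 0.3, "stated goal": 1.0}

for name, gradient in gradients_gev_per_m.items():
    length_m = slac_energy_gev / gradient
    print(f"{name}: {gradient * 1000:.0f} MeV/m -> ~{length_m:.0f} m "
          f"(~{length_m * 3.28:.0f} ft) to reach {slac_energy_gev:.0f} GeV")
```

At the 1 GeV/m goal this lands in the tens of meters, the same ballpark as the "100 feet" figure quoted above.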

The key to the accelerator chips is tiny, precisely spaced ridges, which cause the iridescence seen in this close-up photo. (Brad Plummer/SLAC)

Arxiv - Laser-based acceleration of non-relativistic electrons at a dielectric structure

September 27, 2013

Mitochondria rejuvenation in mice gave memory and exercise performance like a young adult for elderly mice

Researchers took a naturally occurring mitochondrial transcription factor called TFAM, which initiates protein synthesis, and engineered it to cross into cells from the bloodstream and target the mitochondria.

Aged mice given modified TFAM showed improvements in memory and exercise performance compared with untreated mice. "It was like an 80-year-old recovering the function of a 30-year-old," says Rafal Smigrodzki, also at Gencia, who presented the results at the Strategies for Engineered Negligible Senescence conference in Cambridge this month.

Targeted mitochondrial therapeutics in aging (SENS 6)


An F-16 fighter jet has been converted into an unmanned drone

As a pilotless F-16 roared into the sky last week at Tyndall Air Force Base, Fla., members of Boeing’s QF-16 team and the U.S. Air Force celebrated.

Two U.S. Air Force test pilots in a ground control station at Tyndall remotely flew the QF-16, which is a retired F-16 jet modified to be an aerial target. While in the air, the QF-16 mission included a series of simulated maneuvers, reaching supersonic speeds, returning to base and landing, all without a pilot in the cockpit.

Boeing has modified six F-16s into the QF-16 configuration.

Prior to the QF-16, the military used the QF-4, a modification of the F-4 Phantom, a Vietnam-era fighter. The modified QF-16 gives pilots a target that performs much more like the jets flying today.

Why two European nuclear reactors will be 7 years late and triple the cost of the same Chinese reactors

The original contract between TVO and Areva-Siemens signed in 2003 envisioned that the 1600 MWe EPR nuclear plant would cost €3.2 billion, with a completion date of 2009.

This would have been about $4.3 billion, a bit more than the $3.8 billion cost of each of the two EPR reactors in China. The Chinese EPR reactors appear to be completing on time and on budget, and they are the identical European reactor design. Most of the construction uses large modules and heavy equipment, so there is not that much labor involved.

TVO announced in December 2011 that it anticipated the 1600 MWe plant would begin commercial operation in August 2014, some five years later than originally planned. By that time, the anticipated cost of the plant had ballooned to €8.5 billion, according to data released by Areva. In July 2012, the company declared that the plant unit "will not be ready for regular electricity production in 2014."

The instrumentation and control (I&C) architecture of the Flamanville 3 EPR was only declared satisfactory by the ASN regulator in April 2012, enabling the regulator to lift the reservations it expressed in October 2009.

In 2009, Areva blamed the Finnish utility Teollisuuden Voima, which ordered the reactor, for the delays up to that point. But the Finnish safety authority has said that Areva outsourced a number of aspects of the construction to unqualified subcontractors, making it responsible for many of the problems.

The plant was about 80% complete in 2009, when the instrumentation and control (the electronic nervous system) had to be reworked.

The most recent delays appear to be the need to fix issues from the stress testing of the design.

Germany faces rising electricity costs and at least $270 billion for wind power with an exit from nuclear power

According to the Institute for Energy Research, this year German electricity rates will increase by over 10% due to a surcharge for using more renewable energy and a further 30 to 50% price increase is expected in the next ten years.

German electricity is already about triple the price of electricity in the USA and four times the price in Canada.

Without a change in course, says the government, costs could rise to 40 cents/kWh by 2020. At present-day prices, the average German family of three pays about 90 euros per month for electricity, the equivalent of about US $135—about twice as much as in the year 2000.

The government is investing heavily in onshore and offshore wind farms and solar technology in an effort to cut greenhouse gas emissions by 40% by 2020.

Last year Chancellor Angela Merkel, who this week won her third term as Germany's leader, proposed to construct offshore wind farms in the North Sea, a plan that would cost 200 billion euros ($270 billion), according to the DIW economic institute in Berlin.

As part of the energy drive, Merkel also pledged to permanently shut down the country's 17 nuclear reactors, which fuel 18% of the country's power needs. Under Germany's Atomic Energy Act, the last nuclear power plant will be disconnected by 2022.

Limburg told CNN the rapid transition to renewables is economically "insane," arguing that wind farms will cost at least 13 times more than traditional coal plants.

North Sea wind farms can provide 25% of electricity production, and onshore wind could produce a higher share. Offshore wind energy is important for northern Germany, where it has already created a lot of jobs.

So some €200 billion ($270 billion) will be spent to replace nuclear reactors that could have been allowed to continue operating.

History of Nuclear Power Costs

Several large nuclear power plants were completed in the early 1970s at a typical cost of $170 million, whereas plants of the same size completed in 1983 cost an average of $1.7 billion, a 10-fold increase. Some plants completed in the late 1980s have cost as much as $5 billion, 30 times what they cost 15 years earlier.

Inflation played a role, but the consumer price index increased only by a factor of 2.2 between 1973 and 1983, and by just 18% from 1983 to 1988.

Inflation explains a bit more than doubling but does not explain why prices went up 4-5 times more.
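The arithmetic behind that statement, using the figures quoted above:

```python
# How much of the 1970s-to-1983 nuclear cost escalation does inflation explain?
# Figures are taken from the paragraphs above.
cost_early_1970s = 170e6        # typical plant cost, dollars
cost_1983 = 1.7e9               # typical plant cost, dollars
cpi_factor_1973_1983 = 2.2

nominal_increase = cost_1983 / cost_early_1970s
real_increase = nominal_increase / cpi_factor_1973_1983
print(f"nominal cost increase: {nominal_increase:.0f}x")
print(f"after removing inflation: {real_increase:.1f}x real increase left unexplained")
# nominal 10x; about 4.5x remains after inflation, matching the "4-5 times" statement above
```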

What caused the remaining large increase? Ask the opponents of nuclear power and they will recite a succession of horror stories, many of them true, about mistakes, inefficiency, sloppiness, and ineptitude. They will create the impression that people who build nuclear plants are a bunch of bungling incompetents. The only thing they won't explain is how these same "bungling incompetents" managed to build nuclear power plants so efficiently, so rapidly, and so inexpensively in the early 1970s.

Weather analysis shows there was no plausible Fukushima scenario in which Tokyo, Yokosuka, or Yokota would have been subject to dangerous levels of airborne radiation

Lawrence Livermore National Laboratory calculated the worst case scenarios from the tsunami damaged Fukushima nuclear reactors.

Weather analysis shows there was no plausible Fukushima scenario in which Tokyo, Yokosuka, or Yokota would have been subject to dangerous levels of airborne radiation, even if all of the reactors had suffered complete meltdowns and there had been a worst-case fire in spent fuel pool 4 (the pool did not have that worst-case fire, and the reactors had fewer problems than the worst case). The actual weather was even more favorable, with the wind mainly blowing out to sea. The oceans already hold about 4 billion tons of natural uranium, so the added radioactive material is literally a drop in the ocean.

Hours after a massive earthquake and tsunami struck Japan on March 11, 2011, a team of Livermore scientists mobilized to begin assessing the danger from the crippled Fukushima Dai-ichi nuclear plant. The 40-odd team members include physicists, meteorologists, computer modelers, and health specialists. Their specialty is major airborne hazards—toxic matter from chemical fires, ash from erupting volcanoes, or radioactive emissions.

After days of high-intensity analysis and numerous computer runs, the scientists concluded that radiation in Tokyo would come nowhere close to levels requiring an evacuation, even in the event that Fukushima Dai-ichi underwent the worst plausible meltdown combined with extremely unfavorable wind and weather patterns.

These revelations, together with additional new information, debunk some powerful myths about Fukushima and have weighty implications for the debate about nuclear power that has raged in the accident's aftermath. (The revelations are unrelated to the plant’s current water-leakage problem, which by some reckonings is less severe and more solvable than recent headlines suggest.)

A company is forming to develop the Hyperloop

After gaining steam on the collaboration platform JumpStartFund, a group of engineers set out to create a crowd-friendly company to make the Hyperloop happen.

Because Dirk Ahlborn and his co-founders have connections with SpaceX, they were able to talk over the idea with the company's president, Gwynne Shotwell, and get the green light to feature it on the platform. Now, after receiving more than 300 votes and the expressed interest of a number of potential collaborators, JumpStartFund has decided to move the project into the in-progress stage.

Joining the JumpStart team to make the Hyperloop company a reality and lead the project are engineers Marco Villa and Patricia Galloway. Villa was previously with SpaceX as the director of mission operations and in charge of the Dragon spacecraft project. Galloway was a member of the US National Science Board for six years before serving as its vice chair from 2008 to 2010, and has experience in national infrastructure planning as part of the National Construction Dispute Resolution Committee.

JumpStartFund is also accepting applications from members of the site to work full-time on the Hyperloop project in exchange for equity in the company.

Early engineering simulation work has been done

Ansys, a computer-based engineering simulation company whose software Musk uses for SpaceX simulations, has performed early simulations. These show that the Hyperloop design as it stands now, first laid out in Musk's 58-page "alpha" design, still needs massive tweaking to become a safe and viable mode of transport.


Ansys' initial round of computer simulations of the Hyperloop concept illustrates in red the high levels of drag, the force pulling the vehicle backward, that would make current designs energy inefficient.
(Credit: Ansys)


September 26, 2013

Chinese telecom billionaire moving to build Nicaragua Canal

A Chinese telecom billionaire has signed a deal with the government of Nicaragua to build a $40 billion canal across Nicaragua.

Panama is now trying to complete a $5.25-billion expansion by mid-2015 that will greatly enhance the channel’s capacity. It’s not clear that the world’s largest shipping companies will continue, or resume, using the Panama route.

Earlier this year, Danish-owned Maersk Line, the planet’s biggest fleet of container vessels, announced it would cease traversing Panama owing to a combination of high tolls and uneconomical restrictions on ship size.

The size of container ships is measured in something called “20-foot equivalent units,” or TEUs, each of which approximates the length of a single cargo container. At present, the Panama Canal can handle vessels measuring up to 4,500 TEUs, a benchmark known in the industry by the abbreviation “Panamax.”

With the addition of two new systems of locks as well as the dredging of existing channels, the waterway will be able to handle ships measuring as much as 12,000 TEUs, which insiders already refer to as “New Panamax.”

Cargo ships of 18,000 TEUs are already being built, and container ships of the near future may well be as large as 30,000 TEUs. Built from scratch, a Nicaragua canal could be far better equipped to handle the new seaborne giants.

The Nicaragua canal could start construction late in 2014 and complete by 2019.


Origami Batteries have 14 times higher energy density

Arizona researchers suggest that advances in geometric folding algorithms and computational tools to determine folding patterns for making complex 3D structures from planar 2D sheets may lead to numerous other configurations possible for 3D batteries. Furthermore, with advances in robot manipulation including paper folding by robots, the manufacturability of folded batteries at scale may be possible in the near future.

To prepare their batteries, the researchers used carbon nanotube (CNT) coated papers as the current collectors and deposited conventional active material layers (Li4Ti5O12 and LiCoO2) on top of them. They used Laboratory Kimwipes as substrates because the thin and porous nature of the paper allowed the CNT ink to diffuse easily both inside and outside of the paper. This resulted in CNT-coated papers that were conductive on either side.

Polyvinylidene difluoride (PVDF) was used as a binder to improve the CNT adhesion by coating an additional CNT/PVDF layer onto the CNT-coated papers prior to depositing the active materials.


Paper folding techniques are used in order to compact a Li-ion battery and increase its energy per footprint area. Full cells were prepared using Li4Ti5O12 and LiCoO2 powders deposited onto current collectors consisting of paper coated with carbon nanotubes. Folded cells showed higher areal capacities compared to the planar versions with a 5 × 5 cell folded using the Miura-ori pattern displaying a 14 times increase in areal energy density.
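A quick sanity check of the reported 14-fold figure against the geometry of the fold; the "packing efficiency" framing is an illustrative interpretation, not a number from the paper:

```python
# Sanity check of the 14x areal energy density gain against the fold geometry.
# The "packing efficiency" framing is an illustrative interpretation, not a figure from the paper.
rows, cols = 5, 5
layers = rows * cols            # a 5x5 Miura-ori fold stacks 25 cells over one footprint
ideal_gain = layers             # perfect stacking would multiply areal capacity by 25
measured_gain = 14              # reported increase in areal energy density

print(f"ideal gain: {ideal_gain}x, measured gain: {measured_gain}x")
print(f"effective packing efficiency: {measured_gain / ideal_gain:.0%}")
# about 56% of the ideal, the rest lost to fold overhead and inactive area
```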

Four Longevity Scenarios for 2030s and beyond

Future Tense (Slate, New America, University of Arizona) will host an event in Washington, D.C., on how increases in human lifespan could transform public policy, society, and the economy.

They have four longevity scenarios for 2030.

Some pertinent historical facts that shape The Washington Longevity Scenarios for 2030:

* For the last 160 years, life expectancy in the developed world has been increasing like clockwork at the rate of a quarter of a year, every year, according to the U.N. Population Division. Thank technologies like clean water, sewage, reductions in child mortality, more and better nutrition, vaccines, antibiotics, and information-fueled advances against the current big killers—cancer, heart disease, and strokes.

* For the last half-century, the amount of computer firepower you can buy for $1 has been doubling every 18–24 months. This is called Moore’s Law.

* Recently, this curve of regularly doubling change has been matched or exceeded by the genetics, robotics, and nano revolutions that are spin-offs of information technology.

Scenario A: Small Change - adding 3-5 years of life expectancy based on continuing existing trends
Scenario B: Drooling on Their Shoes - lifespan is increased but health gains lag.
Scenario C: Live Long and Prosper - 150 year lifespans

150-year lifespans could not be truly proved before about 2060-2070. The oldest living person is now 115. If the supercentenarians (110+ years old) stop dying, then a cohort could reach 150 by 2048. However, only one person has lived to 122 years with accepted documentation. A few more reaching that age, or a bit beyond to 125 years, would not be accepted as heralding THE breakthrough. I think it would require many well documented people reaching 130 years of age, or clear evidence of comprehensive health rejuvenation, or a drop in the death rate of 20% or more.

To add just 5 years to life expectancy, we would need to slash the mortality rate by more than 40%. To add 40 years, the mortality rate in the oldest people would have to fall to about 1.68% of current levels. To add 50 years, the mortality rate in the oldest people would need to drop to about 0.6% of current levels. Such drops would be statistical evidence that a massive breakthrough in longevity had occurred by 2030.
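As a hedged sketch of the kind of arithmetic behind those figures, here is a standard Gompertz mortality model in which the hazard rate is scaled down uniformly; the parameters are generic textbook-style values chosen for illustration, not the assumptions behind the scenario estimates:

```python
import numpy as np

# Hedged sketch: a Gompertz mortality model, hazard mu(age) = A * exp(B * age).
# A and B are generic textbook-style values chosen for illustration only.
A, B = 1e-4, 0.085                      # per-year hazard scale and slope (assumed)
ages = np.linspace(0, 130, 13001)       # fine age grid, in years

def life_expectancy(mortality_scale=1.0):
    """Life expectancy at birth when the whole hazard curve is scaled by mortality_scale."""
    hazard = mortality_scale * A * np.exp(B * ages)
    # survival S(t) = exp(-integral of hazard up to t); life expectancy = integral of S
    cumulative_hazard = np.concatenate(
        [[0.0], np.cumsum(0.5 * (hazard[1:] + hazard[:-1]) * np.diff(ages))])
    survival = np.exp(-cumulative_hazard)
    return np.trapz(survival, ages)

baseline = life_expectancy(1.0)
for scale in (1.0, 0.6, 0.2):
    e = life_expectancy(scale)
    print(f"mortality at {scale:.0%} of today: life expectancy ~{e:.1f} years (+{e - baseline:.1f})")
# Cutting mortality to 60% of today's level (a 40% cut) adds on the order of 6 years,
# consistent with the "slash the mortality rate by more than 40% to add 5 years" statement.
```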

European nuclear reactors are three times the cost of the same european designed nuclear reactors in China

China has indicated that two 1700 MWe European Pressurized Reactors (EPR) cost $7.5 billion (50 billion yuan).

All of the heavy components have been installed on the Chinese reactors, and everything appears to be on schedule and on budget for a late 2013 start for unit 1 and a 2014 start for unit 2.

On 3 December 2012, EDF announced that its estimated cost for the Flamanville, France 1700 MWe EPR reactor had escalated to €8.5 billion ($11 billion), and the completion of construction was delayed to 2016.

In December 2012, Areva estimated that the full cost of building the Olkiluoto reactor will be about €8.5 billion, or almost three times the delivery price of €3 billion.

$22 billion for the two EPR reactors in Europe (France and Finland) is about triple the $7.5 billion for the two Chinese EPR reactors.
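Normalizing by capacity makes the gap clearer; the sketch below assumes 1700 MWe per unit and uses the cost figures quoted above:

```python
# Cost per kilowatt of capacity, using the figures quoted above.
# Assumes 1700 MWe per EPR unit for both the Chinese and European plants.
chinese_pair_usd = 7.5e9        # two Chinese EPR units
european_unit_usd = 11e9        # one European EPR (Flamanville estimate; Olkiluoto is similar)
kw_per_unit = 1700 * 1000

china_per_kw = chinese_pair_usd / 2 / kw_per_unit
europe_per_kw = european_unit_usd / kw_per_unit
print(f"China:  ~${china_per_kw:,.0f}/kW")
print(f"Europe: ~${europe_per_kw:,.0f}/kW ({europe_per_kw / china_per_kw:.1f}x higher)")
# roughly $2,200/kW versus $6,500/kW
```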

Debates should not treat money as scarce for energy research and infrastructure

People often debate whether a few hundred million dollars should be spent on some energy research project or not. But there is money all over energy: energy research, energy subsidies, and above all buying energy and building out energy infrastructure, which are the big items. Energy research and development spending (including angels and venture capitalists) in the USA is about $10 billion per year. Worldwide it is about $50-100 billion (government and industry).

For a $6 trillion per year industry, even the typical 2% of revenue for research would be over $100 billion per year; at 5% it would be $300 billion. We should be trying to fund a broader portfolio of research with potential solutions that could actually scale quickly. The problem with ITER (the international fusion research project) is not just that it is hard research that has been running decades late. The problem is that even if it works, it would not make energy cheaper, and the buildout would not get started until around 2050 and would not make a significant impact until around 2070.

People argue about the waste when some company or project does not work out. Venture capitalists and angels expect 9 out of 10 investments to fail; they are shooting for the big hits that have the potential to make a difference. Making cars that are 20% more fuel efficient or increasing the efficiency of a turbine by 5% is achievable, low-risk, incremental engineering. If we want to change the global energy mix significantly, to make it 98% non-polluting or to lower energy costs by 5 times within 30 years to boost the global economy, then we need to be targeting projects whose designs have the potential to deliver those kinds of gains. We then need to be ready to massively and rapidly scale the true winners.

China's success with solar, wind and hydro did not come from developing the research. China won by massively scaling the factories and the deployment: not the billion or two on research, but the $100 billion to trillions on scaling.

We should be targeting clean, low-cost energy abundance, with global spending of $600 billion per year on energy research. The top research countries (Japan, Israel) spend 3.5-4.2% of GDP on research. Get energy right and the world can double the entire economy. Instead of targeting a doubling of world energy and the world economy by 2040, we could target a tripling or quadrupling. With energy that is four times cheaper, we would have plenty of energy for clean water and other ways to change the world for the better. We would have the energy and money for space.

Energy is money. The energy efficiency of an economy, GDP per unit of energy used, changes only slowly.

Natural gas has become low cost and abundant because of fracking and horizontal drilling. This shows that technology can shift energy markets and the cost of energy within a few years.

We can target transforming nuclear fission energy with factory mass produced deep burn (burn 99+% of the uranium, plutonium or thorium) with a cost of 1 cent or less per kWh. I believe that Terrestrial Energy Integral Molten Salt Reactor could get down to 0.86 cents per kWh.

The 25 MWe version of the IMSR is the size of a fairly deep hot tub.

September 25, 2013

Photonic molecules behave like light sabers, with photons that push and deflect each other

Harvard and MIT researchers have managed to coax photons into binding together to form molecules – a state of matter that, until recently, had been purely theoretical.

The discovery, said Harvard physics professor Mikhail Lukin, runs contrary to decades of accepted wisdom about the nature of light. Photons have long been described as massless particles that don't interact with each other: shine two laser beams at each other, he said, and they simply pass through one another.

"Photonic molecules," however, behave less like traditional lasers and more like something you might find in science fiction – the light saber.

"Most of the properties of light we know about originate from the fact that photons are massless, and that they do not interact with each other," Lukin said. "What we have done is create a special type of medium in which photons interact with each other so strongly that they begin to act as though they have mass, and they bind together to form molecules. This type of photonic bound state has been discussed theoretically for quite a while, but until now it hadn't been observed.

"It's not an in-apt analogy to compare this to light sabers," Lukin added. "When these photons interact with each other, they're pushing against and deflect each other. The physics of what's happening in these molecules is similar to what we see in the movies."

Nature - Attractive photons in a quantum nonlinear medium

Dwave Systems scales up production of Quantum Computing systems with a deal with a semiconductor fab

D-Wave has the 512-qubit D-Wave Two machine, which the company claims can solve some real-world problems faster than any other option available.

NASA Ames, Lockheed and Google have all bought D-Wave machines.

“We can make 120 quantum chips at a time, on an eight-inch wafer,” Colin Williams director of partnerships at D-Wave told TechWeekEurope. Williams joined D-Wave from NASA’s Jet Propulsion Laboratory (JPL) which built earlier chips for D-Wave. The newer generations are now built by a silicon fab in Minnesota – which presumably means it is Cypress Semiconductor.

“While academics get about two designs of quantum computer a year, we can manage six to eight,” he said. “That is what differentiates us from the academic approach.”

D-Wave also uses a distinctive approach to quantum computing – quantum “annealing” – which has caused some controversy. The machine is designed to solve optimisation problems where the user wants to find the best solution out of many millions of possibilities. The D-Wave machine solves a particular one of these “NP-hard” problems, in the following way. It is designed so that the solution to the problem will have the lowest energy state: it starts in a coherent state in which the possible solutions exist simultaneously and then settles until only one state exists – the solution.

It’s called annealing because of the similarity with metallic solids which can gradually settle to a lower energy state as their crystalline structure re-aligns. “If you can solve one NP-hard problem well, you can transform others onto it,” said Williams. This means that quantum computers could potentially be used for jobs like financial modelling and analysing the structures of proteins.
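To make the "lowest energy state is the solution" idea concrete, here is a purely classical simulated-annealing sketch on a tiny Ising-style problem. It is an analogy for the kind of optimization the D-Wave machine targets, not D-Wave's API or its quantum annealing algorithm:

```python
import random, math

# Classical simulated annealing on a tiny Ising-style problem: find spins s_i in {-1, +1}
# that minimize E = -sum_{i<j} J[i][j] * s_i * s_j. This is only a classical analogy for the
# kind of optimization problem the D-Wave hardware targets, not its programming interface.
random.seed(1)
N = 8
J = [[random.choice([-1, 1]) if j > i else 0 for j in range(N)] for i in range(N)]

def energy(s):
    return -sum(J[i][j] * s[i] * s[j] for i in range(N) for j in range(i + 1, N))

spins = [random.choice([-1, 1]) for _ in range(N)]
steps = 5000
for step in range(steps):
    temperature = 5.0 * (1 - step / steps) + 0.01        # cool down gradually
    i = random.randrange(N)
    before = energy(spins)
    spins[i] *= -1                                       # try flipping one spin
    after = energy(spins)
    # Metropolis rule: always keep downhill moves, keep uphill moves with small probability
    if after > before and random.random() > math.exp((before - after) / temperature):
        spins[i] *= -1                                   # reject the flip

print("lowest-energy spin configuration found:", spins, "energy:", energy(spins))
```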



Carbon nanotube computer about equal to the Intel 4004

For the first time, researchers have built a computer whose central processor is based entirely on carbon nanotubes.

The carbon nanotube computers, created by Stanford University engineers, show that carbon nanotube electronics are a viable potential replacement for silicon when it reaches its limits in ever-smaller electronic circuits.

The carbon nanotube processor is comparable in capabilities to the Intel 4004, that company’s first microprocessor, which was released in 1971, says Subhasish Mitra, an electrical engineer at Stanford and one of the project’s co-leaders. The computer runs a subset of the commercial MIPS instruction set. It can switch between multiple tasks (counting and sorting numbers) and keep track of them, and it can fetch data from and send it back to an external memory.

The nanotube processor is made up of 142 transistors, each of which contains carbon nanotubes that are about 10 to 200 nanometers long. The Stanford group says it has made six versions of carbon nanotube computers, including one that can be connected to external hardware, a numerical keypad that can be used to input numbers for addition.



Nature - Carbon nanotube computer

Global warming is irreversible without massive geoengineering according to IPCC

Global warming is irreversible without massive geoengineering of the atmosphere's chemistry. This stark warning comes from the draft summary of the latest climate assessment by the Intergovernmental Panel on Climate Change.

Even if all the world ran on carbon-free energy and deforestation ceased, the only way of lowering temperatures would be to devise a scheme for sucking hundreds of billions of tonnes of carbon dioxide out of the atmosphere.

Petrobank's THAI fireflooding oil sands recovery is not commercially productive so they will try multi-THAI

After several years of in situ production, it has become clear that current THAI (toe-to-heel air injection) methods do not work as planned. Amid steady drops in production from their THAI wells at Kerrobert, Petrobank has written down the value of its THAI patents and the reserves at the facility to zero. It plans to experiment with a new configuration called "multi-THAI," which involves adding more air injection wells.

It is estimated that approximately 90% of the Alberta oil sands (1.75 trillion barrels of bitumen in place) are too far below the surface to use open-pit mining. Several in-situ techniques have been developed. Steam-assisted gravity drainage is the most successful method; THAI fireflooding was another.


Using Petrobank’s patented THAI technology, air is injected into the reservoir through a vertical well to spark combustion of some of the oil, creating a heated chamber that warms the remaining oil and allows it to flow to the toe of a horizontal well to be produced.

Petrobank said it now believes “permeability channels” are forming in the reservoir that take the air away from the combustion zone, preventing the zone from expanding.

It has initiated water co-injection in several wells with limited results and is experimenting with “multi-THAI,” adding more air injection wells along each horizontal well to expand the combustion zone. It said it expects the first “multi-THAI” wells to begin producing in the fall.

September 24, 2013

China's high speed rail successfully moves more than double the passengers that use domestic airlines

The New York Times covers China's high speed rail system.

Just five years after China’s high-speed rail system opened, it is carrying nearly twice as many passengers each month as the country’s domestic airline industry. With traffic growing 28 percent a year for the last several years, China’s high-speed rail network will handle more passengers by early next year than the 54 million people a month who board domestic flights in the United States.

China’s high-speed rail system has emerged as an unexpected success story. Economists and transportation experts cite it as one reason for China’s continued economic growth when other emerging economies are faltering.

Chinese workers are now more productive. A paper for the World Bank by three consultants this year found that Chinese cities connected to the high-speed rail network, as more than 100 are already, are likely to experience broad growth in worker productivity. The productivity gains occur when companies find themselves within a couple of hours’ train ride of tens of millions of potential customers, employees and rivals.

Focus Fusion impurities measured and indicate that the tungsten electrode will provide a performance boost in a few months

Lawrenceville Plasma Physics (LPP) researchers believe that the main problem impeding higher density and yields in LPP’s FF-1 device is metal impurities in the plasma. A new optical-UV spectrometer from Ocean Optics has been used to measure the impurities, confirming what they are and their amounts. This important step forward was taken with the able help of summer Research Associate Kyle Lindheimer, a student at Penn State University, under the direction of LPP Lab Director Derek Shannon.

They have found more than enough silver to disrupt the plasma filaments and prevent higher density plasmoids and more fusion yield. The solution is to eliminate the silver and copper in the electrodes and replace them with tungsten.

It will take a few months to design and fabricate the tungsten electrode. Once that is done, there should be a severalfold improvement in performance.

Crowdfunding preparations are underway

LPP dense plasma focus fusion researchers are also preparing to launch a crowdfunding effort to raise funds for their commercial fusion project.

Laser stairway to reduce energy requirements and focusing problems for laser pushed solar sails

Centauri Dreams covered the Charles Quarra Light Bridge (laser stairway) which was presented at the 2013 Starship Congress (48 pages).

A preprint of the paper is available here.

The ‘laser starway’ concept Charles Quarra presents is a natural evolution of the work of both Robert Forward and Geoffrey Landis in extending the reach of beamed power into deep interstellar space, by taming the beam divergence that is ever present in all laser wavefronts. Beamed power gives us the possibility of leaving the source of energy at home, avoiding the exponential blowout of energy requirements imposed by the Tsiolkovsky rocket equation. But beamed propulsion is far from devoid of issues: the pointing accuracy, the huge laser sources and the sails tens or hundreds of kilometers wide demand engineering capabilities that are still far beyond our current horizon.

Conceptually, the starway tries to push the idea of multiple lenses for beam refocusing, analyzed by Landis in the 1990s, in the direction of making them reusable: Can we take a string of lenses, deploy them between two stars and keep them operational for long periods?

The principal parameter that determines the available power for sail propulsion on the light bridge structure is the optical efficiency of the relays: as long as the losses per relay stay below the critical value (which for a starway made of thousands of nodes implies losses per node of around 100 ppm), the efficiency of laser thrust utilization grows rapidly.
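A simple way to see why per-node losses around 100 ppm matter: the end-to-end throughput of a relay chain falls off as (1 - loss) raised to the number of nodes. The node counts below are illustrative:

```python
# End-to-end beam transmission of a relay chain: each node passes (1 - loss) of the power.
# The 100 ppm per-node loss is from the summary above; the node counts are illustrative.
loss_per_node = 100e-6
for nodes in (1_000, 5_000, 20_000):
    throughput = (1.0 - loss_per_node) ** nodes
    print(f"{nodes:6d} nodes: ~{throughput:.0%} of the beam power survives the chain")
# roughly 90% at 1,000 nodes, 61% at 5,000 nodes, 14% at 20,000 nodes
```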


NASA Curiosity Rover detects no methane which reduces the chance of finding life on Mars

Data from NASA's Curiosity rover has revealed the Martian environment lacks methane. This is a surprise to researchers because previous data reported by U.S. and international scientists indicated positive detections.

The roving laboratory performed extensive tests to search for traces of Martian methane. Whether the Martian atmosphere contains traces of the gas has been a question of high interest for years because methane could be a potential sign of life, although it also can be produced without biology.

This result reduces the probability of current methane-producing Martian microbes, but this addresses only one type of microbial metabolism. There are many types of terrestrial microbes that don't generate methane.

Given the sensitivity of the Tunable Laser Spectrometer instrument and the absence of a detection, scientists calculate that the amount of methane in the Martian atmosphere today must be no more than 1.3 parts per billion. That is about one-sixth as much as some earlier estimates.

This picture shows a lab demonstration of the measurement chamber inside the Tunable Laser Spectrometer, an instrument that is part of the Sample Analysis at Mars investigation on NASA's Curiosity rover.
Image Credit: NASA/JPL-Caltech


Cygnus docking with Space Station delayed for about a week due to a software error

Orbital’s Cygnus spacecraft was on the final leg of berthing with the International Space Station (ISS) on Sunday morning before a discrepancy arose in the way the ISS and Cygnus determine GPS data. The issue can be fixed via an update to Cygnus’ software, allowing for a second rendezvous and berthing attempt no earlier than Saturday.

The next attempt – per a call to the ISS crew – will not take place before Saturday, so as to avoid conflicting with the arrival of the next Soyuz vehicle.

“The earliest possible date for the next Cygnus approach and rendezvous with the ISS would be Saturday, September 28. An exact schedule will be determined following the successful completion of Soyuz operations.”

Compact Foldable electric bikes

Stigo is the world’s fastest-folding electric scooter that can be taken along wherever one wishes to go – a restaurant, apartment, on public transportation or a small elevator.

This novel electric scooter, with a top speed of 25 km/h, weighs only 17 kg, and its footprint is a mere 45×40 cm when folded.

The Stigo is priced at €2,370 (about US$3,200), and is claimed to be street legal in its target market, where it's classed as a limited performance scooter. Its developers plan to ship the first 200 to Europe by Q2 - Q3 of 2014, hoping to increase production to 8,000 units by 2015.

This would be great if the cost were about 4 to 8 times lower. There are some options in the $700-1,000 range that are nearly as compact.





Alzheimer’s missing link found: Is a promising target for new drugs

Yale School of Medicine researchers have discovered a protein that is the missing link in the complicated chain of events that lead to Alzheimer’s disease, they report in the Sept. 4 issue of the journal Neuron. Researchers also found that blocking the protein with an existing drug can restore memory in mice with brain damage that mimics the disease.

In the Neuron paper, the Yale team reveals the missing link in the chain, a protein within the cell membrane called metabotropic glutamate receptor 5 or mGluR5. When the protein is blocked by a drug similar to one being developed for Fragile X syndrome, the deficits in memory, learning, and synapse density were restored in a mouse model of Alzheimer’s.

Strittmatter stressed that new drugs may have to be designed to precisely target the amyloid-prion disruption of mGluR5 in human cases of Alzheimer’s and said his lab is exploring new ways to achieve this.


Neuron - Metabotropic Glutamate Receptor 5 Is a Coreceptor for Alzheimer Aβ Oligomer Bound to Cellular Prion Protein

September 23, 2013

DARPA exosuit and European Exoskeleton strive to reduce injuries from carrying heavy loads

DARPA seeks to create a working supersuit prototype that significantly boosts endurance, carrying capacity and overall Soldier effectiveness, all while using no more than 100 watts of power. They want it to be comfortable, durable and washable. The garment would not interfere with body armor or other standard clothing and gear.

In fiction like Iron Man, the exoskeleton makes the wearer more powerful than a tank. In reality the suits will make factory workers and soldiers more productive and reduce injuries from carrying heavy objects. They will be like more active and updated versions of weight lifting belts.

The most common injuries soldiers face come from carrying their gear, often topping 100 pounds, for extended periods over rough terrain. Heavy loads increase the likelihood of musculoskeletal injury and also exacerbate fatigue, which contributes to both acute and chronic injury and impedes Soldiers' physical and cognitive abilities to perform mission-oriented tasks.


Europe also has a 4.5 million euro program to make a human-guided exoskeleton to improve work safety and enhance productivity in the industrial environment.

According to the Work Foundation Alliance (UK), as many as 44 million workers in the European Union are affected by work-place related musculoskeletal disorders (MSDs), representing a total annual cost of more than 240 billion Euros. To overcome these industrial and societal challenges, a new project, called Robo-Mate, has been designed.

China leases 9% of Ukraine's arable farmland for 50 years

China is to lease 3 million hectares (7.4 million acres) of Ukrainian farmland. China’s official Xinjiang Production and Construction Corps has signed an agreement with Ukrainian agricultural firm KSG Agro, which would see Ukraine provide 100,000 hectares to China. That would eventually rise to 3 million hectares.

This would be 11,583 square miles of Ukrainian land over the span of 50 years—which means the eastern European country will give up about 5% of its total land, or 9% of its arable farmland to feed China’s burgeoning population.

The 50-year plan was mainly aimed at growing crops and raising pigs.

In 2009, China had a total of just over 2 million hectares of farming land abroad.

The 3 million hectares is about equal to the land area of Massachusetts. It is also about the size of Belgium.

Action to reduce particulate and ozone pollution can save 500,000 lives per year by 2030 and about 1.3 million per year by 2050

Researchers used a comprehensive analysis with global modeling methods that looks at relationships between deaths and exposure to particulate matter and ozone air pollution. They found that 500,000 premature deaths per year could be avoided by the year 2030, of which two-thirds would be in China. By 2050, 800,000 to 1.8 million premature deaths per year could be avoided. Fixing air pollution in China and East Asia will provide 10 to 70 times the benefit versus the cost of the action.

Nature Climate Change - Co-benefits of mitigating global greenhouse gas emissions for future air quality and human health

China announced that it will reduce the sulfur content of fuel over the next two years. This will increase the cost of gasoline by 290 yuan (46.8 U.S. dollars) per ton and diesel by 370 yuan per ton. There are about 240 gallons in one ton, so the cost will increase by about 1.1 yuan (US$0.18) per gallon of gasoline and about 1.5 yuan per gallon of diesel.
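The per-gallon figures follow directly from the per-ton surcharges and the article's roughly 240 gallons-per-ton assumption; the 2013 exchange rate used below is an assumption:

```python
# Convert the per-ton surcharges into per-gallon terms using the article's ~240 gallons per ton.
GALLONS_PER_TON = 240            # from the paragraph above
YUAN_PER_USD = 6.2               # approximate 2013 exchange rate (assumption)

for fuel, yuan_per_ton in (("gasoline", 290), ("diesel", 370)):
    yuan_per_gallon = yuan_per_ton / GALLONS_PER_TON
    usd_per_gallon = yuan_per_gallon / YUAN_PER_USD
    print(f"{fuel}: ~{yuan_per_gallon:.1f} yuan (~${usd_per_gallon:.2f}) per gallon")
```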

After 2017, the prices of motor gasoline and diesel that meet the national "fifth-phase" standard will be lifted further by 170 yuan per ton and 160 yuan per ton, respectively, said the NDRC.

The State Council, China's Cabinet, has mandated that sulfur content for both gasoline and diesel be set at no more than 10 ppm (parts per million) by 2017, a reduction from the fourth-phase standard of 50 ppm. The fifth-phase standard is equivalent to the Euro 5 standard.

China also announced air pollution limits. Under the new plan, concentrations of fine particulate matter must be reduced by 25 percent in the Beijing-Tianjian-Hebei area in the north, 20 percent in the Yangtze River Delta in the east and 15 percent in the Pearl River Delta in the south, compared with 2012 levels. All other cities must reduce the levels of larger particulate matter, known as PM 10, by 10 percent. The plan said Beijing must also bring its average concentration of PM 2.5 down to 60 micrograms per cubic meter or less. That would be two and a half times the recommended exposure limit set by the World Health Organization.

China will spend about 2 trillion yuan to combat air pollution over the next 5 years.

Nextbigfuture has shown that reducing soot is the most cost effective and fastest way to improve the environment and reduce global warming.


Carnival of Nuclear Energy 175

The Carnival of Nuclear Energy 175 is up at Deregulate the Atom.


ANS Nuclear Cafe - Are Nuclear Plant Closures Due to Market Manipulation and Decommissioning Fund Rules?

Many are having a hard time understanding Entergy’s decision to schedule the closing of the Vermont Yankee nuclear power plant next year. Jim Hopf examines the issues, as well as policy adjustments that could help prevent a similar situation in the future.

Changes are being made to rules governing power grids that seem to be deliberately designed to harm the profitability of baseload (i.e., coal and nuclear) power plants. Jon Wellinghoff, the head of the Federal Energy Regulatory Commission (which is involved with issues related to power grids and markets), has often proclaimed that baseload power is a thing of the past that is no longer needed. Well, it seems like his vision may be coming true, some of this likely due to the policy changes discussed above. These changes will act to reduce the role of coal and nuclear baseload plants and replace them with “flexible” gas generation capacity.

Canadian Energy Issues compares Ontario's and Germany's energy. Whose electricity is cleaner? Steve Aplin of Canadian Energy Issues crunches the numbers and compares Ontario real-time data with Germany before its nuclear phaseout. His results will disappoint and perhaps outrage those who think that Germany is a paragon of clean electricity.

Germany's electricity is nearly five times as carbon-intensive as the electricity of Ontario, Canada.

Software upgrade enables the Rethink Baxter robot to operate faster and perform new tasks

Rethink Robotics has released what it claims is a game-changing software update for the company's flagship interactive production robot, Baxter. The 2.0 software will introduce a new set of applications, making Baxter an even more effective productivity tool for American manufacturers. Existing customers will be able to easily download the new software and upgrade their Baxter robots.

The Robot Report says that Rethink Robotics' Baxter robots are being produced and sold at a rate of 500+ units for 2013.

Baxter is now able to pick and place parts at any axis, allowing the robot to perform a broad array of new tasks, such as picking objects off of a shelf, or loading machines in a horizontal motion. The 2.0 software also allows the customer to define waypoints with increased accuracy; users will be able to define the exact trajectory that they want Baxter’s arms to follow simply by moving them. For example, the robot can be taught where to move its arms in and out of a machine. In addition, the 2.0 software enables customers to train Baxter to hold its arms in space for a predetermined amount of time, or until a signal indicates they can begin moving again. This makes Baxter useful for holding parts in front of scanners, inspection cameras or painting stations, and for working more interactively with other machines (i.e., moving its hand out of a machine while it cycles).



September 22, 2013

Mode-locked Lasers Applied to Deflecting a Near Earth Object on Collision Course with Earth

Arxiv - Mode-locked Lasers Applied to Deflecting a Near Earth Object on Collision Course with Earth

We consider synchronized trains of sub-picosecond pulses generated by mode-locked lasers applied to deflection of near Earth objects (NEO) on collision course with Earth. Our method is designed to avoid a predicted collision of the NEO with Earth by at least the diameter of Earth. We estimate deflecting a 10,000 MT NEO, such as the asteroid which struck Earth near Chelyabinsk, Russia to be feasible within several months using average power in the ten kilowatt range. We see this deflection method as scalable to larger NEO to a degree not possible using continuous laser systems.

Three trains of synchronized sub-picosecond optical pulses (light blue, green, and yellow Gaussian TEM00 modes) directed by microspacecraft (white hexagonal boundaries) exert propulsive thrust slowing a NEO

Quantum Information Processing at the Attosecond Timescale

Arxiv - Quantum Information Processing at the Attosecond Timescale


Coherent processing of quantum information and attosecond science had so far little in common. We here show that recent data in high harmonic emission reveal quantum information processing at the attosecond timescale. High harmonic generation in the strong-field regime is governed by tunneling ionization followed by the motion of the electron in the continuum and its re-collision in the atomic core. Before the actual photon detection the electron-photon state exhibits a high degree of quantum coherence and entanglement, that has so far remained elusive. By observing the interference pattern created by the spatiotemporal overlap of photons emitted by two interfering electron paths we generate a photon Hadamard gate and thus erase the electron-trajectory information. This allows the measurement of the relative phase in electron-trajectory quantum superpositions and establishes the era of electron-photon quantum coherence and entanglement at the attosecond timescale of high-field physics.

Carnival of Space 320

1. Chandra X-ray space telescope blog reports a team of astronomers has discovered enormous arms of hot gas in the Coma cluster of galaxies by using NASA's Chandra X-ray Observatory and ESA's XMM-Newton. These features, which span at least half a million light years, provide insight into how the Coma cluster has grown through mergers of smaller groups and clusters of galaxies to become one of the largest structures in the Universe held together by gravity.