
May 03, 2014

Nextbigfuture open thread

This is an open thread. Contribute any interesting links or open discussion topics and make suggestions. Remember to be polite and courteous. Debate the topics with information and do not attack other commenters.

May 02, 2014

Stanford creates million-neuron brain emulation boards that are 100,000 times more energy efficient than a PC simulation; switching to updated fabrication would bring the cost to $400 per million-neuron system

Stanford bioengineers have developed faster, more energy-efficient microchips based on the human brain – 9,000 times faster and using significantly less power than a typical PC. This offers greater possibilities for advances in robotics and a new way of understanding the brain. For instance, a chip as fast and efficient as the human brain could drive prosthetic limbs with the speed and complexity of our own actions.

Boahen and his team have developed Neurogrid, a circuit board consisting of 16 custom-designed "Neurocore" chips. Together these 16 chips can simulate 1 million neurons and billions of synaptic connections. The team designed these chips with power efficiency in mind. Their strategy was to enable certain synapses to share hardware circuits. The result was Neurogrid – a device about the size of an iPad that can simulate orders of magnitude more neurons and synapses than other brain mimics on the power it takes to run a tablet computer.

Each of the current million-neuron Neurogrid circuit boards costs about $40,000. Boahen believes dramatic cost reductions are possible. Neurogrid is based on 16 Neurocores, each of which supports 65,536 neurons. Those chips were made using 15-year-old fabrication technologies.

By switching to modern manufacturing processes and fabricating the chips in large volumes, he could cut a Neurocore's cost 100-fold – suggesting a million-neuron board for $400 a copy. With that cheaper hardware and compiler software to make it easy to configure, these neuromorphic systems could find numerous applications.
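The cost and neuron-count arithmetic quoted above is easy to sanity-check; every number in this sketch comes from the article itself:

```python
# Sanity check of the Neurogrid figures quoted above (all numbers from the article).
chips_per_board = 16
neurons_per_chip = 65_536      # one Neurocore
board_cost_now = 40_000        # dollars, with 15-year-old fabrication
cost_reduction = 100           # claimed factor from modern processes and volume

total_neurons = chips_per_board * neurons_per_chip
print(total_neurons)                     # 1048576 -- just over a million neurons
print(board_cost_now / cost_reduction)   # 400.0 dollars per board
```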

Neurogrid: A Mixed-Analog-Digital Multichip System for Large-Scale Neural Simulations

Nanostructured ceramic is 50% harder than current spinel armor for armor windows and could make smartphone screens better than sapphire

The Department of Defense needs materials for armor windows that provide essential protection for both personnel and equipment while still having a high degree of transparency. To meet that need, scientists at the U.S. Naval Research Laboratory (NRL) have developed a method to fabricate nanocrystalline spinel that is 50% harder than the current spinel armor materials used in military vehicles. With the highest reported hardness for spinel, NRL's nanocrystalline spinel demonstrates that the hardness of transparent ceramics can be increased simply by reducing the grain size to 28 nanometers. This harder spinel offers the potential for better armor windows in military vehicles, which would give personnel and equipment, such as sensors, improved protection, along with other benefits.
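The grain-size effect described here is conventionally modeled by the Hall-Petch relation, H = H0 + k/√d (hardness rises as grain size d shrinks). The constants in this sketch are illustrative placeholders, not values from the NRL paper:

```python
import math

# Illustrative Hall-Petch hardness estimate: H = H0 + k / sqrt(d).
# H0 and k below are hypothetical placeholders, not data from the Acta
# Materialia paper; the point is the qualitative grain-size trend.
def hardness_gpa(grain_size_m, H0=12.0, k=8e-4):
    """Hardness in GPa; k in GPa*m^0.5, grain size in meters."""
    return H0 + k / math.sqrt(grain_size_m)

# Finer grains give a harder ceramic:
print(hardness_gpa(28e-9))   # 28 nm grains, as in the NRL spinel
print(hardness_gpa(1e-6))    # 1 micron grains, conventional ceramic
```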


Spinel windows can have applications as electro-optical/infrared deckhouse windows in the new class of U.S. Navy destroyers, like the USS Elmo Zumwalt pictured above, that feature a low radar signature compared with current vessels. (U.S. Navy photo courtesy of General Dynamics)

Acta Materialia - An extended hardness limit in bulk nanoceramics

Element 117 aka ununseptium has been created

Atoms of a new super-heavy element — the as-yet-unnamed element 117 — have reportedly been created by scientists in Germany, moving it closer to being officially recognized as part of the standard periodic table.

Researchers at the GSI Helmholtz Center for Heavy Ion Research, an accelerator laboratory located in Darmstadt, Germany, say they have created and observed several atoms of element 117, which is temporarily named ununseptium.



May 01, 2014

Seven-minute video where Woodward explains Mach's principle for mastering inertia for near-lightspeed propellantless travel and stargates

Dr. James F. Woodward explains how to build faster-than-light warp drives and even stargates, based on Mach's principle which would allow spacetime distortions compatible with general relativity. In his research lab at California State University, Fullerton, he tries to demonstrate Mach effects in various experiments to validate the theory.

Excerpt from "Ancient Aliens: Aliens and Stargates", Season 6, Episode 12, January 24, 2014.
The video thumbnail is from the front cover of Jim Woodward's book "Making Starships and Stargates: The Science of Interstellar Transport and Absurdly Benign Wormholes", Springer Publishing, 2013:

Mach effect: warp drives and stargates by Jim Woodward from flux_capacitor on Vimeo.


Cochlear Implant Also Uses Gene Therapy to Improve Hearing and electrode triggered gene therapy could improve other machine body connections

The electrodes in a cochlear implant can be used to direct gene therapy and regrow neurons.

The researchers behind the work are investigating whether electrode-triggered gene therapy could improve other machine-body connections—for example, the deep-brain stimulation probes that are used to treat Parkinson’s disease, or retinal prosthetics.

More than 300,000 people worldwide have cochlear implants. The devices are implanted in patients who are profoundly deaf, having lost most or all of the ear’s hair cells, which detect sound waves through mechanical vibrations, and convert those vibrations into electrical signals that are picked up by neurons in the auditory nerve and passed along to the brain. Cochlear implants use up to 22 platinum electrodes to stimulate the auditory nerve; the devices make a tremendous difference for people but they restore only a fraction of normal hearing.

Pig hearts could be transplanted into humans after baboon success

A genetically engineered pig heart which was transplanted into a baboon has survived more than a year without being rejected, leading scientists to hope that animal parts could one day provide a limitless source of organs.

The hearts of genetically modified pigs could be transplanted into humans to solve the shortage of organ donors, scientists believe.

Researchers successfully grafted a pig heart into a baboon more than a year ago and it is still functioning, they report today.

Until now, organs transplanted into primates have only lasted for a maximum of six months before being rejected.

But scientists have tweaked the DNA of pigs so that their hearts are more compatible with primates and humans.

Through genetic changes, the scientists have added several human genes to the pig genome as well as removing genes which trigger a dangerous immune response in humans.

Grafts from these genetically engineered pigs are less likely to be seen as foreign, thus reducing the immune reaction against them.

SpaceX will be trying to get Dragon crew-rated in 2014, and has won an injunction blocking the Lockheed-Boeing joint venture from purchasing Russian rocket engines

A U.S. Court of Federal Claims judge issued an injunction late Wednesday prohibiting a joint venture between Lockheed Martin and Boeing from proceeding with plans to buy Russian-made rocket engines.

SpaceX sued the federal government Monday, protesting the Air Force’s award of a lucrative space contract, saying it should have been competitively bid.

In the suit, SpaceX criticizes United Launch Alliance (ULA) for using Russian engines in some of its rockets, which SpaceX founder Elon Musk said might be a violation of U.S. sanctions and was unseemly at a time when Russia "is in the process of invading Ukraine."

Musk alleged that the deal would benefit Dmitry Rogozin, the deputy prime minister who heads the Russian defense industry and is named by the U.S. government in the sanctions.

In reaction to the sanctions, Rogozin tweeted: “After analyzing the sanctions against our space industry, I suggest the U.S. delivers its astronauts to the ISS [International Space Station] with a trampoline.”

Elon Musk tweeted: "Sounds like this might be a good time to unveil the new Dragon Mk 2 spaceship that @SpaceX has been working on w @NASA," he wrote of the company's crewed Dragon capsule currently in development. "No trampoline needed."

Musk also tweeted: "Cover drops [crewed Dragon] on May 29. Actual flight design hardware of crew Dragon, not a mockup."

The crewed Dragon mockup was shown in 2013.

Problems with the Big Bang Expanding Universe Theory

In a startling challenge to the widely popular Big Bang theory, new evidence, to be published this week in the International Journal of Modern Physics D, indicates that the universe is not expanding after all. The evidence, based on detailed measurements of the size and brightness of hundreds of galaxies, adds to a growing list of observations that contradict the predictions of the increasingly complex Big Bang model.

The new research tested one of the striking predictions of the Big Bang theory: that ordinary geometry does not work at great distances. In the space around us, on Earth, in the solar system and the Milky Way, as similar objects get farther away, they look fainter and smaller. Their surface brightness, that is the brightness per unit area, remains constant. In contrast, the Big Bang theory tells us that in an expanding universe objects actually should appear fainter but bigger. Thus in this theory, the surface brightness decreases with the distance. In addition, the light is stretched as the universe expands, further dimming the light. So in an expanding universe the most distant galaxies should have hundreds of times dimmer surface brightness than similar nearby galaxies, making them actually undetectable with present-day telescopes.

The researchers carefully compared the size and brightness of about a thousand nearby and extremely distant galaxies, using images from the GALEX satellite for nearby ones and from the Hubble Space Telescope for distant ones. They chose the most luminous spiral galaxies for comparisons, matching the average luminosity of the near and far samples. Contrary to the prediction of the Big Bang theory, they found that the surface brightnesses of the near and far galaxies are identical.

The Tolman surface brightness test is one of a half-dozen cosmological tests conceived in the 1930s to check the viability of new cosmological models and compare them. Tolman's test compares the surface brightness of galaxies as a function of their redshift.

A previous study of the relationship between surface brightness and redshift was carried out using the 10m Keck telescope to measure nearly a thousand galaxies' redshifts and the 2.4m Hubble Space Telescope to measure those galaxies' surface brightness. The exponent found was not 4 as expected in the simplest expanding model, but 2.6 or 3.4, depending on the frequency band. The authors summarize:

We show that this is precisely the range expected from the evolutionary models of Bruzual and Charlot. We conclude that the Tolman surface brightness test is consistent with the reality of the expansion.
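The Tolman test turns on a simple scaling: an expanding universe predicts surface brightness falls as (1+z)^4, while a static universe predicts no dimming at all. A minimal sketch of the dimming factors, using the exponents quoted above:

```python
# Tolman surface-brightness dimming: in an expanding universe, surface
# brightness falls as (1+z)^-4; a static universe predicts no dimming.
# The measured exponents quoted above (2.6 and 3.4) sit in between,
# attributed to galaxy evolution by Bruzual and Charlot.
def dimming_factor(z, exponent=4.0):
    """Factor by which surface brightness drops at redshift z."""
    return (1 + z) ** exponent

for n in (4.0, 3.4, 2.6):
    print(n, dimming_factor(1.0, n))   # at z = 1: 16.0, ~10.6, ~6.1
```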

Graphene nanoribbons proposed to make thermoelectric materials with efficiency better than gasoline engines, making solid-state engines feasible

Researchers propose a hybrid nano-structuring scheme for tailoring thermal and thermoelectric transport properties of graphene nanoribbons. Geometrical structuring and isotope cluster engineering are the elements that constitute the proposed scheme. Using first-principles based force constants and Hamiltonians, we show that the thermal conductance of graphene nanoribbons can be reduced by 98.8% at room temperature and the thermoelectric figure of merit, ZT, can be as high as 3.25 at T = 800 K. The proposed scheme relies on a recently developed bottom-up fabrication method, which is proven to be feasible for synthesizing graphene nanoribbons with an atomic precision.

Thermoelectric materials convert heat to electricity.
Getting a thermoelectric figure of merit (ZT) over 3.0 is a technological holy grail, like a room-temperature superconductor.


Nature Scientific Reports - A bottom-up route to enhance thermoelectric figures of merit in graphene nanoribbons

Currently the best materials (Tin Selenide) in the lab have a ZT of 2.6.
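The link between ZT and engine-class efficiency comes from the standard thermoelectric relations, ZT = S²σT/κ and the maximum-efficiency formula below. These are textbook results rather than anything specific to the paper; the sketch uses the paper's ZT = 3.25 at 800 K with an assumed 300 K cold side:

```python
import math

# Standard thermoelectric relations (textbook results, not from the paper):
# ZT = S^2 * sigma * T / kappa, and the maximum conversion efficiency
# eta = eta_carnot * (sqrt(1+ZT) - 1) / (sqrt(1+ZT) + Tc/Th).
def max_efficiency(ZT, T_hot, T_cold):
    carnot = 1 - T_cold / T_hot
    m = math.sqrt(1 + ZT)
    return carnot * (m - 1) / (m + T_cold / T_hot)

# With the paper's ZT = 3.25 at 800 K and an assumed 300 K cold side:
print(max_efficiency(3.25, 800.0, 300.0))  # ~0.27, i.e. ~27% of heat to electricity
```

That ~27% is in the same range as a typical gasoline engine's thermal efficiency, which is the basis of the headline comparison.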

Extracellular matrix from pigs is regenerating muscle to enable wounded veterans to walk again

Researchers had the idea of giving wounded muscle cells a healing boost with a substance that normally surrounds cells — the extracellular matrix. "The matrix can be thought of simply as the glue that holds all of the different cells in different tissues together," Badylak says. "However, in addition, there are all these hidden signals in the matrix that instruct the cells on what to do."

The researchers surgically transplanted a quilt of matrix material derived from pig bladders into the legs of patients whose muscles had been partially destroyed. Badylak started with five patients — Strang and four other men who were disabled despite the best physical therapy and the best that medicine had to offer.

Before the experimental treatment, "some of them could not get out of a chair without help," Badylak says. "Some of them walked with a cane. ... This was not just a mild loss of strength. They had real problems."

After successful treatment with the matrix, one patient "now [rides] mountain bikes and does jumping jacks."

Another patient's recovery hasn't been quite that dramatic. But his limp is gone. He can walk without a cane and hardly ever falls anymore. In short spurts, he can even run.

They have treated about a dozen patients, and have plans to try the technique with dozens more, hoping that, if all goes well, many doctors will be able to use the same approach to help many more patients whose muscles have been destroyed.

Science Translational Medicine - An Acellular Biologic Scaffold Promotes Skeletal Muscle Formation in Mice and Humans with Volumetric Muscle Loss

Brown dwarf stars with Arctic temperatures between -48 and -13 degrees Celsius

There could be a whole category of nearby, cool stellar objects that could be referred to as 'Luhman objects'. The newly found brown dwarf is thought to be between -48 and -13 degrees Celsius, colder than previous record holders, which were found to be close to room temperature. WISE imagery from 2010 was confirmed by two additional images taken by Spitzer in 2013 and 2014, with further observations at the Gemini South telescope on Cerro Pachon in Chile. The WISE and Spitzer data were used to measure the distance to the object via parallax: it turns out to be 7.2 light years away, fitting nicely into the chart below, which shows the Sun's immediate neighborhood.

(H/T Centauri Dreams)

This diagram illustrates the locations of the star systems closest to the sun. The year when the distance to each system was determined is listed after the system’s name. NASA’s Wide-field Infrared Survey Explorer, or WISE, found two of the four closest systems: the binary brown dwarf WISE 1049-5319 and the brown dwarf WISE J085510.83-071442.5. NASA’s Spitzer Space Telescope helped pin down the location of the latter object. The closest system to the sun is a trio of stars that consists of Alpha Centauri, a close companion to it and the more distant companion Proxima Centauri. Credit: NASA/Penn State University.

Arxiv - Discovery of a ~250 K Brown Dwarf at 2 pc from the Sun (8 pages)
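The parallax distance mentioned above follows from the standard relation p = 1/d, with p in arcseconds and d in parsecs. A quick check that 7.2 light years (~2.2 pc, matching the paper's "2 pc" title) corresponds to a roughly half-arcsecond annual parallax:

```python
LY_PER_PARSEC = 3.2616  # light years per parsec

def parallax_arcsec(distance_ly):
    """Annual parallax in arcseconds: p = 1 / d[parsecs]."""
    return 1.0 / (distance_ly / LY_PER_PARSEC)

print(parallax_arcsec(7.2))  # ~0.45 arcsec, large enough for WISE/Spitzer astrometry
```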

There appear to be no economic miracle countries after China, and if the candidate countries do not make big progress they will be trapped being old and poor

Various economic forecasts do not show any country consistently growing faster than China's slowed 6-7% GDP growth pace. China had mostly 9-14% GDP growth for 40 years, and it took that long to reach 25% of US per capita GDP.

The Conference Board forecasts the economies of China and India will slow from 2014-2019 to 2020-2025 (from 5.9 to 3.5 percent in China and 4.8 to 3.6 percent in India). Overall, emerging economies’ growth will slow to 3.2 percent on average during 2020-2025 from 4.3 percent during 2014-2019.

Citi's economic forecasts see maybe 6% GDP growth for India and Indonesia. Countries with younger demographics would need 9-12% GDP growth to come anywhere near matching China's economic catchup, and the higher growth rate, policy improvements and infrastructure buildout would have to be sustained for at least two to three decades.

6% GDP growth in the better-performing countries of Asia and Africa is obviously better than doing worse, but for developing countries with 1-2% population growth it is barely beyond stagnation. They are doing little better than treading water against developed countries.

Demographic Clock

In 30-50 years, these countries in Asia and Africa appear on track to lose the demographic advantage of young and growing populations. They will then face the big problem that China is barely avoiding: being old and poor as a nation.

Indonesia and India with 5-6% GDP growth will take until about 2030 to catch up to Egypt's per capita PPP GDP.

The bottom half of countries are not growing robustly. Africa's population is still increasing strongly, while Asia's population is flattening out. The Asian countries without breakout economic growth will be stuck with old and relatively poor populations. Less than half the world's population was economically behind China in 2011, but that population is growing.

If India can improve economic policy and Modi wins decisively then India might have 6.5% GDP growth for 5 years

Crisil Research in its recent report says it expects the Indian economy to grow by 6.5 percent annually between 2014-15 and 2018-19. However, it clearly states this kind of growth will be possible only if there is a decisive mandate in the Lok Sabha polls. This is good news for the Indian economy considering the International Monetary Fund (IMF) has predicted around 4 percent global growth for calendar years 2014-18. However, GDP growth at 6.5 percent is in no way close to the 9 percent growth seen 2003-04 to 2010-11, which was briefly hit by the global financial crisis in 2008-09.


China has been catching up economically with double digit GDP growth in most years for the last 40 years and China is still only 25% of the per capita income of the USA.

India with 6.5% GDP growth and 1.0-1.2% population growth will not be catching up with China and will barely be making any progress compared to the USA at 2.0-3.5% GDP growth. It would take about 50-70 years for India to double its relative GDP compared to the USA if India is struggling to get to 6.5% GDP growth.

The World Bank International Comparison Program of 2011 indicated that India had 37% of the US economy in 2011 in PPP (purchasing power parity) terms. However, closing the remaining gap of roughly 2.7 times will be difficult.
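The 50-70 year doubling estimate above can be unpacked with the standard compound-growth doubling formula. Note that a 50-70 year doubling of relative GDP corresponds to a sustained effective growth differential of only about 1 to 1.5 percentage points, which is the article's implicit assumption for an India struggling to hold 6.5%:

```python
import math

# Years for the ratio of two compounding economies to double:
# years = ln(2) / ln((1 + g_fast) / (1 + g_slow)).
def years_to_double_relative(g_fast, g_slow):
    return math.log(2) / math.log((1 + g_fast) / (1 + g_slow))

# The article's 50-70 year range implies a sustained growth differential
# of roughly 1 to 1.5 percentage points:
print(years_to_double_relative(0.030, 0.020))  # ~71 years at a 1-point gap
print(years_to_double_relative(0.035, 0.020))  # ~47 years at a 1.5-point gap
```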

April 30, 2014

For the first time since Abraham Lincoln was President, the United States will not be the World's Largest Economy

After about 150 years, China is again the largest economy in the world. The US has been the largest economy in the world since about the time of President Abraham Lincoln or maybe Rutherford Hayes (1880).

7 Score and 14 years ago

Maddison's GDP estimates go back to 1820. Maddison's GDP numbers indicate that China has been the world's largest economy since about 2009. According to the Maddison series, China's 2010 (when the series ends) GDP is $PPP 10.7 trillion, and US GDP is $PPP 9.4 trillion. China overtook the US in 2009, thus ending a period that began around 1860, when the US overtook….whom? China!, to become the number one world economy.

Note: some economic historians believe the US became the number one economy in 1872 and took over that title from the UK. GDP estimates of today can be off by plus or minus 20-40%, as the 25% adjustment shows, so there is even more difficulty estimating GDP 150 years ago. However, Maddison and other economic historians did thorough work, and so did those involved in the 2011 ICP.

Now the World Bank, IMF and University of Pennsylvania (Penn World Tables) are broadly agreeing on the PPP GDP.

History of Purchasing Power Parity

Purchasing power parity (PPP) is a disarmingly simple theory that holds that the nominal exchange rate between two currencies should be equal to the ratio of aggregate price levels between the two countries, so that a unit of currency of one country will have the same purchasing power in a foreign country. The PPP theory has a long history in economics, dating back several centuries, but the specific terminology of purchasing power parity was introduced in the years after World War I during the international policy debate concerning the appropriate level for nominal exchange rates among the major industrialized countries after the large-scale inflations during and after the war (Cassel, 1918). Since then, the idea of PPP has become embedded in how many international economists think about the world. For example, Dornbusch and Krugman (1976) noted: “Under the skin of any international economist lies a deep-seated belief in some variant of the PPP theory of the exchange rate.”
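Absolute PPP as defined in the quote reduces to a single ratio of price levels. A minimal illustration with made-up basket prices (the numbers below are hypothetical, not from the ICP):

```python
# Absolute PPP: the implied PPP exchange rate is the ratio of price levels,
# e_ppp = P_home / P_foreign. The prices below are made up for illustration.
def ppp_exchange_rate(price_home, price_abroad):
    """Units of home currency per unit of foreign currency."""
    return price_home / price_abroad

# If a fixed basket costs 25 yuan in China and 5 dollars in the US,
# PPP implies 5 yuan per dollar:
print(ppp_exchange_rate(25.0, 5.0))  # 5.0
# If the market rate is higher (more yuan per dollar), yuan-priced GDP
# converts to more dollars at PPP than at market rates, which is the
# source of the large upward PPP adjustments for China and India.
```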

There is a 24 page paper on "The Purchasing Power Parity Debate"

The consensus view of the PPP debate is that short-run PPP does not hold, but that long-run PPP may hold in the sense that there is significant mean reversion of the real exchange rate.
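"Significant mean reversion" is conventionally quantified as the half-life of a deviation from PPP. A sketch assuming deviations decay as an AR(1) process; the persistence value used is a commonly cited ballpark from that literature, not a figure from this post:

```python
import math

# If real-exchange-rate deviations from PPP decay as an AR(1) process with
# annual persistence rho, the half-life of a shock is ln(0.5) / ln(rho).
# rho = 0.85 is a ballpark from the PPP literature, not from this post.
def half_life_years(rho):
    return math.log(0.5) / math.log(rho)

print(half_life_years(0.85))  # ~4.3 years; consensus estimates run 3-5 years
```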

New Chart of Estimated purchasing power parity GDP for the USA and China

The Economist has a new chart of US vs China Purchasing power parity GDP based upon the new 2011 comparison of pricing



April 29, 2014

IMF and World Bank correct purchasing power parity GDP by about 25% for China and India, which adds a Germany's worth of GDP to China and makes China number one in PPP GDP in 2014

The 2011 World Bank International Comparison Program (ICP) estimates are the most authoritative estimates of what money can buy in different countries. In 2005, the ICP put China's economy at 43 per cent of US GDP. Because of the new methodology, and because China's economy has grown much more quickly, the research placed China's GDP at 87 per cent of the US's in 2011. With the IMF expecting China's economy to grow 24 per cent between 2011 and 2014 while the US expands only 7.6 per cent, China is likely to overtake the US this year. Also, Hong Kong and Macau were not included in China's total; they would add about $450 billion.
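The overtaking claim follows directly from the quoted figures:

```python
# Arithmetic behind "China is likely to overtake the US this year",
# using only the figures quoted above.
china_share_2011 = 0.87        # of US GDP, per the 2011 ICP
china_growth_2011_2014 = 0.24  # IMF, cumulative
us_growth_2011_2014 = 0.076    # IMF, cumulative

china_share_2014 = china_share_2011 * (1 + china_growth_2011_2014) / (1 + us_growth_2011_2014)
print(round(china_share_2014, 3))  # ~1.003 -- just past parity in 2014
```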

The 104-page summary of the 2011 World Bank International Comparison Program is here. The final report will be available in June 2014.

India becomes the third-largest economy having previously been in tenth place. The size of its economy almost doubled from 19 per cent of the US in 2005 to 37 per cent in 2011.

Using the 2005 purchasing power parity figures, China had 75% of US GDP in 2012 and about 70% in 2011; the 2011 PPP numbers are a 25% increase. The Telegraph UK's Ambrose Evans-Pritchard and Clyde Prestowitz both argued, in 2013 articles, that China would never catch the USA on GDP (even PPP GDP). They were wrong.

Ben Chu wrote less than two months ago: "checking the International Monetary Fund's latest forecasts I [Ben Chu] noticed that the great oriental sorpasso [China passing the US on PPP GDP] has, apparently, been put on hold." What happened to China overtaking the US? It happens this year, in 2014, with the 2011 ICP PPP GDP adjustments. The 2005 comparisons were wrong because they used the higher prices of major cities like Shanghai.

Under the 2005 PPP, the IMF put China's 2014 per capita GDP at $10,695; with the 2011 PPP it becomes about $13,300, close to Brazil's adjusted per capita PPP GDP.

China would have about $20,000 per capita PPP GDP in 2019 which is about the level of Mexico.



Ghost Ship Nuclear Fusion Spaceship Concept Design

Project Icarus held a competition to create interstellar concept designs. The outline parameters were based on the project ToRs (Terms of Reference): a mainly fusion-propelled spacecraft on a 100-year mission with up to 150 tonnes of payload (given the unavoidable size of your typical fusion spacecraft).

The winner was the Ghost Ship.

The Ghost Ship uses a single fusion propulsion stage for both acceleration and deceleration. Deceleration is further supported by a magnetic sail system, which uses the drag of interstellar hydrogen acting on a magnetic field to slow the spacecraft. The fusion propulsion system is based on Deuterium-Deuterium inertial confinement fusion, in which a tiny pellet of fusion fuel is compressed by an ignition system, in this case a number of lasers, to such a degree that fusion can occur. The team chose Deuterium-Deuterium because Deuterium-Tritium would require large amounts of Tritium, which decays quite rapidly, meaning a prohibitively large amount would have to be stored on board the spacecraft. Deuterium-Helium 3 was discarded due to the difficulties of mining Helium 3 from the Moon or the gas giant planets.
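For a sense of why D-D fuel makes such a mission thinkable, here is a rough energy-density estimate from the two D-D branch energies. These are standard nuclear data rather than figures from the Icarus design, and secondary burn of the tritium and helium-3 products is ignored:

```python
# Rough energy density of deuterium-deuterium fusion fuel.
# The two D-D branches release 4.03 MeV (T + p) and 3.27 MeV (He3 + n),
# averaging ~3.65 MeV per reaction; secondary burn of T and He3 is ignored.
AVOGADRO = 6.022e23       # atoms per mole
MEV_TO_J = 1.602e-13      # joules per MeV
D_MOLAR_MASS_G = 2.014    # grams per mole of deuterium

deuterons_per_kg = 1000.0 / D_MOLAR_MASS_G * AVOGADRO
reactions_per_kg = deuterons_per_kg / 2       # two deuterons per reaction
energy_J_per_kg = reactions_per_kg * 3.65 * MEV_TO_J
print(f"{energy_J_per_kg:.2e}")  # ~8.7e13 J per kg of deuterium
```

That is millions of times the energy density of chemical propellants, which is what makes a 100-year interstellar mission with a single propulsion stage even conceivable.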

The fusion ignition system is based on the fast ignition scheme. The beauty of this ignition scheme lies in the decoupling of compression and ignition of a fuel pellet. Without decoupling, a lot of energy is needed to create fusion conditions within the pellet purely by compression. It is like igniting a rod of dynamite by pinching it. It is possible but you need to pinch it very strongly. What you use instead is a “fuse”: a secondary high-power laser, which pierces the pellet and ignites it. In this way, you get the same amount of energy out of the pellet by using a lot less energy for compression and ignition.


Tyler Cowen points out Piketty Problems

Tyler Cowen makes the following points about Piketty's book Capital in the Twenty-First Century.

Overall, the main argument is based on two (false) claims. First, that capital returns will be high and non-diminishing relative to other factors, and sufficiently certain to support the capital-returns-greater-than-world-growth (r > g) story as a dominant account of economic history looking forward. Second, that this can happen without significant increases in real wages.

Instead of wealth taxes, a more sensible and practicable policy agenda for reducing inequality would include calls for establishing more sovereign wealth funds, which Piketty discusses but does not embrace; for limiting the tax deductions that noncharitable nonprofits can claim; for deregulating urban development and loosening zoning laws, which would encourage more housing construction and make it easier and cheaper to live in cities such as San Francisco and, yes, Paris; for offering more opportunity grants for young people; and for improving education. Creating more value in an economy would do more than wealth redistribution to combat the harmful effects of inequality.

China is trying to ramp up natural gas and nuclear energy

China is increasing natural gas and nuclear energy usage to reduce (or at least slow the growth of) coal use.

One shorter-term clean-energy target—increasing natural gas to 10% of the power mix by 2020, from about 5% last year—might be achievable. New pipeline supplies from Central Asia, Myanmar and possibly Russia, higher output from China's own offshore reserves, exploitation of its huge onshore deposits—trapped in shale—and an increase in long-haul ship-borne liquefied-natural-gas deliveries could add up to enough gas.

By the end of 2013, China's wind-power capacity exceeded 75 million kilowatts, No. 1 in the world. Its solar-power capacity passed 15 million kilowatts and was growing faster than any other country's, according to Liu Zhenya, chairman of Chinese utility State Grid Corp. Still, China is struggling to meet its 2015 target of getting 11.4% of its electricity from such nonfossil fuels, officials said in December, despite heavy government subsidies.

China aims to raise its nuclear capacity to 200 gigawatts by 2030, from only 14.6 gigawatts last year. But it probably won't reach that goal, energy consultancy Wood Mackenzie forecast in a report Monday—which will offer opportunities for mining companies to supply huge amounts of additional coal to make up the power shortfall.

Technology constraints, inadequate infrastructure for uranium-fuel fabrication and disposal, public opposition to inland nuclear plants, and shortages of qualified personnel all mean a more realistic nuclear capacity in 2030 will be 175 gigawatts.

JET plans deuterium-tritium fusion tests

JET are planning a new DT [deuterium-tritium] campaign aimed at trying to get maximum performance and a new fusion yield record for MCF [Magnetic confinement fusion].

As a bit of context, DT runs are a pain in the neck to run - you have all kinds of hazmat and health physics red tape so doing a DT campaign isn't the norm - and consuming tritium is a problem as there is a finite stock on site at Culham with a facility for recovery and recycling (i.e. there is a finite number of DT shots JET can run without getting a new delivery of tritium, which is difficult and complex). That means DT campaigns are done relatively rarely.

Generally these are used to look at how the population of alphas produced by fusion will drive behaviour in the plasma. Fast particles drive currents and generate micro-instabilities, some good, like the bootstrap current that drives better confinement, some bad. Understanding that, and how it scales, is more important to the technology development right now than seeing if you can get more neutrons out. The experiments to understand that tend to pull in the opposite direction from turning all the dials to maximum: you want to understand how individual variables affect the overall plasma scenario.

Now ITER is moving forward, and they have started to run out of useful things to do with JET. Running a DT campaign using everything that has been learnt since the 80s about improved modes of confinement (like generating improved internal transport barriers) is likely to be the biggest test under close-to-live conditions available until ITER for benchmarking tokamak performance.

April 28, 2014

The World Is Not Running Out of Resources

Matt Ridley gives five reasons the world is not running out of resources

1. More Productive Land

Economists point out that we keep improving the productivity of each acre of land by applying fertilizer, mechanization, pesticides and irrigation. Further innovation is bound to shift the ceiling upward. Jesse Ausubel at Rockefeller University calculates that the amount of land required to grow a given quantity of food has fallen by 65% over the past 50 years, world-wide.

Debate on Inequality and Piketty's Capital

The Piketty and Saez data for the U.S. indicate that between 1979 and 2012, the bottom 90 percent's income dropped by over $3,000. However, the official Census Bureau estimates indicate that the bottom 80 percent of households saw an increase of nearly $3,500. Median income—the income of the household in the middle of the distribution—rose by $2,500.

The Census Bureau figures are superior to the Piketty and Saez estimates when looking below the top ten percent in two ways. First, the measure of income derived from tax returns excludes a significant amount of income, and people below the top are disproportionately recipients of that income. Most importantly, in the United States, most public transfer income is omitted from tax returns. That includes not just means-tested programs for poor families and unemployment benefits, but Social Security. Many retirees in the Piketty-Saez data have tiny incomes because their main source of sustenance is rendered invisible in the data. The Census Bureau figures include some transfers, though even they omit non-cash transfers like food stamps, school lunches, public housing, Medicare, and Medicaid.

One can use the Census Bureau data to estimate trends in market income for households with a head under age sixty (and so unlikely to be retired). Among those with any market income, I find an increase of $3,400 in the median (using the same cost-of-living adjustment as the Census Bureau and Piketty and Saez). This estimate does not include the value of employer-provided health insurance or other fringe benefits and does not include capital gains either.

The second reason that tax return data are inferior to Census Bureau estimates for incomes below the top is that tax returns—or “tax units,” which essentially means potential tax returns if everyone filed—are different from households. The Piketty and Saez data include as tax units all returns filed by dependent teenagers with summer jobs and undergraduates with work-study positions. They count roommates and unmarried partners as separate tax units rather than as one household, ignoring all of the shared living expenses that make living with someone cheaper than living alone. As a consequence, incomes are much lower among tax units than among households.

Incorporating these improvements into the Census Bureau data, we find that median post-tax and -transfer income rose by nearly $26,000 for a household of four ($13,000 for a household of one) between 1979 and 2012. If you don’t like the household-size adjustment, the non-adjusted increase was over $20,000 at the median. If you think that valuing health care as income is problematic, that figure drops to $10,400 under the implausible assumption that third-party health care benefits have no value to households. The income of the bottom 90 percent rose nearly $12,000 under that assumption instead of dropping by $3,000 as in the Piketty and Saez data, and it rose by nearly $21,000 if health benefits are included. For a household of four, median market income for non-elderly households (not counting employer-provided health care as income) rose $9,400.
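The household-size adjustment mentioned here is an equivalence scale. The article's figures ($26,000 for a household of four versus $13,000 for a household of one) are consistent with the common square-root scale, though the exact scale used is an assumption in this sketch:

```python
import math

# Square-root equivalence scale: a household of n people is assumed to need
# sqrt(n) times the income of a single person for the same living standard.
# (A common choice in the inequality literature; assumed here, not stated
# in the article.)
def size_adjusted(income_for_one, household_size):
    """Scale a one-person-equivalent income to a household of given size."""
    return income_for_one * math.sqrt(household_size)

print(size_adjusted(13_000, 4))  # 26000.0 -- matches the household-of-four figure
```

The scale captures shared living expenses: four people do not need four times one person's income, because housing and utilities are shared.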

Piketty seems to draw too strong a conclusion (“terrifying,” in his words) about what continued rising inequality would entail for the bottom 90 percent (at least in the U.S.). Rising income concentration has not been accompanied by stagnation below the top, and there is no reason to think that it will be in the future.

Carnival of Space 351

The Carnival of Space 351 is up at Universe Today

Brownspaceman.com - What are white holes? For now they are just a theory, and possibly all they will ever be. Why is that? Here we take a look at the definition of a white hole and what we understand about them.


Carnival of Nuclear Energy 206

The Carnival of Nuclear Energy 206 is up at Hiroshima Syndrome


Atomic Insights reviews the MIT floating nuclear reactor design

April 27, 2014

Lithium Sulfur batteries last for more charge cycles using a nickel-based metal organic framework cathode

Today's electric vehicles are typically powered by lithium-ion batteries. But the chemistry of lithium-ion batteries limits how much energy they can store. As a result, electric vehicle drivers are often anxious about how far they can go before needing to charge. One promising solution is the lithium-sulfur battery, which can hold as much as four times more energy per mass than lithium-ion batteries. This would enable electric vehicles to drive farther on a single charge, as well as help store more renewable energy. The down side of lithium-sulfur batteries, however, is they have a much shorter lifespan because they can't currently be charged as many times as lithium-ion batteries. Pacific Northwest National Laboratory researchers have developed a nickel-based metal organic framework to hold onto polysulfide molecules in the cathodes of lithium-sulfur batteries and extend the batteries’ life spans.

(H/T New Energy and Fuel)

Metal organic frameworks — also called MOFs — are crystal-like compounds made of metal clusters connected to organic molecules, or linkers. Together, the clusters and linkers assemble into porous 3-D structures. MOFs can contain a number of different elements. PNNL researchers chose the transition metal nickel as the central element for this particular MOF because of its strong ability to interact with sulfur.

During lab tests, a lithium-sulfur battery with PNNL's MOF cathode maintained 89 percent of its initial power capacity after 100 charge-and-discharge cycles. Having shown the effectiveness of their MOF cathode, PNNL researchers now plan to further improve the cathode's mixture of materials so it can hold more energy. The team also needs to develop a larger prototype and test it for longer periods of time to evaluate the cathode's performance for real-world, large-scale applications.
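The 89-percent figure can be translated into an average per-cycle fade rate. A minimal sketch, assuming capacity loss is uniform across cycles (real fade curves are usually steeper early on):

```python
# 89% capacity remains after 100 charge-and-discharge cycles.
# Assuming uniform fade, find the implied average loss per cycle.
retention_after_100 = 0.89
per_cycle_retention = retention_after_100 ** (1 / 100)
print(f"~{1 - per_cycle_retention:.3%} capacity lost per cycle")
```

At roughly a tenth of a percent lost per cycle, extrapolating the same rate to 500 cycles would leave about 56% of the original capacity, which is why the team still wants longer-duration testing.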

PNNL is also using MOFs in energy-efficient adsorption chillers and to develop new catalysts to speed up chemical reactions.

"MOFs are probably best known for capturing gases such as carbon dioxide," Xiao said. "This study opens up lithium-sulfur batteries as a new and promising field for the nanomaterial."


A new, PNNL-developed nanomaterial called a metal organic framework could extend the lifespan of lithium-sulfur batteries, which could be used to increase the driving range of electric vehicles. Image courtesy of Pacific Northwest National Laboratory.

Nano Letters - Lewis Acid–Base Interactions between Polysulfides and Metal Organic Framework in Lithium Sulfur Batteries


DARPA Progress to enable assembly of more flexible, scalable and cost-effective space systems on orbit

The process of designing, developing, building and deploying satellites is long and expensive. Satellites today cannot follow the terrestrial paradigm of “assemble, repair, upgrade, reuse,” and must be designed to operate without any upgrades or repairs for their entire lifespan—a methodology that drives size, complexity and ultimately cost. These challenges apply especially to the increasing number of satellites sent every year into geosynchronous Earth orbit (GEO), approximately 22,000 miles above the Earth. Unlike objects in low Earth orbit (LEO), such as the Hubble Space Telescope, satellites in GEO are essentially unreachable with current technology.

DARPA created the Phoenix program to help address these daunting challenges. Phoenix seeks to change the current paradigm by enabling GEO robotics servicing and asset life extension, while developing new satellite architectures to reduce the cost of space-based systems. Specifically, Phoenix’s goal is to develop and demonstrate technologies that make it possible to inspect and robotically service cooperative space systems in GEO and to validate new satellite assembly architectures. Phoenix has achieved promising Phase 1 results and has awarded eight companies prime contracts for its Phase 2 efforts.



DARPA developing UAVs to provide 1 gigabit per second backbone to the front lines

UAV Mobile Hotspots program makes progress toward goal of providing 1 Gb/s communications backbone to deployed units.

Missions in remote, forward operating locations often suffer from a lack of connectivity to tactical operation centers and access to valuable intelligence, surveillance, and reconnaissance (ISR) data. The assets needed for long-range, high-bandwidth communications capabilities are often unavailable to lower echelons due to theater-wide mission priorities. DARPA’s Mobile Hotspots program aims to help overcome this challenge by developing a reliable, on-demand capability for establishing long-range, high-capacity reachback that is organic to tactical units. The program is building and demonstrating a scalable, mobile millimeter-wave communications backhaul network mounted on small unmanned aerial vehicles (UAVs) and providing a 1 Gb/s capacity. DARPA performers recently completed the first of three phases in which they developed and tested key technologies to be integrated into a complete system and flight tested in subsequent phases.

“We’re pleased with the technical achievements we’ve seen so far in steerable millimeter-wave antennas and millimeter-wave amplifier technology,” said Dick Ridgway, DARPA program manager. “These successes—and the novel networking approaches needed to maintain these high-capacity links—are key to providing forward deployed units with the same high-capacity connectivity we all enjoy over our 4G cell-phone networks.”



DARPA's Chip-Sized Digital Optical Synthesizer to Aim for Routine Terabit-per-second Communications

DARPA’s new Direct On-chip Digital Optical Synthesizer program seeks to do with light waves what researchers in the 1940s achieved with radio microwaves. Currently, optical frequency synthesis is only possible in laboratories with expensive racks of equipment. If successful, the program would miniaturize optical synthesizers to fit onto microchips, opening up terahertz frequencies for wide application across military electronics systems and beyond.

“The goal of this program is to make optical frequency synthesis as ubiquitous as microwave synthesis is today,” said Robert Lutwak, DARPA program manager. “There are significant challenges, but thanks to related DARPA programs POEM, Quasar, ORCHID, PULSE and E-PHI and other advanced laboratory research, technology is at the tipping point where we’re ready to attempt miniaturization of optical frequency synthesis on an inexpensive, small, low-power chip.”

The basic concept is to create a “gearbox” on a chip that produces laser light with a frequency that is a precise multiple of a referenced radio frequency, such as is readily available within most existing DoD and consumer electronic systems. The ability to control optical frequency in a widely available microchip could enable a host of advanced applications at much lower cost, including:

* High-bandwidth (terabit per second) optical communications
* Enhanced chemical spectroscopy, toxin detection and facility identification
* Improved light detection and ranging (LiDAR)
* High-performance atomic clocks and inertial sensors for position, navigation and timing (PNT) applications
* High-performance optical spectrum analysis (OSA)
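The "gearbox" concept above amounts to locking an optical carrier to an exact multiple of an RF reference. A minimal sketch of the scale of that multiplication, where the 10 MHz reference and 1550 nm telecom wavelength are illustrative assumptions rather than program specifics:

```python
# How many times must an RF reference be "multiplied up" to reach
# an optical carrier? Illustrative values, not DODOS specifics.
c = 299_792_458           # speed of light, m/s
wavelength = 1550e-9      # telecom-band wavelength, m (assumed)
f_ref = 10e6              # typical lab RF reference, Hz (assumed)

f_optical = c / wavelength            # about 193 THz
multiple = f_optical / f_ref          # about 19 million
print(f"optical carrier ~{f_optical / 1e12:.1f} THz, "
      f"~{multiple:.2e} x the 10 MHz reference")
```

That factor of roughly twenty million between RF and optical frequencies is exactly the gap the program's on-chip synthesizer would have to bridge with a precise, stable multiplication ratio.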

Today, optical communications employ techniques analogous to those of pre-1940 AM radio, due to the inability to control frequency precisely at optical frequencies, which are typically 1,000 times higher than microwaves. The higher frequency of light, however, offers potential for 1,000-fold increase in available bandwidth for communications and other applications.