March 14, 2009

Rethinking Financial Regulation: Simple Transparency, Open Source and XML

Even if you have open access to current public information, the result is not truly transparent; the warehouse that holds the Lost Ark at the end of Indiana Jones is open, but nothing in it can be found. We need open public policing of the financial system: an empowered social network in which the community of investors can help protect each other from being ripped off, a global financial neighborhood watch. Wired Magazine has published a specific open-regulation proposal.

The SEC's public document database, Edgar, now catalogs 200 gigabytes of filings each year—roughly 15 million pages of text—up from 35 gigabytes a decade ago.

But the volume of data obscures more than it reveals; financial reporting has become so transparent as to be invisible. Answering what should be simple questions—how secure is my cash account? how much of my bank's capital is tied up in risky debt obligations?—often seems to require a law degree, as well as countless hours to dig through thousands of pages of documents. Undoubtedly, the warning signs of our current crisis—and the next—are online somewhere in all those filings, but good luck finding them.

1. We must require public companies and all financial firms to report more granular data online—and in real time, not just quarterly—uniformly tagged and exportable into any spreadsheet, database, widget, or Web page. The era of sunlight has to give way to the era of pixelization; only when we give everyone the tools to see each point of data will the picture become clear.

Every bank that issues mortgage-backed securities—pools of home loans packaged together and sold as a single entity—is required to file a free writing prospectus (FWP), which lists every individual mortgage in each pool. An FWP contains endless columns of raw data, most of which do not even track from page to page, and every FWP is formatted differently. It took five professionals three months to take FWP data and standardize it so that it could be compared. The result of that work is private: it is available only to the company that hired those professionals, so that the hiring company has an edge.
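The standardization work described above amounts to mapping each issuer's idiosyncratic FWP columns onto one shared schema. Here is a minimal sketch of that step; the issuer names, column labels and loan fields are hypothetical, not any real filer's layout.

```python
import csv
import io

# Hypothetical per-issuer column mappings: each issuer labels the same
# loan-level fields differently in its free writing prospectus (FWP).
COLUMN_MAPS = {
    "issuer_a": {"orig_bal": "balance", "fico": "credit_score", "ltv_pct": "ltv"},
    "issuer_b": {"OriginalBalance": "balance", "BorrowerFICO": "credit_score", "LTV": "ltv"},
}

def normalize_fwp(issuer, raw_csv):
    """Map one issuer's raw FWP rows onto the shared schema."""
    mapping = COLUMN_MAPS[issuer]
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({std: float(row[src]) for src, std in mapping.items()})
    return rows

raw_a = "orig_bal,fico,ltv_pct\n250000,680,80\n"
raw_b = "OriginalBalance,BorrowerFICO,LTV\n310000,720,75\n"
pool = normalize_fwp("issuer_a", raw_a) + normalize_fwp("issuer_b", raw_b)

# With both issuers in one schema, pools become directly comparable:
avg_fico = sum(r["credit_score"] for r in pool) / len(pool)
print(avg_fico)  # 700.0
```

The point of the sketch is that the mapping tables are the expensive part; once they are public, anyone can compare pools instead of one hired team doing it privately.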

2. Today, nearly 50 companies report their information to the SEC in XBRL, the eXtensible Business Reporting Language, a variant of XML.

A few years ago, when banking regulators started requiring XBRL filings from member banks, they found that the time it took auditors to review a bank's quarterly financial information dropped from about 70 days to two. More regulators are catching on: last December, the SEC announced that by June, every company with a market capitalization over $5 billion will be required to submit all filings in the format. All publicly traded companies and mutual funds must follow suit by 2011.

Open Source Lending Enables Lower Default Rates

Lending Club publishes data on its customers and posted the formula it uses to measure default risk and determine the interest rates its borrowers pay. After receiving a slew of suggestions, the site's engineers modified the equation, assigning less weight to debt-to-income ratio, for instance. Lending Club's default rate is a staggeringly low 2.7 percent (versus nearly 5.5 percent for prime credit cards).
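Lending Club's real formula is not reproduced here; the fields and weights below are invented for illustration. But the open-formula idea is straightforward, and the community-suggested change in the article corresponds to turning down the debt-to-income weight:

```python
def interest_rate(credit_score, debt_to_income, delinquencies,
                  dti_weight=0.5):
    """Toy risk-based pricing rule (illustrative weights only, not Lending Club's).

    Base rate plus penalties for risk factors; the change the article
    describes corresponds to lowering dti_weight.
    """
    base = 8.0  # percent
    score_penalty = max(0.0, (720 - credit_score) * 0.02)
    dti_penalty = debt_to_income * dti_weight * 10.0
    delinq_penalty = delinquencies * 1.5
    return round(base + score_penalty + dti_penalty + delinq_penalty, 2)

# Lower DTI weight: the same borrower is penalized less for debt-to-income.
print(interest_rate(680, 0.30, 1, dti_weight=0.5))  # 11.8
print(interest_rate(680, 0.30, 1, dti_weight=0.3))  # 11.2
```

Because the formula is published, borrowers and lenders can rerun it on real data and argue about the weights, which is exactly the feedback loop the article credits for the low default rate.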

XBRL is a language for the electronic communication of business and financial data which is revolutionising business reporting around the world. It provides major benefits in the preparation, analysis and communication of business information. It offers cost savings, greater efficiency and improved accuracy and reliability to all those involved in supplying or using financial data.

XBRL is one of a family of XML languages that is becoming a standard means of communicating information between businesses and over the internet.
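As a concrete sketch of what uniform XML tagging buys: the fragment below imitates an XBRL instance (real filings use namespace-qualified taxonomy elements such as us-gaap:Assets plus context and unit declarations; these simplified tags are illustrative only), and a few lines of code can extract comparable facts from it.

```python
import xml.etree.ElementTree as ET

# Simplified XBRL-style instance fragment. A real filing would use
# namespaced taxonomy tags; the structure of "tagged fact + context +
# unit" is the same idea.
instance = """
<xbrl>
  <Assets contextRef="FY2008" unitRef="USD" decimals="0">1200000</Assets>
  <Liabilities contextRef="FY2008" unitRef="USD" decimals="0">700000</Liabilities>
</xbrl>
"""

root = ET.fromstring(instance)
facts = {el.tag: int(el.text) for el in root}

# Once every filer tags facts the same way, derived figures are one line:
equity = facts["Assets"] - facts["Liabilities"]
print(equity)  # 500000
```

This is the contrast with the FWP situation above: tagged filings are machine-comparable on arrival, with no three-month standardization project.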

94th Carnival of Space

The 94th Carnival of Space is up at Out of the Cradle

This site contributed the article about the many space stations and space station modules that China is planning from 2010 to 2020.

The National Space Society Blog asks "Can Space Save the American and Global Economy?" The Space Advocate makes the case:

I am not talking about a quick fix, 10 year program like going to the moon in the 60’s. I am talking about a permanently established infrastructure system dedicated to the infinite reaches of space.

My solution tackles a few of the major issues confronting America today, such as jobs, education, health care and international relations.

Mass production is what drives prices down.

For example, the rovers Spirit and Opportunity cost a few hundred million dollars each just to design and build, but how much would that cost drop if you mass-produced them, assembly-line fashion, with "plug-in" ports to hold vast arrays of scientific instruments? If you punched out a few hundred rovers a year in this format, the cost would drop significantly. Now marry this idea with long-range rockets designed the same way (mass-produced) and we could have rovers and probes exploring our solar system by the hundreds every year, at costs no more drastic than maintaining commercial airliners. Can you imagine the amount of science and discovery that would provide just in and of itself?

Centauri Dreams looks at space travel up to a century out.

Check out the 94th Carnival of Space for a lot more articles on Mars and Black holes and much more.

SENS Anti-aging Progress

From Future Currents by Jeriaska: at the BIL unconference in February, 2009, Chief Science Officer Aubrey de Grey gave an overview of the research projects that the organization is now funding, their significance to SENS, and their potential to lead to accelerated progress towards the defeat of aging in 2009 and beyond.

There is a new Methuselah Foundation Undergraduate Research Initiative.

These are the four projects that are really new at the SENS foundation.

1. A fantastic, originally Serbian immunologist called Janko Nikolich-Zubich, who is a prominent gerontologist and works in Tucson at the University of Arizona, has become very interested in the possibility of being more ambitious about repairing and rejuvenating the immune system than anyone has previously been. There are two major things that go wrong with the immune system during aging and they fall into two of seven categories that I always talk about. People have been exploring these things in isolation in a somewhat halfhearted sort of way for quite some time, but no one has had the balls to do them together.

I have managed to persuade Janko to do this. He is basically applying a combination therapy to mice whose immune systems are going downhill because of aging and seeing whether the immune systems can be really rejuvenated so that the mice are better at resisting infection, getting back to where they were in early adulthood. It is a reasonably long project, as is more or less any project involving the aging of mice, but it is already underway. It is being funded by the Methuselah Foundation and we are extremely happy about it.

2. Jan Vijg is a professor at Albert Einstein in New York and another very prominent gerontologist. He is another strong supporter of the general principle that I have been putting forward for the past two years. His main theme is the accumulation of nuclear mutations. (This is not mutations in our mitochondria, but mutations in our chromosomes.) He has for some time taken the view that the accumulation of non-specific mutations, mutations that randomly disable some aspect of the cell’s function, may actually be a major driving force in the rate of aging. I think he’s wrong. If I thought he was right, then I would be much more pessimistic about our ability to do much about aging anytime soon, because let’s face it, it’s pretty tricky to mend mutations in the nucleus. We are probably going to need molecular nanotechnology before we can do that.

I think we are lucky that actually the only thing that really matters as a consequence of nuclear mutations is cancer. As many of you who are familiar with my work will remember, I have a specific approach to dealing with cancer. My take is, if we can really get that working, then all the other things that the accumulation of nuclear mutations might cause will not actually matter for a long time—several times a currently normal lifespan. We can afford not to worry about that for a little while.

Jan, as I said, disagreed with me on this. He claims the mutations may matter in a normal lifespan, but he is such a damn good scientist and such a nice guy that he is doing an experiment that he did not want to do, and he’s doing it for me. Basically, he is having a look at the brain to see whether the brain accumulates what we like to call epimutations. These are not changes to the DNA sequence, but changes to the decorations of that sequence: things like methylation of histones, methylation of CpG dinucleotides. These are things that cause changes to which proteins are actually expressed from the genome, as opposed to which proteins could be expressed.

Jan found some years ago that in the brain actual bona fide mutations do not accumulate at all in mice during the whole of adulthood. They accumulate during growth, but after it the mouse gets to full size and nothing happens. Of course, if nothing is happening, then it cannot matter in aging. For this reason, it is very important to determine whether the same is true for epimutations. I think it probably will be, and he is having a go. If he finds that there is some change, then of course we have to find out whether there is enough change to actually matter. That is a whole other set of experiments.

That, again, is underway now. I should say that for both these projects, the people actually doing the work are not the professors, but the people working for the professors. In both cases we have an absolutely splendid person that each of these people have brought in. I am really happy that two accomplished and talented post-doctoral fellows are actually doing the experimental legwork here. I am delighted at how these projects have got going over the past few months since we started funding them. It really is happening.

3. Now, demography is not usually listed in my seven strands of what we need to deal with in order to defeat aging. However, as all of you know who have tried to talk to people who are not terribly persuaded that this would be a good idea, it is really important to think about the social consequences of defeating aging so as to be able to call the bluff of the idiots who actually say that aging is a good thing. Of course the biggest reservation that people normally have when you talk about defeating aging is “Oh dear, there will be lots of people, won’t there? That will be terrible.”

I have lots of fairly sarcastic answers to this, of course. However, it would be nice to actually have some data. What we have decided to do is create a really authoritative study, along with some software that can be used as the substrate for creating other studies, that will allow us to see what the demographic consequences would be of developing therapies that generally knocked aging on its head. This would be on the basis of various other permutations and assumptions, like how rapidly technology accommodates increasing population, how rapidly the birthrate declines, how rapidly these therapies spread around the world when they actually arrive, all those things.

The Gavrilovs, Natalia and Leonid, are professors at the University of Chicago. They are, again, very good supporters of this whole mission. They are some of the most pro-anti-aging, so to speak, demographers in the world, in marked contrast to other demographers. They are doing this work for us, and furthermore, it is not very expensive. It does not involve test tubes. I put this down on the bottom rather cheekily, because we have not strictly speaking signed the contract yet. We will probably do so next week.

4. Lenhard Rudolph is a professor in Ulm, a city in Germany. He is one of the experts in the manipulation of telomerase in mice, the enzyme that maintains the length of chromosomes. He has done some good work in this area over the years. In particular, he is the world’s leading specialist on the manipulation of mice with respect to the blood. Now, the blood is a tissue which is going to suffer if we do what I think we need to do in order to really defeat cancer. It is going to have side effects on the ability of stem cells in the blood to actually continue to work indefinitely.

The way that I propose we are going to get around that is by periodic replenishment of the bone marrow through the stem cells to the blood, but we need to determine that this is actually going to work. This is the world’s best person to do this experiment, and he is going to do it. I am pretty happy that all these things are happening, and it is mainly because of Peter Thiel. I bow down to his generosity.

March 13, 2009

New E-Bomb and EMP Device Details

There are several new approaches to E-bombs and EMP devices. The Army is creating hybrid devices by adding e-bombs to regular explosives.

1. There are shockwave ferromagnetic generators for e-bomb and EMP devices. This is a magnet that blows up and spontaneously demagnetizes, releasing energy as a pulse of power. The effect is known as pressure-induced magnetic phase transition, and only occurs with some types of magnets in certain situations.

The researchers moved on to more exotic lead zirconate titanate, a ferroelectric ceramic. This enabled them to reduce the volume of the power generator from 50 cu. cm. (3 cu. in.) to 3 cu. cm., excluding explosives. Army requirements call for fitting the power generator, power conditioning and antenna into a 1-in. space. Power output will be measured in hundreds of megawatts, sustained for microseconds.

2. There are also completely explosive ultracompact high-voltage nanosecond pulse-generating systems. They are one-fifth of a cubic inch.

3. Allen Stults of Amrdec is using the jet of ionized plasma produced by the explosion as an antenna.

The new munitions will have two crucial advantages over previous e-bombs: they are small, and they should not cause electronic "friendly fire" casualties hundreds of meters away. And because they retain the same blast, fragmentation and armor-piercing properties as conventional rounds, commanders can be confident that they are not wasting space carrying rounds that might have no effect.

An enhanced warhead could knock out a tank even if it did not penetrate. The vehicle would be left without ignition, communications or other electronics. A warhead would also knock out other electronic systems, including mobile phones used by insurgents to detonate bombs and circuitry in rocket-propelled grenades.

Two candidate munitions for upgrade are the Tow missile and 2.75-in. rockets fired by helicopter. This is unlike previous e-bomb efforts, which have focused on large air-delivered bombs or unitary artillery munitions that cover a large area, what Kopp terms “weapons of electrical mass destruction.”

A small e-bomb will be qualitatively different than larger versions. Radiated power falls off with the square of distance, so a target 3 meters (10 ft.) away receives 100 times the effect of one 30 meters away. An EMP-enhanced Tow missile would produce a pulse strong enough to destroy what it hits, but should not disrupt electronics over a wide area.
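The inverse-square arithmetic behind that claim is easy to check:

```python
def relative_intensity(r_near, r_far):
    """Ratio of received power density at r_near versus r_far.

    Radiated power density falls off with the square of distance,
    so the ratio is (r_far / r_near) squared.
    """
    return (r_far / r_near) ** 2

# A target 3 m away receives 100x the effect of one 30 m away:
print(relative_intensity(3.0, 30.0))  # 100.0
```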

The smallest weapon that the Army is looking to upgrade is the M77 bomblet fired by the Multiple Launch Rocket System (MLRS). A bomblet has a shaped-charge warhead and throws out antipersonnel fragments. Bomblets cover a wide area—one launcher can fire a 12-rocket salvo blanketing an area the size of six football fields—and are used against soft targets. An EMP-enhanced version would cover the same area, providing even destruction over the target zone.

If the M77 can be upgraded, shoulder-launched rockets and similar weapons could be modified to produce an EMP. Small infantry rockets have limited effectiveness against modern armor. An EMP-enhanced round might not penetrate but could provide a “soft kill” capability that immobilizes a vehicle. This damage is hard to repair and would probably require the replacement of electronic systems.

New Metamaterial Nanocups Bring Superlenses, Ultra-Efficient Solar Cells and Invisibility Closer

Gold nanocups at their magnetoinductive resonance have the unique ability to redirect scattered light in a direction dependent on cup orientation, as a true three-dimensional nanoantenna.

Nanocups are just what they sound like: very tiny, cup-shaped particles. What makes them special is their ability to bend light. Halas and Mirin have found a way to make material incorporating nanocups that can bend light in a specific direction.

Redirecting scattered light means none of it bounces off the metamaterial back into the eye of an observer. That essentially makes the material invisible. "Ideally, one should see exactly what is behind an object," said Mirin.

"The material should not only retransmit the color and brightness of what is behind, like squid or chameleons do, but also bend the light around, preserving the original phase information of the signal."

Halas said the embedded nanocups are the first true three-dimensional nano-antennas, and their light-bending properties are made possible by plasmons.

Using nanocup metamaterial to transmit optical signals between computer chips has potential, she said, and enhanced spectroscopy and superlenses are also viable possibilities.

A solar panel that doesn't have to track the sun yet focuses light into a beam that's always on target would save a lot of money on machinery.

Solar-generated power of all kinds would benefit, said Halas. "In solar cells, about 80 percent of the light passes right through the device. And there's a huge amount of interest in making cells as thin as possible for many reasons."

Halas said the thinner a cell gets, the more transparent it becomes. "So ways in which you can divert light into the active region of the device can be very useful. That's a direction that needs to be pursued," she said.

To make light-bending material, the Rice researchers spread polystyrene or latex colloidal particles on a glass slide, evaporate a layer of gold at various angles on top of the particles, deposit a layer of elastomer on top and then, after curing, lift the slab from the substrate with the oriented nanocups embedded.

March 12, 2009

Forbes Counts 332 fewer Billionaires

In 2009, the world's billionaires have an average net worth of $3 billion, down 23% in 12 months. The world now has 793 billionaires, down from 1,125 a year ago.

After slipping in recent years, the U.S. is regaining its dominance as a repository of wealth. Americans account for 44% of the money and 45% of the list's slots, up seven and three percentage points from last year, respectively.

Despite being at the center of the financial crisis, with chaos on Wall Street and turmoil in real estate, New York reclaims the top spot for having the most billionaires in the world. Its wealthiest resident: Mayor Michael Bloomberg, who is worth $16 billion. The city has 55 billionaires with an average net worth of $2.9 billion.

China Can Add Stimulus as Needed

China’s industrial-production growth slowed in the first two months of the year as exports slid at a record pace.

Output rose 3.8 percent in January and February from a year earlier, slowing from a 5.7 percent increase in December, the statistics bureau said today. New lending quadrupled in February to 1.07 trillion yuan from a year earlier, the central bank said.

“The export engine has died: China is in a ‘help themselves’ mode, pump-priming like crazy to increase fixed- asset investment and keep retail spending going,” said Joseph Tan, chief Asian economist at Credit Suisse Private Bank in Singapore. “I think they’re going to pull it off.”

Premier Wen Jiabao is aiming to achieve 8 percent economic growth this year through tax cuts and spending on roads, railways and houses.

China’s construction equipment sales may jump 20 percent in the second half as orders recover on the government stimulus, Lonking Holdings Ltd., the nation’s biggest maker of four- wheeled earthmovers, said this week.

China’s vehicle sales surged 25 percent in February after the government cut taxes on some models. General Motors Corp. has raised its forecast for the auto industry’s sales growth in China this year to as much as 10 percent from less than 3 percent.

“To trigger a full-scale recovery, the government needs to add more stimulus measures to boost consumer spending and the property sector,” said Frank Gong, head of China research at JPMorgan in Hong Kong.

China has “adequate ammunition” to revive the world’s third-biggest economy and can add to its 4 trillion yuan ($585 billion) stimulus package at any time, Premier Wen Jiabao said.

“They may expand the stimulus further, but the government’s still got to assess the effectiveness of the announced package,” said David Cohen, an economist at Action Economics in Singapore. “The world’s cheering them on as one of our few hopes.”

“At any time, we can introduce new stimulus,” Wen said. “We have reserved adequate ammunition.”

The National People’s Congress today approved a record budget deficit for this year to revive growth as revenue growth slows. The central government’s planned 1.18 trillion yuan contribution to the stimulus package announced in November will all be “new” spending, Wen said. Of that amount, 595 billion yuan will be spent this year, he said.

Stimulus figures don’t include 600 billion yuan of tax cuts and 850 billion yuan of health costs, he said.

“As long as the government’s stimulus measures to boost domestic consumption are properly implemented, investment growth will continue to accelerate, making up for the loss of exports,” Ma Jiantang, the head of the statistics bureau said in Beijing today.

Li Yizhong, the Minister of Industry and Information Technology, said today that output growth was set to recover.

Besides the slump in exports, the government faces rising unemployment, falling house prices and the risk of an increase in soured loans. About 11 million migrant workers remain unemployed after returning to China’s cities after a Lunar New Year holiday in January, the government said this week.

Forbes provides more information from Premier Wen.

Materials to Scale Solar Power to Terawatt Levels

A new research paper evaluates material extraction costs and supply constraints for 23 promising semiconducting materials that could be used to affordably scale solar power to terawatt levels. Iron pyrite, copper sulphide and copper oxide were at the top of the list of potential silicon and thin-film replacements, with iron pyrite—more commonly known as Fool's Gold—deemed the leading candidate in terms of both cost and abundance.

Greatly increased penetration of photovoltaics into global energy markets requires expanding attention from high-performance designs to those that can deliver significantly lower cost per kilowatt-hour. Twelve composite material systems were found to have the capacity to meet or exceed the annual worldwide electricity consumption of 17,000 TWh, of which nine have the potential for a significant cost reduction over crystalline silicon. We identify a large material extraction cost (cents/watt) gap between leading thin-film materials and a number of unconventional solar cell candidates including FeS2, CuO, and Zn3P2. We find that devices performing below 10% power conversion efficiency deliver the same lifetime energy output as those above 20% when a 3/4 material reduction is achieved. Here, we develop a roadmap emphasizing low-cost alternatives that could become a dominant new approach for photovoltaics research and deployment.
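The paper's headline metric, material extraction cost in cents per watt, divides the cost of absorber material in a square meter of cell by the peak electrical watts that square meter produces. The numbers below are invented for illustration (they are not the paper's data), but the sketch shows why a lower-efficiency device can still come out ahead once material use drops:

```python
def material_cost_cents_per_watt(cost_usd_per_kg, thickness_m, density_kg_m3,
                                 efficiency, insolation_w_m2=1000.0):
    """Material extraction cost in cents per peak watt for an absorber layer.

    Cost of absorber material per square meter divided by the electrical
    watts generated per square meter under standard 1000 W/m^2 sunlight.
    """
    mass_per_m2 = thickness_m * density_kg_m3        # kg of absorber per m^2
    cost_per_m2 = mass_per_m2 * cost_usd_per_kg      # dollars of material per m^2
    watts_per_m2 = efficiency * insolation_w_m2      # peak electrical W per m^2
    return 100.0 * cost_per_m2 / watts_per_m2        # cents per peak watt

# Hypothetical absorber: $1/kg material, density 5000 kg/m^3.
high_eff = material_cost_cents_per_watt(1.0, 2e-6, 5000, 0.20)    # 2 um layer, 20% cell
low_eff = material_cost_cents_per_watt(1.0, 0.5e-6, 5000, 0.10)   # 1/4 the material, 10% cell
print(low_eff < high_eff)  # True: cheaper per watt despite half the efficiency
```

With a 4x reduction in material and a 2x drop in efficiency, the material cost per watt is halved, which is the shape of the tradeoff the abstract describes.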

Eric Drexler points out that Pyrite (FeS2) is one of the best materials for scaling solar power and Pyrite is also an excellent material for molecular nanotechnology.

Biological examples show that protein molecules can guide crystal growth by selectively binding to crystal surfaces and surface features, and pyrite can grow under conditions that are compatible not just with proteins, but with living organisms. Development of a good crystal-shaping molecular toolkit could provide a route to a useful class of atomically precise fabrication techniques, and pyrite is an attractive target.

Magnetic resonance force microscopy

Magnetic resonance force microscopy (MRFM) is an imaging technique that acquires magnetic resonance images (MRI) at nanometer scales, and possibly at atomic scales in the future.

IBM and Stanford combined ultrasensitive magnetic resonance force microscopy (MRFM) with 3D image reconstruction to achieve magnetic resonance imaging (MRI) with resolution below 10 nm; the demonstrated resolution was 4 nanometers.

The image reconstruction converts measured magnetic force data into a 3D map of nuclear spin density, taking advantage of the unique characteristics of the “resonant slice” that is projected outward from a nanoscale magnetic tip. The basic principles are demonstrated by imaging the 1H spin density within individual tobacco mosaic virus particles sitting on a nanometer-thick layer of adsorbed hydrocarbons. This result, which represents a 100 million-fold improvement in volume resolution over conventional MRI, demonstrates the potential of MRFM as a tool for 3D, elementally selective imaging on the nanometer scale.

Scientists from Stanford and IBM have improved the sensitivity of magnetic resonance imaging by 100 million times using a new technique for measuring tiny magnetic forces. The sensitivity improvement allowed a dramatic improvement of resolving power, achieving a resolution down to 4 nanometers (nm).

As for the future of MRFM technology, both researchers have high hopes. "I envision a machine, similar to the one we have today, in which we load a new protein or molecule, whose structure we don't know, every month or so," said Poggio. "In that time we make a detailed atomic-scale image of the protein or molecule.

"If we are able to obtain 10 to 100 times finer resolution, MRFM would truly represent a breakthrough for structural biologists."

MRFM is chemically selective, meaning that it can be used to look at a variety of elements to obtain a more detailed picture of the sample's structure. "For example, if you wanted to see where the DNA is, you can tune in to phosphorous," said Rugar.

Each scan takes several days and must be conducted in a super-cooled vacuum.

With further progress in resolution and sample preparation, force-detected MRI techniques could have significant impact on the imaging of nanoscale biological structures, even down to the scale of individual molecules. Achieving resolution <1 nm seems realistic because the current apparatus operates almost a factor of 10 away from the best demonstrated force sensitivities and field gradients.

Even with a resolution >1 nm, MRFM may allow the basic structure of large molecular assemblies to be elucidated. One can imagine enhancing MRFM image contrast beyond the basic spin-density information by using techniques similar to those developed for clinical MRI and NMR spectroscopy. Such contrast may include selective isotopic labeling (for example, substituting 1H with 2H), selective imaging of different chemical species (like 13C, 15N, or 31P), relaxation-weighted imaging, and spectroscopic imaging that reflects the local chemical environment. Some techniques, such as cross-polarization and depolarization between different nuclear spin species, have already been demonstrated for MRFM on the micrometer scale. At the nanometer scale, the ability to target and locate specific proteins through selective labeling, for example, could allow direct 3D imaging of the organization and structure of multicomponent macromolecular complexes. Such a capability would be complementary to current techniques, such as cryoelectron microscopy, and could develop into a powerful tool for structural biology.

The virus particles used as samples sit on an extremely flexible microscopic cantilever, which Rugar describes as a "little silicon diving board."

Then, using an oscillating magnetic field, researchers "flip the direction of the nuclear spin."

Bringing another tiny magnet close to the spinning particles will produce either an attractive or repulsive force that is measured by vibrations in the cantilever.

By measuring forces generated as the tiny magnet is positioned at 8,000 different points around the sample, scientists can generate a three-dimensional map of hydrogen density, thereby creating a three-dimensional image.
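The measurement loop described above can be caricatured in a few lines. The Gaussian kernel is a crude stand-in for the real resonant-slice response, the spin positions are invented, and a real reconstruction must deconvolve the slice response rather than read the force map directly; the sketch only shows how point-by-point force measurements assemble into a density image.

```python
import math

def tip_force(tip_xy, spins, width=2.0):
    """Force signal at one tip position: each spin contributes through a
    Gaussian stand-in for the resonant-slice response."""
    tx, ty = tip_xy
    return sum(math.exp(-((tx - sx) ** 2 + (ty - sy) ** 2) / width)
               for sx, sy in spins)

# Toy 1H spin positions in a 2D sample (a real scan covers 3D, e.g. the
# ~8,000 tip positions mentioned above).
spins = [(3, 3), (3, 4), (6, 6)]

# Scan the magnetic tip over a grid and record the force at each point:
force_map = [[tip_force((x, y), spins) for x in range(10)] for y in range(10)]

# The brightest pixel sits over the spin cluster:
peak = max((v, (x, y)) for y, row in enumerate(force_map) for x, v in enumerate(row))
print(peak[1])
```

In the actual experiment the recorded quantity is a cantilever frequency shift, and the 3D spin-density map comes from inverting the known slice response over all measured positions.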

EMP Hand Grenades and Wireless Tasers

Electromagnetic pulse (EMP) devices that disable unshielded electronics have been shrunk to hand-grenade size. Below there is also the wireless taser. If you need to stop a robot or a human, the right weapon is available.

"EMP grenade technology is out there, but I've never had my hands on one," said Col. Laurie Buckhout, chief of the newly formed Electronic Warfare Division, Army Operations, Readiness and Mobilization, during a February 2009 bloggers roundtable from the Pentagon.

Hand-grenade-size units imply that there are also larger EMP devices and high-power microwave units.

Wireless Taser

The Taser XREP is an electrically charged dart that can be fired from up to 20 metres away with a 12-gauge shotgun. Upon impact, its barbed electrodes penetrate a victim's skin, discharging a 20-second burst of electricity to "distract, disorient and entice the subject to grab the projectile", says Taser.

Commercial production of the XREP is due to start later this month, with US police departments and the US military expected to be using the weapons by the end of 2009.

The Auto Assault-12 (AA-12) shotgun (originally designed and known as the Atchisson Assault Shotgun) appears to have a range of 175 meters for mini-grenades and would seem to be compatible with the Taser XREP.

Military Robots Can Follow Hand Signals

Brown University researchers have modified iRobot's PackBot to recognize such standard hand/arm signals as "follow," "halt," "wait" and, of course, "door breach".

iRobot's Warrior X700 Robot

A 40mm grenade launcher with ammunition stored in the barrel for rapid (all-at-once) firing has been demonstrated; a four-barrel version was mounted on the iRobot Warrior X700 tracked "droid".

iRobot has a webpage for the Warrior robot program.

Google Experimenting with Quantum Computers for Machine Learning

Hartmut Neven of Google is mapping important machine learning problems to D-Wave’s flavor of Adiabatic Quantum Computer.

Binary Classification is described at wikipedia

Binary classification is the task of classifying the members of a given set of objects into two groups on the basis of whether they have some property or not. Some typical binary classification tasks are:

* medical testing to determine if a patient has a certain disease or not (the classification property is the disease)
* quality control in factories; i.e. deciding if a new product is good enough to be sold, or if it should be discarded (the classification property is being good enough)
* deciding whether a page or an article should be in the result set of a search or not (the classification property is the relevance of the article - typically the presence of a certain word in it)
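In the work described here, the binary classifier being trained is a thresholded linear superposition of weak classifiers, and the optimizer searches over binary weights. A minimal sketch with hypothetical decision stumps (the stumps, thresholds and weights are invented for illustration):

```python
def make_stump(feature_index, threshold):
    """Weak classifier: a decision stump returning +1 or -1."""
    return lambda x: 1 if x[feature_index] > threshold else -1

# Pool of weak classifiers and the binary weights selecting among them.
# In the paper, finding good binary weights is the hard training problem.
weak = [make_stump(0, 0.5), make_stump(1, 0.2), make_stump(0, -0.1)]
weights = [1, 0, 1]

def classify(x):
    """Strong classifier: sign of the weighted sum of weak outputs."""
    s = sum(w * h(x) for w, h in zip(weights, weak))
    return 1 if s >= 0 else -1

print(classify([0.8, 0.0]))   # 1
print(classify([-0.5, 0.9]))  # -1
```

Restricting the weights to bits is what makes the training problem a binary optimization, and (per the conclusions below in this post) also acts as an intrinsic regularizer.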

The desired goals:
* Exponential speed-ups over classical approaches for certain typical-case NP-hard problems
* Could have large impact on fundamental machine learning problems as well

Conclusions from "Training a Binary Classifier with the Quantum Adiabatic Algorithm":
* Global optimization competes successfully with greedy methods
* Bit-constrained learning machines often exhibit lower generalization error
* Intrinsic regularization
* Required bit precision grows only logarithmically with S/N (the ratio of training examples to weak classifiers)
* Training benefits from being treated as an integer program
* Good news for cell phones, sensor networks, early quantum chips
* Training problem manifestly NP-hard: motivates using AQC
* Next steps:
  * Experiment with 128-qubit D-Wave hardware
  * Adaptive dictionaries
  * Co-training of classifiers with feature sharing

Modifications to allow the application of the quantum adiabatic algorithm for Training a Binary Classifier

Implementation details

The Video Presentation

Link to the video of the presentation and the PowerPoint slides.

This paper describes how to make the problem of binary classification amenable to quantum computing. A formulation is employed in which the binary classifier is constructed as a thresholded linear superposition of a set of weak classifiers. The weights in the superposition are optimized in a learning process that strives to minimize the training error as well as the number of weak classifiers used. No efficient solution to this problem is known. To bring it into a format that allows the application of adiabatic quantum computing (AQC), we first show that the bit-precision with which the weights need to be represented only grows logarithmically with the ratio of the number of training examples to the number of weak classifiers. This allows to effectively formulate the training process as a binary optimization problem. Solving it with heuristic solvers such as tabu search, we find that the resulting classifier outperforms a widely used state-of-the-art method, AdaBoost, on a variety of benchmark problems. Moreover, we discovered the interesting fact that bit-constrained learning machines often exhibit lower generalization error rates. Changing the loss function that measures the training error from 0-1 loss to least squares maps the training to quadratic unconstrained binary optimization. This corresponds to the format required by D-Wave’s implementation of AQC. Simulations with heuristic solvers again yield results better than those obtained with boosting approaches. Since the resulting quadratic binary program is NP-hard, additional gains can be expected from applying the actual quantum processor.
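The formulation in the abstract can be sketched in code: a strong classifier built as a thresholded linear superposition of weak classifiers, with binary weights trained by minimizing a quadratic loss mapped to quadratic unconstrained binary optimization (QUBO). This is a minimal illustration, not the paper's implementation; the toy weak-classifier outputs are invented, and exhaustive search stands in for tabu search or the D-Wave hardware.

```python
import itertools

def build_qubo(H, y, lam=0.5):
    """Map least-squares training of a binary-weight classifier to a QUBO.

    H[s][i] = output (+1/-1) of weak classifier i on training example s.
    y[s]    = label (+1/-1) of example s.
    Loss = sum_s (sum_i w_i*H[s][i] - y[s])**2 + lam*sum_i w_i, w_i in {0,1}.
    Since w_i**2 == w_i for binary weights, the linear terms fold into the
    diagonal of Q, and the loss equals w^T Q w up to a constant.
    """
    S, N = len(H), len(H[0])
    Q = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            Q[i][j] += sum(H[s][i] * H[s][j] for s in range(S))
        Q[i][i] += lam - 2 * sum(y[s] * H[s][i] for s in range(S))
    return Q

def solve_brute_force(Q):
    """Stand-in for tabu search or the quantum annealer: try every w."""
    N = len(Q)
    best_w, best_e = None, float("inf")
    for w in itertools.product([0, 1], repeat=N):
        e = sum(Q[i][j] * w[i] * w[j] for i in range(N) for j in range(N))
        if e < best_e:
            best_w, best_e = w, e
    return best_w

def strong_classify(weak_outputs, w):
    """Thresholded linear superposition: the sign of the weighted vote."""
    vote = sum(wi * hi for wi, hi in zip(w, weak_outputs))
    return 1 if vote >= 0 else -1

# Toy data: weak classifier 0 agrees with the labels; 1 and 2 are noise.
H = [[1, 1, -1], [1, -1, -1], [-1, 1, -1], [-1, -1, -1]]
y = [1, 1, -1, -1]
w = solve_brute_force(build_qubo(H, y))
print(w)   # the sparsity penalty selects only the useful weak classifier
```

The sparsity penalty `lam` plays the role of the paper's drive to minimize the number of weak classifiers used; the NP-hardness of the resulting binary program is what motivates handing it to an adiabatic quantum processor.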

0:00 Training a Binary Classifier with the Quantum Adiabatic Algorithm
0:05 Outline
0:25 Solving hard optimization problems using adiabatic quantum computing - 1
2:30 Solving hard optimization problems using adiabatic quantum computing - 2
4:07 The adiabatic theorem
5:46 Why bother using AQC?
6:06 D-Wave’s hardware - 1
7:22 D-Wave’s hardware - 2
9:13 Training a binary classifier
10:54 Hyperplanes
11:17 Hyperplanes in N = 3
12:42 Modifications I: reduce bits to represent weights
14:01 Modifications II: quadratic loss
14:57 Implementation details: dictionaries
15:48 Implementation details: classical optimization methods
17:19 Experiments I: Synthetic data
18:00 Results for synthetic data with order 1 dictionary
19:27 Results for synthetic data with order 2 dictionary
19:41 Experiments II: Natural data
20:10 Conclusions

Machine learning at wikipedia

The research paper for "Training a Binary Classifier with the Quantum Adiabatic Algorithm"

Brain Scan Mind Reading of Spatial Information Progress

New Scientist reports: scans of the part of the brain responsible for memory have for the first time been used to detect a person's location in a virtual environment. Using functional MRI (fMRI), researchers decoded the approximate location of several people as they navigated through virtual rooms. The work is continuing with more precise scans to discern what someone was doing, where they are or were, and where they plan to go.

This finding suggests that more detailed mind-reading, such as detecting memories of a summer holiday, might eventually be possible, says Eleanor Maguire, a neuroscientist at University College London.

"This is a very interesting case because it was previously believed impossible to decode [spatial] information," says John-Dylan Haynes, a neuroscientist at the Bernstein Center for Computational Neuroscience in Berlin, Germany.

"There must be some hidden structure in the spatial organisation of cells with activity related to each of the places in the environment," agrees Edvard Moser, a neuroscientist at the Norwegian University of Science and Technology in Trondheim.

The abstract from the journal Cell:

Decoding Neuronal Ensembles in the Human Hippocampus
The hippocampus underpins our ability to navigate, to form and recollect memories, and to imagine future experiences. How activity across millions of hippocampal neurons supports these functions is a fundamental question in neuroscience, wherein the size, sparseness, and organization of the hippocampal neural code are debated. Here, by using multivariate pattern classification and high spatial resolution functional MRI, we decoded activity across the population of neurons in the human medial temporal lobe while participants navigated in a virtual reality environment. Remarkably, we could accurately predict the position of an individual within this environment solely from the pattern of activity in his hippocampus even when visual input and task were held constant. Moreover, we observed a dissociation between responses in the hippocampus and parahippocampal gyrus, suggesting that they play differing roles in navigation. These results show that highly abstracted representations of space are expressed in the human hippocampus. Furthermore, our findings have implications for understanding the hippocampal population code and suggest that, contrary to current consensus, neuronal ensembles representing place memories must be large and have an anisotropic structure.

Reading more precise locations or other kinds of memories could be difficult, because fMRI resolves the activity of thousands of neurons at a time, Haynes says. "One day a new imaging technique could come along and you'd be at the right place to decode even in these challenging cases," he adds.

However, Maguire isn't waiting for new technologies. Her team is already looking into the possibility of reading more vivid memories of events and planned movements. "We've done some work about how the hippocampus is involved in planning the future – where you're going and what you're doing."

Here is a pdf with 17 pages of supplemental information to the research paper.

Previously brain scanning had revealed what letters a person was reading.
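The decoding technique named in the abstract, multivariate pattern classification, can be sketched as follows. This is a schematic nearest-centroid stand-in for the study's actual classifier, with invented voxel numbers rather than real fMRI data:

```python
def train_centroids(patterns, locations):
    """Average the voxel-activity patterns recorded at each known location."""
    sums, counts = {}, {}
    for pattern, loc in zip(patterns, locations):
        acc = sums.setdefault(loc, [0.0] * len(pattern))
        for i, v in enumerate(pattern):
            acc[i] += v
        counts[loc] = counts.get(loc, 0) + 1
    return {loc: [v / counts[loc] for v in acc] for loc, acc in sums.items()}

def decode_location(pattern, centroids):
    """Predict the location whose mean voxel pattern is closest (Euclidean)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda loc: sq_dist(pattern, centroids[loc]))

# Invented 3-voxel activity patterns recorded in two virtual rooms:
patterns = [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1], [0.1, 0.0, 1.0], [0.2, 0.1, 0.9]]
rooms = ["room_A", "room_A", "room_B", "room_B"]
centroids = train_centroids(patterns, rooms)
print(decode_location([0.8, 0.1, 0.2], centroids))   # room_A
```

The key idea carries over: location is read not from any single voxel but from the joint pattern across many voxels, which is why the hidden structure Moser mentions is detectable at all.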

March 11, 2009

China's Trade and Other Economic Numbers and the Future of World Steel

Seeking Alpha analyses China's trade and inflation numbers.

According to an article in Tuesday’s Financial Times:

China will reduce export taxes to zero and give more financial support to exporters as it tries to increase its share of global trade in the current crisis, the country’s commerce minister announced on Monday. China would “use all possible measures to ensure the stable growth of our exports and prevent a large drop in external demand”, Chen Deming said in an interview published by a Communist party newspaper. “We should increase our share of the global market… We must transform ourselves from a big export nation to a strong export nation,” he continued.

According to an article in Bloomberg:

China’s investment spending surged as the nation poured money into roads, railways and power grids to counter a plunge in exports, which a separate report showed fell by a record in February. Urban fixed-asset investment climbed a more-than-estimated 26.5 percent in January and February combined to 1.03 trillion yuan ($150 billion) from a year earlier, the statistics bureau said today in Beijing.

More unambiguously good news involved February car sales, which are up substantially and suggest that some government policies are getting consumers to go back to buying cars, although this was accompanied by bad numbers on car exports.

The mainland’s sales of domestically made vehicles surged 25 per cent in February from a year earlier, as a tax cut for small cars and other measures helped revive the market, an industry group said on Wednesday. February’s sales totalled 827,600 units, up 12 per cent from the 735,000 sold in January, the China Association of Automobile Manufacturers said in a report posted on its website. Production in February totalled 807,900 units, up about 23 per cent from the year before, it said.

…However, despite the apparent rebound in China’s own car market, a slump in demand is crimping sales overseas: exports in January fell 33.5 per cent from a year earlier, to US$2.66 billion, the group said. The impact was most severe for domestic-brand cars, with January exports falling 64 per cent from a year earlier to 16,300 units, it said. Imports of vehicles also took a hit amid the deepening economic downturn, falling 20.3 per cent from a year earlier in January to US$1.73 billion, it said.

Economists said they believe China will be able to keep its growth at about 8 percent this year, a rate long believed to be the minimum needed to create enough jobs and maintain social stability. However, they said it is a wild wish to count on the country alone to fuel the global recovery, as China's economy accounts for only five percent of the world's total.

Steel Production

From the world steel website.

A 2008 analyst forecast has China's share of world steel going from 32% in 2006 to 35% in 2013.

PricewaterhouseCoopers has a world steel forecast out to 2020, made after the credit crisis hit.

Key Barriers to the 2020 Projection on Steel
1. War over mining assets
2. Continued shortage of funding capital
3. Shortage of energy supply in China
4. Tightening (environmental) regulations

Car Production Forecast
International Herald Tribune, Feb. 2009: Nissan pins hopes of growth in China

An analyst report on China's automotive future.

China’s automotive industry exhibited the first signs of a slowdown from its rapid double-digit growth of earlier years in 2008. According to BMI’s recently published China Automotives Report, the sales milestone of 10 million units, which had initially been forecast for the year, was not met after sales began to feel the full impact of the financial crisis in the latter months of the year. In November overall market sales fell by 16%, led by a 26% contraction in the previously strong commercial vehicle segment and a 10% drop in passenger car sales. BMI estimates that sales finished the year up by just 7% and forecasts a similarly difficult year in 2009 with growth dropping to around 5%. We now expect it will be 2010 before the market reaches the 10 million unit mark.

In a further blow to the industry, the World Trade Organisation (WTO) upheld a ruling from July that declared China’s tariffs on imported automotive parts a contravention of membership regulations. China currently charges the same import tariff on kits as on completely built units (CBUs), unless 60% or more of the content in a completed vehicle is domestic content.

Some project that the United States automotive market will grow again in 2010

World Economic Forecast
Euromonitor forecasts the developing world overtaking advanced economies in 2013 on a purchasing power parity basis.

As developing economies overtake advanced economies, consumer markets in the emerging countries will rise in importance. From tourism to household appliances, consumer goods and services companies are expected to shift their attention to new consumers.

Brain Performance Enhancement Targets

From New Scientist: intelligence may be strongly genetic, but that doesn't mean it cannot be improved. "It's just the opposite," says Richard Haier, of the University of California, Irvine, who works with Thompson. "If it's genetic, it's biochemical, and we have all kinds of ways of influencing biochemistry."

Myelin integrity is an especially promising target for manipulation, because, unlike the volume of grey matter, it changes throughout life. That it can change may seem surprising given its heritability. One explanation is that genes drive us to interact with our environment in ways that can lead to changes in myelin integrity, says Thompson.

Identifying the genes that promote high-integrity myelin could lead to ways to enhance the genes' activity or artificially add the proteins they code for. This may in turn provide therapies for multiple sclerosis, autism and attention deficit disorder, which are associated with degraded myelin. Intelligence enhancement in people who just want help passing an exam, say, is also "within the realm of possibility", Thompson reckons.

Medical treatments are still a long way off, warns Naomi Friedman, a behavioural geneticist at the University of Colorado in Boulder: "There'll be interactions between genes and environment that are going to have to be disentangled."

Caltech: Lesion Mapping of Cognitive Abilities Linked to Intelligence

Here is a 12 page pdf with supplemental information to the research paper on "Lesion Mapping of Cognitive Abilities Linked to Intelligence". They identified the parts of the brain that are used for verbal comprehension, perceptual organization, working memory and processing speed.

The Wechsler Adult Intelligence Scale (WAIS) assesses a wide range of cognitive abilities and impairments. Factor analyses have documented four underlying indices that jointly comprise intelligence as assessed with the WAIS: verbal comprehension (VCI), perceptual organization (POI), working memory (WMI), and processing speed (PSI). We used nonparametric voxel-based lesion-symptom mapping in 241 patients with focal brain damage to investigate their neural underpinnings. Statistically significant lesion-deficit relationships were found in left inferior frontal cortex for VCI, in left frontal and parietal cortex for WMI, and in right parietal cortex for POI. There was no reliable single localization for PSI. Statistical power maps and cross-validation analyses quantified specificity and sensitivity of the index scores in predicting lesion locations. Our findings provide comprehensive lesion maps of intelligence factors, and make specific recommendations for interpretation and application of the WAIS to the study of intelligence in health and disease.

Brain Map has Almost Millimeter Cube Volume Precision
43 patients underwent computerized axial tomography (CT) scans, and for 198 patients high-resolution anatomical T1-weighted images were acquired on a 1.5 T General Electric Signa scanner with a 3D SPGR sequence. A total of 124 coronal slices were acquired (in-plane resolution 1x1 mm, inter-slice distance 1.6 mm, field of view 24 cm).

They statistically correlated how much each piece (voxel) of brain in each specific location correlates to intelligence.
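A toy version of that voxel-wise correlation: for each voxel, compare the test scores of patients with and without damage there. The real study used nonparametric statistics across 241 patients; the difference-of-means below and the patient data are illustrative only.

```python
def lesion_symptom_map(lesion_masks, scores):
    """Per-voxel deficit associated with damage at that voxel.

    lesion_masks[p][v] = 1 if patient p has damage at voxel v, else 0.
    scores[p] = patient p's score on one WAIS index (e.g. VCI).
    Returns, for each voxel, mean(intact scores) - mean(lesioned scores);
    large positive values flag voxels where damage predicts deficits.
    """
    n_vox = len(lesion_masks[0])
    deficits = []
    for v in range(n_vox):
        lesioned = [s for m, s in zip(lesion_masks, scores) if m[v]]
        intact = [s for m, s in zip(lesion_masks, scores) if not m[v]]
        if lesioned and intact:
            deficits.append(sum(intact) / len(intact)
                            - sum(lesioned) / len(lesioned))
        else:
            deficits.append(0.0)
    return deficits

# Four invented patients, two voxels; damage at voxel 0 tracks low scores.
masks = [[1, 0], [1, 0], [0, 1], [0, 1]]
scores = [80.0, 90.0, 110.0, 100.0]
print(lesion_symptom_map(masks, scores))   # [20.0, -20.0]
```

Run over every voxel in the 1x1x1.6 mm grid described above, maps like this localize which brain regions each WAIS index depends on.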

CRNano Chris Phoenix and MetaModern Eric Drexler Roundup

CRNano Increased Technical Content With Chris Phoenix
Chris Phoenix describes robotic control of molecular manufacturing without computation.

In high-volume molecular manufacturing, computers can't be used to control robots handling individual molecules, because computation requires too much power. However, computer-like control can be achieved without the use of computers, which means that robots can still be used where appropriate.

For some operations, it will be suitable to build single-purpose machines that can only do one thing. But for other operations, using externally controllable machines - robots - will make the nanofactory smaller and faster to build, and more flexible to use. A nanofactory using robots can easily build products of greater mechanical complexity than the nanofactory itself, and can build products that weren't designed when the nanofactory was designed and built.

But - and this is a key point - the controller does not have to compute those instructions in real time. The instructions can be a pre-compiled recipe. Do this set of 5,000 steps, and you get a cubic nanometer of diamond; do that set of 7,500 steps, and you get a nanometer of carbon nanotube. The lists of steps can be computed when the product is designed, and only copied from place to place when it is manufactured.

It takes energy to compute the lists of operations [for a nanofactory process] in the first place. But nano-built products will be highly repetitious; a recipe for a cubic nanometer of diamond will be re-used many times. The number of bits in a blueprint for a product may be vanishingly small compared to the number of atoms, and still specify the exact position of each atom in the product.
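The bits-versus-atoms point can be made concrete with rough numbers. The 5,000-step figure comes from the text; the atoms-per-cubic-nanometer value is an approximation for diamond, and the bits-per-step encoding is an assumption for illustration.

```python
# Illustrating recipe reuse: a product described as repeated recipe
# invocations needs far fewer blueprint bits than atoms it specifies.

ATOMS_PER_NM3_DIAMOND = 176   # approx. carbon atoms in 1 nm^3 of diamond
STEPS_PER_NM3 = 5000          # step count quoted in the text
BITS_PER_STEP = 16            # assumed encoding of one robot instruction

# A 1-micron cube product = 10**9 repeated cubic-nanometer recipes.
blocks = 10 ** 9
recipe_bits = STEPS_PER_NM3 * BITS_PER_STEP          # stored once, reused
blueprint_bits = recipe_bits + blocks.bit_length()   # recipe + repeat count
atoms = blocks * ATOMS_PER_NM3_DIAMOND

print(f"atoms specified: {atoms:.1e}")
print(f"blueprint bits:  {blueprint_bits}")
print(f"bits per atom:   {blueprint_bits / atoms:.2e}")
```

Even with these crude assumptions, the blueprint is millions of times smaller than a per-atom instruction list, which is the point Chris Phoenix is making about precompiled recipes.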

Chris Phoenix points out that blaming the second law of thermodynamics (or any of the other physical laws) for the errors that creep into modern-day semiconductor fabrication is simply incorrect. Semiconductor fab errors are due to limitations in technology, not physics.

A Space Elevator update from a CRNano reader Tom Huffman.

Eric Drexler's Recent Technical Articles
Eric Drexler discusses computer aided design for structural DNA and protein nanotechnology.

Eric Drexler describes his dislike of the term nanobot.

A fifth article in the series on nanomaterials.

Are soft and hard machines at odds with each other? Surely not. Soft biomolecules and hard inorganic solids have worked together since a bacterium first succeeded in gluing itself to a mineral grain, and perhaps long before, at the origin of life itself. There is no gap between soft and hard nanomachines: The technologies form a continuum, and working together, they can form a bridge.

All the materials shown above are found in nature (ignoring the thorium in cerianite, which is mostly cerium dioxide). All but one, diamond, can be synthesized in water, at atmospheric pressure, near room temperature. Pyrite (“fool’s gold”) is often a product of biomineralization, and bacteria can synthesize magnetite as nanocrystals of controlled size and shape. Polymeric blocks with mechanical properties comparable to keratin could consist of any of a wide range of engineered proteins or other foldamers, and as can be seen, the value of Klm shifts from worst to best with a factor of 3 increase in block size.

As we explore implementation pathways that lead toward advanced nanotechnologies, it’s important to keep in mind that conditions for forming pyrite (and a range of other hard, inorganic materials) are compatible with soft-material technologies; these include macromolecular templates, crystal-growth promoters and inhibitors, and surface-binding molecules with diverse functions. Continued progress in engineering interfaces between macromolecules and inorganic crystals will be of critical importance.

The fourth article on nanomaterials which discusses lattice-scaled stiffness.

The Early Metamodern Nanotechnology Series
1. Modular Molecular Composite Nanosystems
Biomolecular engineering for atomically precise nanosystems

2. Toward Advanced Nanotechnology: Nanomaterials (1)
Why I’ve never advocated starting with diamond

3. Toward Advanced Nanotechnology: Nanomaterials (2)
Stiffness matters (and protein isn’t remotely like meat)

4. Self-Assembly for Nanotechnology
The virtues of self-assembly and the benefits of external guidance

5. From Self-Assembly to Mechanosynthesis
Mechanosynthesis begins with soft machines

6. Toward Advanced Nanotechnology: Nanomaterials (3)
Mechanical engineering meets thermal fluctuations

Eric Drexler's Articles with Videos of Nanomanufacturing
High-Throughput Nanomanufacturing: Small Parts (with videos)

High-Throughput Nanomanufacturing: Assembly (with videos)

High-Throughput Nanomanufacturing: Assembling larger products (with videos)

March 10, 2009

China's 2008 and Future Economy

China ended 2008 with a GDP of 30.1 trillion yuan, which is $4.4 trillion at the current exchange rate of 6.84 yuan to 1 USD.

Japan’s GDP for 2008 was 510.2 trillion yen [at 98 yen to 1 USD, this is $5.2 trillion].

Japan's GDP dropped at an annual pace of 12.7 percent in the October-December, 2008 period.

Japan is likely to see a GDP contraction of about 3.8% or more. If China has 7% GDP growth and the yuan's exchange rate stays constant against the US dollar, then China will pass Japan if the yen weakens to about 105 to the US dollar.
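The "about 105 yen" figure can be checked with a quick back-of-envelope calculation using the numbers above (yuan/dollar rate held constant):

```python
# Back-of-envelope check of the Japan/China crossover claim.
japan_gdp_yen = 510.2        # trillion yen, 2008
japan_growth = -0.038        # assumed 2009 contraction
china_gdp_usd = 4.4          # trillion USD, 2008
china_growth = 0.07          # assumed 2009 growth, yuan/USD held constant

japan_2009_yen = japan_gdp_yen * (1 + japan_growth)
china_2009_usd = china_gdp_usd * (1 + china_growth)

# Yen/USD rate at which Japan's 2009 GDP in dollars equals China's:
crossover_rate = japan_2009_yen / china_2009_usd
print(round(crossover_rate, 1))   # roughly 104 yen per dollar, i.e. "about 105"
```

Any further yen weakening beyond that rate, or faster Chinese growth, moves the crossover earlier.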

Year  GDP (T yuan)  Growth (%)  USD/CNY  China GDP ($T)  China+HK/Macau ($T)  US GDP ($T)  Milestone
2007  25.77         13          7.3      3.77            3.97                 13.8         Past Germany
2008  30.1           9          6.85     4.4             4.6                  14.3
2009  32.2           7          6.8      4.7             4.9                  14.0         Passing Japan
2010  35.1           9          6.2      5.7             5.9                  14.1         Past Japan
2011  38             9          5.6      6.8             7.0                  14.6
2012  41             9          5.1      8.0             8.2                  15.0
2013  45             9          4.7      9.6             9.8                  15.4
2014  49             9          4.2      11.7            11.9                 16.0
2015  53             9          3.8      14.0            14.3                 16.6
2016  58             9          3.5      16.6            16.9                 17.2
2017  62             8          3.2      19.4            19.7                 17.8         Past USA
2018  67             8          3        22.3            22.6                 18.5
2019  73             8          3        24.3            24.6                 19.2
2020  78             8          3        26.2            26.5                 19.9
2021  85             8          3        28.2            28.5                 20.6
2022  90             8          3        30.4            30.7                 21.4
2023  97             8          3        32.7            33.0                 22.1
2024  105            8          3        35.3            35.6                 23.0
2025  113            8          3        38.0            38.3                 23.8
2026  121            7          3        40.6            40.9                 24.7
2027  129            7          3        43.4            43.7                 25.6
2028  139            7          3        46.3            46.6                 26.5
2029  147            7          3        49.5            49.8                 27.5
2030  158            7          3        52.9            53.3                 28.4
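The table's dollar figures follow mechanically from its growth and exchange-rate assumptions; a sketch that reproduces the first few rows (the growth and USD/CNY paths below are the table's own assumptions, not independent forecasts):

```python
# Reproduce the projection logic behind the table above: compound the
# yuan GDP by the assumed growth rate, then convert at the assumed
# USD/CNY rate for that year.

def project_gdp(start_yuan, growth_by_year, fx_by_year):
    """Return [(year, gdp_yuan, gdp_usd)] given growth % and USD/CNY paths."""
    rows, gdp = [], start_yuan
    for (year, growth), fx in zip(growth_by_year, fx_by_year):
        gdp *= 1 + growth / 100
        rows.append((year, round(gdp, 1), round(gdp / fx, 1)))
    return rows

# First few years from the table's assumptions (2008 base: 30.1T yuan):
growth = [(2009, 7), (2010, 9), (2011, 9)]
fx = [6.8, 6.2, 5.6]
for year, yuan, usd in project_gdp(30.1, growth, fx):
    print(year, yuan, usd)
```

The projection is only as good as its two inputs: sustained 7-9% growth and steady yuan appreciation to 3 USD/CNY; change either path and the "Past USA" year moves substantially.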

Some analysis of China's 2009 economic situation: China's fiscal deficit would be 2.92% of GDP. In comparison, the US fiscal deficit for 2009 would take up 12.5% of its GDP. China has room to increase stimulus while staying near reasonable deficit numbers.

Foreign Policy looks at China's numerological obsession with 8% growth.

Seeking Alpha has an article arguing that the yen could be too high.

China passes Germany to become 3rd largest economy

Previous post start of credit crisis assessment

Comparison of historic GDP numbers between China and the USA

China Launching Tiangong Space Station at the end of 2010

If everything goes smoothly, China will launch Tiangong-1 from the Jiuquan Satellite Launch Center at the end of 2010, said Zhang Jianqi, Deputy Commander-in-Chief of China's Manned Space Engineering Program. It is also reported that China is openly acknowledging that the new Tiangong outpost will involve military space operations and technology development.

Tiangong-1, weighing about 8.5 tons, has a support module and an experiment module which can carry much heavier loads than the Shenzhou spacecraft series. Tiangong-1 is also equipped with a spacecraft docking system.

Tiangong-1 is a target spacecraft China is developing for the next step in its space program-- the construction of a space station.

Zhang also disclosed that after the successful launch of "Tiangong-1," China will successively launch the Shenzhou-8, Shenzhou-9 and Shenzhou-10 spacecraft to meet and dock with "Tiangong-1."

Its main mission will be to serve as the target for space rendezvous and docking experiments, to guarantee the working and living conditions of taikonauts as well as their safety during their short-term stay in orbit, and to carry out space applications, aerospace medicine experiments, space science experiments and technical testing for the space station -- basically establishing a space experiment platform that can support short-term manned missions and long-term, independently and reliably operated unmanned missions.

"After completing the above tasks, we will start the third step, in which we will go all out to build a long-term manned space station by 2020," Zhang said. After that, "Tiangong-1" will be upgraded to a cargo spaceship. The cargo spaceship will not only have rendezvous and docking functions, it will also provide refueling for the space station. The first launch of the cargo spaceship will take place at Hainan's Wenchang Satellite Launch Center.

Along with the Tiangong announcement comes another major revelation — that China now has two manned space station programs under development.

• The new Tiangong series, which can be launched on the same Long March 2F booster used to carry the Soyuz-type Shenzhou manned transports.

• And a larger 20-25 ton "Mir class" station that will follow by about 2020, launched on the new oxygen/hydrogen-powered Long March 5 boosters.

China plans to land a nuclear-powered unmanned lunar rover in 2012-2013, followed by an unmanned sample-return mission about 2017.

In 2010-2011, before the rover and sample-return missions are flown, a Chinese technology-demonstration mission may be sent to the Moon to further prove landing technologies. But the Chinese were not clear on whether it would go all the way to the surface.

Masdar City: Pilot Version of the City of Future Still Proceeding

A 49 page pdf from January 2009 shows the vision of Masdar City: a city that will bring together, and gather real data on, a collection of technologies for a more environmentally friendly and efficient city of the future. It will be more energy efficient. Its 100,000 residents will use only personal rapid transit, light rapid transit, Segways, bicycles and walking. It is to be completed in 2016. Follow-up cities can learn from Masdar to make other cities even more efficient and cost effective, based on the sensors that Masdar will have to record the actual performance of the technologies and systems that are used. China, India and other countries will continue to urbanize. China is building new cities or expanding old ones to absorb 20 to 30 million people per year. China is making its own eco-cities (although some have been put on hold with the recent financial problems). A success at Masdar could be improved upon and repeated thousands of times from 2015 to 2030 and beyond.

"If environmental engineers, by gaining experience from building this wild city, become much more productive at building the next city, this starts to move from being science fiction to something Houston would adopt," says Matthew Kahn, a professor of economics at the University of California, Los Angeles. Gil Friend, CEO of Natural Logic, a sustainable-design company based in Berkeley, CA, agrees. "I see Masdar on the one hand as a playground for the rich," he says, "and on the other hand as an R&D opportunity to deploy and test out technology that, if things go well, will show up in other cities."

MIT Technology Review's look at Masdar City

The construction is the start of a vast experiment, an attempt to create the world's first car-free, zero-carbon-dioxide-emissions, zero-waste city. Due to be completed in 2016, the city is the centerpiece of the Masdar Initiative, a $15 billion investment by the government of Abu Dhabi, which is part of the United Arab Emirates.

The development of Abu Dhabi over the last few decades has reflected a frenetic effort to catch up with the developed world. Now, because of projects such as Masdar City, the emirate has a chance to race ahead.

If the Masdar project doesn't justify itself financially, it could indeed be just a green playground for the rich, an environmental theme park that is largely irrelevant for the development of sustainable technology on a broader scale. But if it is profitable, it could be a driving force for sustainable urban design. Then the oil-rich developers in the UAE and elsewhere might have a reason to build more green cities and skip constructing another ski slope in the desert. And developers worldwide will follow.

More Technical Details on Artificial Ribosome

Reconstituting ribosomes: Shown here is a parts list for creating a synthetic, self-replicating ribosome. Proteins are shown in purple, RNA in red, and DNA in blue. The list includes 54 ribosomal proteins, as well as RNA-based enzymes involved in protein production, and other molecules that interact with ribosomes.
Credit: George Church and Mike Jewett

Church and his team also want to use modified ribosomes to make a new class of proteins--those that are the mirror image of the proteins found in nature. Proteins and many other molecules have a "handedness," or chirality, to their structure. Amino acids made in nature are almost exclusively left-handed. And just as a glove fits on only one hand, left-handed enzymes can only catalyze reactions of substrates with the correct handedness. This means that mirror-image molecules would be resistant to breakdown by regular enzymes, says Church. That could have important industrial applications, generating long-lasting enzymes for biofermentation, used to create biofuels and other products.

Eventually, says Church, he wants to create tiny protein factories out of tailor-made ribosomes. "We want to make large amounts of special proteins that are hard to make in vivo, and are useful for vaccine production [and other purposes]."

Next, the researchers want to create a ribosome that can re-create itself. They have compiled a list of 151 genes that they think are needed for a self-reproducing ribosome, including genes for ribosomal proteins, different types of RNAs, enzymes that catalyze different reactions in protein synthesis, and additional genes not directly related to the ribosome. "We think this is enough genes to replicate DNA, produce RNA and ribosomes, and have a primitive membrane," says Church. "Once you get it going, it should be able to keep going if you supply it with amino acids and nucleotides [the building blocks of DNA and RNA]."

Once they get the system up and running, the researchers hope to genetically optimize it into an efficient protein factory. Protein products, such as biologic drugs, are now mostly made in vats of bacteria. "When you make proteins in live bacteria, you throw away 90 percent of the bacterial biomass just to get a few grams of protein," says David Deamer, a chemist at the University of California, Santa Cruz. "If you could do it without live organisms, it could be much more efficient."