D-Wave Superconducting Quantum Computer Passes More Rigorous Tests Confirming the Quantumness of the System

The USC Viterbi School of Engineering is home to the USC-Lockheed Martin Quantum Computing Center (QCC), a super-cooled, magnetically shielded facility built specifically to house the first commercially available quantum computing processors, devices so advanced that only two are in use outside the Canadian lab where they were built: the first went to USC and Lockheed Martin, and the second to NASA and Google.

Since USC’s facility opened in October 2011, a key task for researchers has been to determine whether D-Wave processors operate as hoped – using the special laws of quantum mechanics to offer potentially higher-speed processing, instead of operating in a classical, traditional way.

An international collaboration of scientists has now published several papers ruling out classical models of the first-generation D-Wave One processor housed at USC, including one describing an elaborate test of all 108 of the chip’s functional quantum bits (“qubits”). The test showed that the D-Wave One behaved in a way consistent with a model called “quantum Monte Carlo,” a classical simulation of quantum annealing, and inconsistent with two candidate classical models that could have described the processor in the absence of quantum effects.
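The heart of that comparison is statistical: each of many random problem instances gets a per-instance success probability on the device and on each candidate model, and a model is judged by how closely its instance-by-instance hardness profile tracks the device’s. The sketch below, written in Python with made-up placeholder data rather than the authors’ actual measurements, shows one simple way such a correlation check could be organized.

```python
# Minimal sketch (not the authors' code): comparing per-instance success
# probabilities from a device against candidate models. All arrays below are
# hypothetical placeholders; the real data come from many random Ising
# spin-glass instances run on the D-Wave One and on simulators.
import numpy as np

rng = np.random.default_rng(0)
n_instances = 1000

# Hypothetical per-instance success probabilities (fraction of anneals that
# found the ground state) for the device and for three candidate models.
device = rng.random(n_instances)
models = {
    "simulated quantum annealing (quantum Monte Carlo)":
        np.clip(device + 0.05 * rng.normal(size=n_instances), 0.0, 1.0),
    "classical simulated annealing": rng.random(n_instances),
    "classical spin dynamics": rng.random(n_instances),
}

# A model is "consistent" to the extent its per-instance hardness profile
# tracks the device's; a simple correlation coefficient quantifies that.
for name, probs in models.items():
    r = np.corrcoef(device, probs)[0, 1]
    print(f"{name}: correlation with device = {r:.2f}")
```

In the published work the discriminating signatures also include the shape of the success-probability histograms across instances (bimodal for the device and for simulated quantum annealing, unimodal for classical annealing), not just a single correlation number.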

Nature Physics – Evidence for quantum annealing with more than one hundred qubits

“The challenge is that the tests we can perform on the USC-based D-Wave processor can’t directly ‘prove’ that the D-Wave processor is quantum – we can only disprove candidate classical models one at a time,” said QCC Director Prof. Daniel Lidar. “But so far we find that the D-Wave processor is always consistent with our quantum models. Our tests continually get more rigorous and complex.”

Add this to recent work by USC Information Sciences Institute researcher Federico Spedalieri, who demonstrated entanglement in a chip at D-Wave’s headquarters in Burnaby, B.C., and to earlier tests of a smaller group of qubits by Spedalieri, Lidar and their collaborators, and the evidence is mounting that quantum effects are at play in the D-Wave processors.

Quantum processors encode data in qubits, which can represent the two digits one and zero at the same time, whereas a traditional bit encodes either a one or a zero. This property, called superposition, along with the ability of quantum states to “interfere” (cancel or reinforce each other like waves in a pond) and “tunnel” through energy barriers, is what may one day allow quantum processors to perform optimization calculations much faster than traditional processors.
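For readers who want to see superposition concretely, here is a minimal numerical illustration using generic textbook qubit math, not anything specific to D-Wave’s flux qubits: a qubit’s state is a pair of complex amplitudes, and an equal superposition gives a 50/50 chance of reading out a zero or a one.

```python
# Minimal illustration (generic, not D-Wave-specific): a single qubit's state
# is a vector of two complex amplitudes; superposition means both amplitudes
# can be non-zero at once, and measurement probabilities are their squared
# magnitudes.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)   # the classical "0" state
ket1 = np.array([0.0, 1.0], dtype=complex)   # the classical "1" state

# An equal superposition of 0 and 1.
psi = (ket0 + ket1) / np.sqrt(2)

probabilities = np.abs(psi) ** 2             # Born rule
print(probabilities)                         # [0.5 0.5]: both outcomes equally likely
```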

Optimization problems can take many forms, and quantum processors have been theorized to be useful for a variety of big data problems like stock portfolio optimization, image recognition and classification, and detecting anomalies, such as rooting out bugs in complex software.
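In practice, a problem reaches an annealer only after being recast as an energy-minimization problem over binary variables, typically an Ising model whose lowest-energy spin configuration encodes the answer. The toy sketch below uses arbitrary, made-up biases and couplings and simply brute-forces the minimum classically, just to show the form of the objective an annealer is asked to minimize.

```python
# Minimal sketch of the Ising objective a quantum annealer minimizes:
# E(s) = sum_i h_i * s_i + sum_{i<j} J_ij * s_i * s_j over spins s_i in {-1, +1}.
# The biases and couplings below are toy values, and the search is a classical
# brute-force loop for clarity.
import itertools
import numpy as np

h = np.array([0.5, -0.3, 0.2])                    # per-spin biases (toy values)
J = {(0, 1): -1.0, (1, 2): 0.7, (0, 2): 0.4}      # pairwise couplings (toy values)

def energy(spins):
    e = float(np.dot(h, spins))
    for (i, j), coupling in J.items():
        e += coupling * spins[i] * spins[j]
    return e

# Enumerate all 2^3 spin assignments and keep the lowest-energy one.
best = min(itertools.product([-1, 1], repeat=len(h)), key=energy)
print("ground state:", best, "energy:", energy(best))
```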

The first quantum chip housed at the QCC was a 128-qubit D-Wave One, which was replaced about a year ago with the 512-qubit D-Wave Two. Though every chip is unique, the repeated validation of the older chip bodes well for its successor, which shares the same architecture.

“Our work is part of a large scale effort by the research community aimed at validating the potential of quantum information processing, which we all hope might one day surpass its classical counterparts,” Lidar said.

ABSTRACT

Quantum technology is maturing to the point where quantum devices, such as quantum communication systems, quantum random number generators and quantum simulators, may be built with capabilities exceeding those of classical computers. A quantum annealer, in particular, solves optimization problems by evolving a known initial configuration at non-zero temperature towards the ground state of a Hamiltonian encoding a given problem. Here, we present results from tests on a 108-qubit D-Wave One device based on superconducting flux qubits. By studying correlations we find that the device performance is inconsistent with classical annealing and with models in which it is governed by classical spin dynamics. In contrast, we find that the device correlates well with simulated quantum annealing. We find further evidence for quantum annealing in the form of small-gap avoided level crossings characterizing the hard problems. To assess the computational power of the device we compare it against optimized classical algorithms.
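The “Hamiltonian encoding a given problem” in the abstract refers to the standard transverse-field Ising form used in quantum annealing. Written in conventional textbook notation (not quoted from the paper), the time-dependent Hamiltonian interpolates between a driver term and the problem term:

```latex
% Conventional quantum-annealing Hamiltonian (textbook form, not taken from the paper):
% the schedules A(t) and B(t) start with A dominant and end with B dominant, so the
% system is steered from the easy ground state of the transverse-field driver toward
% the ground state of the problem (Ising) Hamiltonian.
H(t) = -A(t)\sum_i \sigma^{x}_i
       \;+\; B(t)\left(\sum_i h_i\,\sigma^{z}_i + \sum_{i<j} J_{ij}\,\sigma^{z}_i\sigma^{z}_j\right),
\qquad A(0) \gg B(0), \quad A(t_f) \ll B(t_f).
```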

The paper includes 20 pages of supplemental material.
