Lockheed Martin and scientists at USC weigh in on D-Wave quantum computer technology.
Here is a quote from Greg Tallant, Research Engineering Manager, Flight Control & VMS Integration – FW, Advanced Development Programs, Lockheed Martin Aeronautics Company, speaking in the video:
“It’s a game changer for the corporation, it’s a game changer for our customers, and ultimately it’s a game changer for humanity. Computationally this is the equivalent of the Wright brothers at Kitty Hawk.”
Steve Jurvetson (venture capitalist funder of D-Wave Systems) explained what doubling the qubits in a scalable quantum computing architecture would mean.
So, how do we read the graph above? Like Moore’s Law, a straight line describes an exponential. But unlike Moore’s Law, the computational power of the quantum computer should grow exponentially with the number of entangled qubits as well. It’s like Moore’s Law compounded. (D-Wave just put together an animated visual of each processor generation in this video, bringing us to the present day.)
And now, it gets mind-bending. If we suspend disbelief for a moment, and use D-Wave’s early data on processing power scaling (more on that below), then the very near future should be the watershed moment, where quantum computers surpass conventional computers and never look back. Moore’s Law cannot catch up. A year later, it outperforms all computers on Earth combined. Double qubits again the following year, and it outperforms the universe. What the???? you may ask... Meaning, it could solve certain problems that could not be solved by any non-quantum computer, even if the entire mass and energy of the universe were at its disposal and molded into the best possible computer.
It is a completely different way to compute — as David Deutsch posits — harnessing the refractive echoes of many trillions of parallel universes to perform a computation.
First the caveat (the text in white letters on the graph). D-Wave has not built a general-purpose quantum computer. Think of it as an application-specific processor, tuned to perform one task — solving discrete optimization problems. This happens to map to many real-world applications, from finance to molecular modeling to machine learning, but it is not going to change our current personal computing tasks. In the near term, assume it will apply to scientific supercomputing tasks and commercial optimization tasks where a heuristic may suffice today, and perhaps it will be lurking in the shadows of an Internet giant’s data center improving image recognition and other forms of near-AI magic. In most cases, the quantum computer would be an accelerating coprocessor to a classical compute cluster.
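To make “discrete optimization” concrete: D-Wave’s hardware minimizes an Ising/QUBO-style objective over binary variables. Here is a minimal classical sketch of that problem class, with a made-up 3-variable objective solved by brute force (the Q values are illustrative, not from any real application):

```python
import itertools

# QUBO objective: E(x) = sum over (i, j) of Q[i, j] * x_i * x_j,
# with x_i in {0, 1}. Diagonal entries act as per-variable biases,
# off-diagonal entries as pairwise couplings. Values are arbitrary.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,  # biases
    (0, 1): 2.0, (1, 2): 2.0,                  # couplings
}

def energy(x):
    """Evaluate the QUBO objective for a binary assignment x."""
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

# Brute force is fine for toy sizes, but the search space is 2**n,
# which is precisely why an annealer is interesting at large n.
best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))
```

Here the minimum is x = (1, 0, 1) with energy -2.0: the couplings penalize turning on adjacent variables together, so the best assignment activates the two non-interacting ones. An annealer searches the same landscape physically rather than by enumeration.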
Second, the assumptions. There is a lot of room for surprises in the next three years. Will they hit a scaling wall, or discover a heretofore unknown fracturing of the physics? Perhaps local entanglement, noise, or some other technical hitch that does not loom large at small scales but grows exponentially with problem size, just as the theoretical performance grows exponentially with scale. I think the risk lies less in the steady qubit march, which has held true for a decade now, and more in the relationship of qubit count to performance.