Classical computers can be optimized, but quantum computers will get faster

In 2008, Scott Aaronson said:

Even if D-Wave managed to build (say) a coherent 1,024-qubit machine satisfying all of its design specs, it’s not obvious it would outperform a classical computer on any problem of practical interest. This is true both because of the inherent limitations of the adiabatic algorithm, and because of specific concerns about the Ising spin graph problem. On the other hand, it’s also not obvious that such a machine wouldn’t outperform a classical computer on some practical problems. The experiment would be an interesting one! Of course, this uncertainty — combined with the more immediate uncertainties about whether D-Wave can build such a machine at all, and indeed, about whether they can even produce two-qubit entanglement — also means that any talk of “lining up customers” is comically premature.

— The entanglement evidence continues to accumulate.
— They sold two $10 million machines.
— The Dwave system's speed improved by a factor of 3,000 to 500,000 from the 128-qubit system to the 512-qubit system, a gap of three years. More on the speedup question, and on the optimization of classical computers and algorithms, below.

My own prediction from 2006 (two years before Aaronson's comment):

There will be a quantum computer with over 100 qubits of processing capability sold either as a hardware system or whose use is made available as a commercial service by Dec 31, 2010.

This proved correct: a 128-qubit machine was sold to Lockheed in 2010 for $10 million.

Speedup question

Previously, solutions ran slower on the 128-qubit system than general commercial algorithms on a regular workstation. The 512-qubit system is now tens of thousands of times faster than those same systems. Classical systems can, of course, be sped up with custom algorithms, and further with hardware implementations of those algorithms.

— The 512-qubit system was 3 to 5.6 times faster than the 439-qubit system. If that speedup scaling (per ~16% increase in qubit count) were to hold up to 2048 qubits, the speedup would be 60,000 to 30 million times.
— Dwave has the funding, current sales, and future sales to afford to build a 2048-qubit system, and likely an 8192-qubit system.
— Some claim that Dwave will not get the speedup unless they incorporate more error correction into their design. Dwave constantly works on its hardware designs and quantum algorithms, has a staff of about one hundred working on those issues, and collaborates with dozens of top academic research groups. If the full speedup, or most of the future speedup, has to come from redesigns, that is part of the process. It is all good so long as the improvement happens and larger useful problems can be solved.
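The extrapolation in the first bullet above is back-of-the-envelope compounding: assume the observed 3x to 5.6x per-generation speedup repeats with each ~16% increase in qubit count (512/439), and multiply those steps out to 2048 qubits. A minimal sketch of that arithmetic (my own illustration, not a Dwave projection; the exact range depends on how the scaling steps are counted):

```python
import math

# Observed: the 512-qubit system was 3x to 5.6x faster than the
# 439-qubit system, i.e. one speedup step per ~16% increase in
# qubit count (512 / 439 ~= 1.166).
qubits_small, qubits_large = 439, 512
speedup_low, speedup_high = 3.0, 5.6

# Number of ~16% scaling steps needed to go from 512 to 2048 qubits.
steps = math.log(2048 / 512) / math.log(qubits_large / qubits_small)

# Compound the per-step speedup over those steps.
total_low = speedup_low ** steps
total_high = speedup_high ** steps

print(f"scaling steps from 512 to 2048 qubits: {steps:.1f}")
print(f"extrapolated speedup: {total_low:,.0f}x to {total_high:,.0f}x")
```

This naive geometric compounding lands in the tens of thousands to millions range; whether any such scaling actually holds at 2048 qubits is, of course, exactly the open question.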

BTW – if Dwave takes the advice of one or more of these researchers to get more speedup and better results, and that ends up enabling Dwave to make a lot more money and IPO, that is of course a win for Dwave. It is also a win for the academics who provided the advice, and massively superior quantum computers, with big speedups over any classical system, would be a win for humanity.

Most of the other competing quantum computing efforts are still experimenting with one or two qubits.

More or different benefits could result from other kinds of quantum computing, such as:
* photonic methods
* nitrogen-vacancy centers in diamond
* quantum dots
* trapped ions
* one of the many other methods

We should eventually see what all of those methods are capable of doing. There were many kinds of classical computing hardware: vacuum tubes, integrated circuits, analog hardware, ASICs, GPUs, DSPs (digital signal processors), etc., along with many kinds of memory hardware, with multiple kinds of each in use at every stage. Information technology is a multi-trillion-dollar part of the economy (global IT spending was about $3.8 trillion in 2013), and computer hardware is a big chunk of that.

Note – almost none of the global IT business uses highly optimized solutions. It runs on general database, spreadsheet, and other software. The standard practice is to throw away millions of CPU cycles to save people time. Most IT people give no thought to tuning database queries: make something work, and tune only if you have to.

Many early computer hardware companies went out of business and were displaced.
Companies with new classical computer hardware (as when GPUs emerged) have to find some set of problems for which they provide a superior solution. The same will be true of all of the quantum computing systems.
Multiple quantum computer solutions can exist at the same time by serving different niches.
More and better work on all kinds of quantum computing algorithms and problems will be beneficial to all kinds of quantum computing systems.

There is very interesting quantum dot work in Alberta and Australia. I expect some commercial-scale systems to result from that work over the next 3 to 6 years.

Solving problems and utility question

The Dwave system has been used to run Google algorithms that clean up dirty data for image classification. Can optimized and manually tuned classical systems perform that image-data cleanup with better practical ease of use and performance than the Dwave system?
