Quantum Machine Learning — Dr. Scott Aaronson’s “Read the Fine Print”
TL;DR — The task is to figure out how to frame problems so that they fit a known quantum algorithm and gain a speedup. That is neither simple nor possible for every problem: each algorithm carries caveats that can erode the anticipated speedup. This is the “fine print” of quantum machine learning.
Recently, Dr. Scott Aaronson, a professor at MIT, published an essay about quantum machine learning. (Thanks to Nature for participating in the #OpenScience initiative, allowing everyone to read the article.) It’s a great, approachable introduction to the challenges of applying quantum algorithms. Shor’s algorithm, which breaks RSA cryptography, is the “killer app” everyone knows, but the applications of quantum computing extend far beyond that. Breaking encryption is the least interesting thing a quantum computer can do.
Quantum Machine Learning
Aaronson focuses on the Harrow-Hassidim-Lloyd (HHL) algorithm, discovered in 2008, which has become the template for a family of quantum machine learning algorithms; the essay is a fantastic review of how the algorithm works and how it can be applied to machine learning. At best, HHL gives an exponential speedup. But questions beyond the mathematics of the algorithm, such as hardware limits of the quantum computer or how the user’s problem is posed, can reduce that speedup to something more modest.
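To make the setting concrete, here is a minimal classical sketch of the problem HHL targets: solving a linear system Ax = b for a Hermitian matrix A. Classically this takes time polynomial in n; HHL prepares a quantum state proportional to the solution in time polylogarithmic in n, subject to the caveats below. All the names and numbers here are illustrative, not from the essay.

```python
import numpy as np

# The problem HHL targets: given a Hermitian matrix A and a vector b,
# find x with A @ x = b. Classically this costs time polynomial in n;
# HHL prepares a quantum state proportional to x in time polylog(n).
n = 8                        # toy size; HHL's advantage only matters as n grows
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
A = (A + A.T) / 2            # make it Hermitian, as HHL assumes
A += n * np.eye(n)           # diagonal shift keeps the condition number small
b = rng.standard_normal(n)

x = np.linalg.solve(A, b)    # roughly O(n^3) classically -- the step HHL speeds up
print(np.allclose(A @ x, b)) # True
```

The diagonal shift is there deliberately: it keeps the matrix well-conditioned, which, as the caveats below note, is one of HHL’s requirements.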
The caveats Aaronson outlines for HHL:
- a ‘quantum RAM’ must exist so that the vector analyzed by HHL can be loaded efficiently
- the quantum computer has to be able to efficiently apply unitary transformations derived from the matrix
- the matrix values need to be fairly uniform so that the system is robustly invertible (well-conditioned)
- the read-out data must be chosen carefully so that the calculation does not have to be repeated n times
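The last caveat is worth unpacking in classical terms. HHL’s “answer” is a quantum state whose n amplitudes encode the solution; reading out every amplitude takes on the order of n measurements, which wipes out the exponential speedup. What survives is estimating a single global quantity of the solution, such as an expectation value. A hedged NumPy sketch (the observable M is a made-up example, not from the essay):

```python
import numpy as np

# Readout caveat: HHL outputs a state encoding x, not x itself.
# Extracting all n entries of x costs ~n repetitions; estimating one
# global number like x^T M x is what remains cheap.
rng = np.random.default_rng(1)
n = 8
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
A = (A + A.T) / 2                 # Hermitian, close to identity -> well-conditioned
b = rng.standard_normal(n)
M = np.diag(rng.random(n))        # a hypothetical observable of interest

x = np.linalg.solve(A, b)
x_hat = x / np.linalg.norm(x)     # HHL yields the *normalized* solution state
expectation = x_hat @ M @ x_hat   # one number -- the kind of output that stays cheap
```

If your application needs the full vector x rather than a summary statistic like this, the exponential advantage disappears.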
None of these caveats are killers, as of now, but they do pose a significant challenge.
To whatever extent we care about quantum computing at all, I’d say we should be excited indeed: HHL and the later algorithms represent real advances in the theory of quantum algorithms, and in a world with quantum computers, they would probably find practical uses.
-Dr. Scott Aaronson
The task is to figure out how to look at problems in a way that would fit them into a known quantum algorithm for speedup.
This does not mean that this is simple at all, or possible for all problems.
A quantum computer will most likely never be a standalone system; it will be integrated with a supercomputer, which will offload onto the quantum processor the subsets of a problem that have a speedup advantage.
We do not get a universal speedup for all problems, or even a guaranteed exponential speedup.
That means it is fine if the computer can’t achieve a full speedup for every problem or every step of a problem; it can simply outsource to the quantum processor the tasks that run more efficiently there.
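The hybrid model described above can be sketched as a classical driver that offloads only the speedup-friendly steps. Everything here is an assumption for illustration: `quantum_linear_solve` is a hypothetical stand-in for a call into a quantum coprocessor, and the eligibility checks mirror HHL’s caveats; the fallback is plain classical linear algebra.

```python
import numpy as np

def quantum_linear_solve(A, b):
    """Hypothetical HHL call on a quantum coprocessor (assumption).
    In this sketch it just falls back to a classical solver."""
    return np.linalg.solve(A, b)

def hybrid_pipeline(A, b):
    # Classical pre-processing: does the problem fit HHL's fine print?
    hermitian = np.allclose(A, A.T)
    well_conditioned = np.linalg.cond(A) < 1e3
    if hermitian and well_conditioned:
        # Offload only the step with a known quantum speedup.
        return quantum_linear_solve(A, b)
    # Everything else stays on the classical machine.
    return np.linalg.solve(A, b)
```

The point of the sketch is the control flow, not the solver: the classical host decides, step by step, which pieces are worth sending to the quantum processor.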
A big positive of articles like this is that they explicitly connect the quantum-algorithm world with “real world” applications. Another implication is that there is still so much to be discovered, and quantum physicists and computer scientists need to find a common language in order to work together to create killer apps for a quantum computer, once the hardware exists.
One thing I keep reiterating is that we don’t know how far the impact of quantum computing will reach. We can’t know. In the 1800s, Lord Kelvin reportedly declared that physics was essentially complete. Then came relativity and quantum mechanics.
The substructure of the universe regresses infinitely towards smaller and smaller components. Behind atoms we find electrons, and behind electrons, quarks. Each layer unraveled reveals new secrets, but also new mysteries.
-Academician Prokhor Zakharov, “For I Have Tasted the Fruit”
Here’s the original paper by Harrow, Hassidim, and Lloyd detailing their HHL algorithm for solving linear systems, the foundation of these “quantum machine learning” proposals.