The different forms of quantum computing skepticism


Quantum computing is one of the most exciting developments of computer science in recent decades. But the concept is not without its critics, often known as “quantum computing skeptics” or “skeptics” for short. The debate on quantum computing sometimes conflates its physical and mathematical aspects, and so in this essay I try to disentangle them. Following Impagliazzo’s classic essay, I will give names to scenarios or “potential worlds” in which certain physical or mathematical conditions apply.

Potential worlds

Superiorita is the world where it is feasible to build scalable quantum computers, and these computers have an exponential advantage over classical computers. That is, in superiorita there is no fundamental physical roadblock to building large quantum computers, and hence the class BQP is a good model of computation that is physically realizable. More precisely, in superiorita the amount of resources (think dollars) required to simulate a T-gate quantum circuit grows at most polynomially, or maybe even linearly (with not-too-terrible constants), in T.

The other aspect of Superiorita is the mathematical conjecture that quantum computers offer an exponential advantage over classical ones. That is, there are functions computable by the mathematical model of (uniform) quantum circuits that require exponential time on Turing machines. (In complexity jargon, this is the conjecture that BQP \not\subseteq SUBEXP, where the latter stands for the class TIME(2^{n^{o(1)}}).) Integer factoring is one problem conjectured to lie in BQP \setminus SUBEXP (i.e., a problem where quantum computers have an exponential advantage). One can also consider analogous conjectures for sampling problems, and some particular sampling tasks that can be achieved in quantum polynomial time are conjectured to require exponential time for probabilistic Turing machines.
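
To put numbers on the factoring example (the classical bound below is the heuristic running time of the number field sieve, and the quantum bound is the standard gate count for Shor’s algorithm): on n-bit integers,

T_{classical}(n) = 2^{O(n^{1/3} (\log n)^{2/3})} \quad\text{vs.}\quad T_{quantum}(n) = \tilde{O}(n^2).

The conjecture that no classical algorithm does substantially better than the left-hand side is exactly what would place factoring outside SUBEXP.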

Superiorita is the world in which most quantum computing researchers think we live, and, judging by the hundreds of millions of dollars of investment, so do many commercial companies and funding agencies. Note that this is a mix of a physical assumption (that the model of BQP can be physically realized) and a mathematical assumption (that this model offers an exponential speedup over classical machines). Without both the physical and mathematical aspects of superiorita there would be no justification for investing huge efforts in building quantum computers.

In Superiorita quantum computers are not a panacea, and in particular they can’t solve NP-complete problems. Let me not wade into the (hugely important!) question of whether in Superiorita the Lattice Shortest Vector Problem is in BQP or not (see my essay for more on this topic). Also, even if one believes we live in Superiorita, whether the particular problems on which quantum computing offers exponential speedups are interesting is a matter of taste. As far as I know, factoring large integers is not inherently interesting in its own right, and once the world moves to different encryption standards, the applications to breaking encryption will eventually disappear. However, there are other tasks where quantum computers seem to provide exponential speedups and that can be interesting in their own right, in areas such as chemistry and machine learning (though one should read the fine print).

Popscitopia is the “hyper superiorita” world where quantum computers can solve NP-complete problems. That is, in popscitopia quantum computers can be built, and NP \subseteq BQP. This is the world described by some popular accounts of quantum computers as being able to “run exponentially many parallel computations at once”, a belief that is prevalent enough that Scott Aaronson devotes the tagline of his blog to refuting it. Most researchers in the area believe that, regardless of whether quantum computers can be physically built, they cannot solve NP-complete problems (a belief which is essential to so-called “post-quantum cryptography”), and indeed so far we have no reason to think quantum computers offer an exponential (or even better-than-quadratic) speedup for such problems. But we have no proof that this is the case, and indeed some TCS researchers, such as Richard Lipton, have suggested that even NP = P (which in particular implies NP \subseteq BQP) might be true.
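
The “better-than-quadratic” caveat refers to Grover’s algorithm: for brute-force search over the N = 2^n candidate solutions of an NP-complete problem, a quantum computer needs only about \sqrt{N} steps, and the BBBV theorem (Bennett, Bernstein, Brassard, and Vazirani) shows that this quadratic saving is the best possible in the black-box model:

T_{classical}(n) = \Theta(2^n) \quad\text{vs.}\quad T_{quantum}(n) = \Theta(2^{n/2}).

This is an enormous saving, but still exponential time.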

Skepticland is the world where it is not possible to build scalable quantum computers, though mathematically they do offer an exponential advantage. That is, in Skepticland, for every function F (and more generally a promise problem or a sampling problem) that can be computed using T amount of physical resources, there is a probabilistic Boolean circuit of size polynomial in T that computes F as well. However, mathematically, like in Superiorita, it is still the case in Skepticland that BQP contains functions (such as integer factoring) that require exponential time to be computed classically.

Skepticland is the world that “quantum computing skeptics” such as Gil Kalai, Leonid Levin and Oded Goldreich think we live in. In this world the extended Church-Turing hypothesis holds sway, and there exists some (yet unaccounted for) cost that blows up exponentially in T when trying to physically realize size-T quantum circuits.

These skeptics still accept the mathematical conjecture underlying superiorita that BQP contains functions that require exponential time for deterministic or probabilistic Turing machines. Indeed, as far as I can tell, their belief in the inherent difficulty of problems such as factoring is a large part of the intuition for why quantum computers would not be physically realizable.

Finally, Classicatopia is the world where BQP \subseteq BPP and, more generally, any function, promise problem, or sampling problem that can be solved by (uniform) quantum circuits can be solved by probabilistic Turing machines with polynomial overhead. In this world quantum computers can be physically realized, but only because they are no more powerful than classical computers. Hence the Extended Church-Turing hypothesis holds, but for a completely different reason than in Skepticland. In Classicatopia we can simulate the entire physical world using a classical computer. One advocate of this world is Ed Fredkin (who, interestingly, was the person who motivated Richard Feynman to propose the possibility of quantum computers in the first place). Also, several researchers (such as Peter Sarnak) have suggested that the marquee problem of integer factoring can be solved by polynomial-time Turing machines.

Truth and beauty

At this point I should probably talk about the evidence for and against each of these scenarios, and discuss the latest advances in the experimental work on building quantum computers. But frankly I’d just be parroting stuff I Googled, since I don’t really know much about these works beyond second- or third-hand reports.

Rather, I’d like to talk about which of these worlds is more beautiful. Beauty is in some ways as important for science as truth. Science is not just a collection of random facts but rather a coherent framework where these facts fit together. If a conjecture is “ugly” in the sense that it does not fit with our framework then this can be evidence that it is false. When such “ugly ducklings” turn out to be true then this means we need to change our standards of beauty and come up with a new framework in which they fit. This is often how progress in science is made.

While I am not a physicist, I believe that quantum mechanics itself followed exactly such a trajectory. (I am probably making some historical, physical, and maybe even mathematical mistakes below, but I hope the bigger-picture description is still accurate; please do correct me in the comments!)

The ancient Greek philosopher Democritus is often quoted as saying “Nothing exists except atoms and empty space, everything else is opinion.” This saying is usually interpreted as an empirical hypothesis about the world, or, to use mathematical jargon, a conjecture. But I think it is really more of a definition. That is, one can interpret Democritus as not making a concrete physical theory but rather defining the allowed space of all physical theories: any theory of the world should involve particles that mechanically and deterministically evolve according to some specific and local rules.

Over the following centuries, scientists such as Newton, Leibniz, and Einstein took this prescription to heart and viewed the role of physics as coming up with ever more general and predictive theories within the Democritus model of deterministic particles, with no randomness, intent, or magic such as “action at a distance”. In the late 1910s, Emmy Noether proved some remarkable theorems that derived conservation laws from physical theories based only on the fact that they satisfy certain symmetries (see also my recent post). While the mechanical clockwork theories satisfied such symmetries, they are not the only theories that do so. Thus Noether’s theorems showed that even non-clockwork theories could still satisfy a more general notion of “mathematical beauty”.

At the time Noether’s theorems were just a very useful mathematical tool, but soon nature gave some indications that she prefers Noether’s notion of beauty to Democritus’. That is, a series of experiments led to the introduction of the distinctly “non-clockwork” theory of quantum mechanics. Giving up on the classical notion of beauty was not easy for physicists, and many (most famously Einstein) initially thought of quantum mechanics as a temporary explanation that would eventually be replaced by a more beautiful, “Democritus-approved” theory. But Noether’s results made it possible to see quantum mechanics as not just predictive but beautiful. As Nima Arkani-Hamed says:

Newton’s laws, even though they were the first way we learned how to think about classical physics, were not the right way to make the jump to quantum mechanics. … [Rather] because the underlying ideas of the action– and everything just really ports beautifully through, from classical to quantum physics, only the interpretation changes in a fundamental way– all of Noether’s arguments, all of Emmy Noether’s arguments about conservation laws go through completely unscathed. It’s absolutely amazing. All these arguments about conservation laws, many other things change, tons of other things changed when we went from classical to quantum. But our understanding of the conservation laws, even though they’re come up with by this classical physicist a hundred years ago, are equally true in quantum mechanics today.

Moreover, my outsider impression is that with time physicists have learned to accept and even grown to love quantum mechanics, to the degree that today many would not want to live in a purely classical world. If you wonder how anyone could ever love such a monstrosity, note that, as Scott Aaronson likes to say, there is a sense in which the relation between quantum and classical physics is analogous to the relation between the \ell_2 and \ell_1 norms. I think most mathematicians would agree that the former norm is “more beautiful” than the latter.
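
To make the analogy concrete: a classical probabilistic state over N outcomes is a nonnegative vector with unit \ell_1 norm, a quantum state is a complex vector with unit \ell_2 norm, and in each case the allowed evolutions are exactly the linear maps that preserve these constraints:

\sum_{i=1}^{N} p_i = 1,\; p_i \geq 0 \text{ (evolution by stochastic matrices)} \quad\text{vs.}\quad \sum_{i=1}^{N} |\psi_i|^2 = 1 \text{ (evolution by unitary matrices)}.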

My personal opinion

So, which is the most beautiful world, Superiorita or Skepticland?

If you’d asked me that question a decade ago, I would have answered “Skepticland” without hesitation. Part of the reason I got into computer science is that I was never good at physics and didn’t particularly like it. I also thought I could avoid caring about it. I believed that ultimately the world is a Turing machine or a cellular automaton, and whether it has 5 or 12 particles is about as interesting as whether the computer I’m typing this on uses a big-endian or little-endian representation for integers. When I first heard about quantum computing I hoped very much that there was some inherent reason it could never work, so I could avoid dealing with the ugliness of quantum mechanics and its bracket notation.

But as I’ve learned more about quantum mechanics, I’ve grown not just to accept it as a true theory but also to see it as beautiful, and with this to accept quantum information and computation theory as a beautiful generalization of information and computation in its own right. At the moment I don’t see any beautiful alternative theory (to use Aaronson’s terms, a “Sure/Shor separator”) from the skeptics. The closest we have to such a theory comes from Gil Kalai, but as far as I can tell it posits noise as a new fundamental property of nature (the Ka-la-ee constant?). Noise here is not the usual interpretation of quantum probabilities or the uncertainty principle. It seems closer to the engineering form of noise: inaccuracies in measurements or errors in transmissions. These can be serious issues (for example, I believe that friction is a large part of why actually building Babbage’s Analytical Engine was so difficult). But as far as I can tell, these engineering difficulties are not fundamental barriers, and with sufficient hard work and resources the noise can be driven as close to zero as needed.
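
Here is a minimal numerical sketch of the distinction (the constants and functional forms are entirely made up for illustration and are not derived from any physical model): in the “engineering” view, per-gate noise can be driven arbitrarily close to zero by investing more resources, while in the skeptical view there is a floor it cannot cross.

```python
# Illustrative sketch only: made-up constants and functional forms,
# not derived from any physical model.

def engineering_noise(resources: float, c: float = 1.0, alpha: float = 2.0) -> float:
    """Per-gate noise that decays polynomially with invested resources,
    so it can be pushed arbitrarily close to zero."""
    return c / resources**alpha

def floored_noise(resources: float, floor: float = 1e-3) -> float:
    """The same decay, but bottoming out at a hypothesized fundamental
    floor that no amount of engineering effort can cross."""
    return max(floor, engineering_noise(resources))

for r in [10, 100, 1_000, 10_000]:
    print(f"resources={r:>6}: engineering={engineering_noise(r):.1e}, "
          f"floored={floored_noise(r):.1e}")
```

The two curves agree at first and then diverge: one keeps falling with investment, the other saturates. The skeptical position, as I understand it, amounts to asserting that the second curve is the physical one.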

Moreover, some of the predictions involve positing noise that scales with the number of qubits in the computer. This seems to require nature to “know” that some physical system in fact corresponds to a logical qubit, and moreover that two distant physical systems are part of the same quantum computer. (I should say that Gil Kalai disagrees with this interpretation of his conjecture.) While one could argue that this is no more counterintuitive than other notions of quantum mechanics such as destructive interference, entanglement, and collapse under measurement, each of those notions was only accepted following unequivocal experimental results, and moreover they all follow from our modelling of quantum mechanics via unitary evolutions.

The bottom line is that, as far as I can tell, Superiorita is the most beautiful and evidence-supported world that is currently on offer.

Will we see a mega-qubit quantum computer?

The current experimental efforts are aimed at building a 50-qubit quantum computer. This sounds impressive until I remember that the VIC-20 I played with as a third-grader more than thirty years ago already had 5K (i.e., about 40,000 bits) of memory. So, will we ever see a quantum computer big enough to run Frogger? (Not to mention Ultima IV.)

The answer to this question depends not just on the science but also on economics and policy. Suppose (with no real justification) that eventually we will be able to produce a quantum computer at a cost of 1,000 dollars per qubit. Then a million-qubit machine will cost a billion dollars to build. The current applications of quantum computers do not seem to justify this cost. As I mentioned, once we transition to different cryptosystems, the motivation for factoring integers will be significantly lessened, and while simulating quantum systems can be important, it’s hard to see it as forming the basis for a billion-dollar business. Of course, this can all change with a single theory paper, just as Peter Shor revolutionized the field of quantum computing with a single algorithm.
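
A back-of-the-envelope version of this calculation (the per-qubit cost is the assumed, admittedly unjustified, number; only the arithmetic is real):

```python
# Hypothetical numbers from the paragraph above; only the arithmetic is real.
cost_per_qubit = 1_000      # dollars per qubit (assumed, with no real justification)
qubits = 1_000_000          # a "mega-qubit" machine
print(f"${cost_per_qubit * qubits:,}")  # $1,000,000,000 -- a billion dollars
```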

Moreover, I hope that at some point policy makers and the public at large will stop viewing computer science just through the lens of applications, and start seeing it also as a fundamental science in its own right. The Large Hadron Collider apparently cost about 13 billion dollars to build and operate, and yet the same analysis calls it a “bargain” in terms of the benefits from both the technologies invented and the scientific discoveries. The case can be made that building a large-scale quantum computer would be no less important to science, and would offer no less benefit to society. Indeed, a quantum computer offers literally an exponential number of potential experiments one can run on it. Moreover, there is absolutely no reason to think that Shor gave the final word on breakthrough algorithms that could use such a computer for tasks that a priori seem to have nothing to do with physics. In that vein, I hope that whatever bodies fund experimental quantum computing research realize that at least part of their investment should go into theoretical work on quantum (and also classical, as the two are intertwined) algorithm design.

Acknowledgements: Thanks to Gil Kalai and Scott Aaronson for comments on earlier versions of this post. Needless to say, they are not responsible for anything that I said here.

25 thoughts on “The different forms of quantum computing skepticism”

  1. This is a great essay! The part about QM growing on you with time—as a beautiful way for the world to be, rather than an ugly and temporary hack—says something that I’ve been trying to say to QC skeptics for years, better than I’ve said it.

    Just one small comment: I think that quantum simulation is EASILY the basis for a billion-dollar business, since in some sense, all of materials science, chemistry, drug design, etc. etc. is downstream of it. I think the real economic question, which might actually affect whether scalable QCs can get built anytime soon, is “sure it’s a billion-dollar business, but is it a trillion-dollar one?”

    1. Thanks Scott! You are absolutely right that I shouldn’t have dismissed the commercial potential for quantum simulations. In fact, predicting applications of a new technology can sometimes be harder than predicting its existence. (E.g., Babbage and Turing predicted classical computing, but I don’t think they imagined what we would use it for.)

    2. Just a quotation, to put the scale of some efforts in proportion:

      “Erdős asks us to imagine an alien force, vastly more powerful than us, landing on Earth and demanding the value of R(5, 5) or they will destroy our planet. In that case, he claims, we should marshal all our computers and all our mathematicians and attempt to find the value. But suppose, instead, that they ask for R(6, 6). In that case, he believes, we should attempt to destroy the aliens.”

      1. That’s a nice story 🙂

        Perhaps one can add that if the aliens prove to us (perhaps via IP=PSPACE) that they know the value of R(7,7) then we should just submit, since resistance against such powerful beings is futile 🙂

  2. nice riff on impagliazzo’s worlds, a truly classic paper that deserves even more publicity than it gets, and maybe you’ve already cast the die wrt naming these classes, although if you ask me it would be nice to have a naming competition. honestly, “skepticland” sounds uncool. maybe there is a way to make it sound cooler or more realistic merely by naming it differently, and so-called skeptics should be given some chance to frame their position differently. as psychologists understand, “it’s all in the framing”. one might argue the laws of thermodynamics seem to limit physical dynamics already for classical systems, and that maybe some limit on QM computing is thermodynamics-related. the concept of “heat death of the universe” sounds pretty bleak/limiting but it’s in fact a scientifically provable fact. similarly, “perpetual motion machines” are ruled out by thermodynamics. at times QM computing sounds a bit magical, verging on a sort of perpetual motion machine.

  3. Very nice essay, Boaz.

    When you write:

    “as far as I can tell, these engineering difficulties are not fundamental barriers and with sufficient hard work and resources the noise can be driven down to as close to zero as needed.”

    This is precisely the crux of matters.

    Recently, Scott expressed the idea that while on our planet quantum supremacy will first be demonstrated with small quantum circuits, on some other planets BosonSampling will come first. In this spirit, let me offer a hypothetical conversation between the members A and B of the threshold theorem team on yet another planet X.

    A: We completed the proof. The Threshold theorem is proven!

    B: This is cool; I expected that we would prove that any constant level of noise makes quantum computation impossible. What a great and surprising result!

    A: If quantum mechanics is valid and if future quantum engineers can devise a controllable many qubit system with noise below the threshold then reliable large-scale quantum computing will be achievable!

    B: Yes, if quantum mechanics is valid, and if mathematics is consistent, and if we are not dreaming… but wait, we could have said the same thing before our threshold theorem when the required noise depended on the number of qubits.

    A: Come to think of it, to build the code needed for our theorem we need maybe 400 qubits, so this means that our future quantum engineers will achieve supremacy well before quantum error correction comes into play.

    B: But without error correction these quantum computers are very primitive computing devices. How can we achieve supremacy with these primitive noisy machines, which, like Babbage’s Analytical Engine, are even more primitive than digital computers?

    A: Ahh, this means that your initial intuition was right after all. OMG, our theorem is a red herring.

    B: And a beautiful one it is! It is a great theorem!!! But we should add to the paper a remark explaining the correct plausible interpretation.

    A: For a minute I thought that the theorem shows that these quantum computers are realistic.

    B: ha ha

    A: I think I will study next the transition between our beautiful quantum world above the threshold and the flashy quantum fantasy below the threshold.

    Remark: On planet X the quantum supremacy notion and results came before the threshold theorem. Also, the grass on this hypothetical planet need not be greener than on ours, and I personally see clear advantages to our planet.

    Anyway, experiments in the near future will tell us a lot. Let’s wait, see, and enjoy.

    1. I actually agree that to a certain extent the quantum threshold theorem is a “red herring”. Just as we don’t inherently *need* Shannon’s theory of error correcting codes to communicate, we don’t inherently *need* quantum error correction to do quantum computation.

      In many settings error correcting codes are extremely important, and can make a huge practical difference. Similarly I believe that quantum error correcting codes can make an important practical difference in constructing quantum computers.

      But even if the quantum threshold theorem were not known (or not true) we could still hope to construct quantum computers. In particular, as far as I can tell, there is no reason why the noise level of an individual gate or qubit would not go to zero at least as a polynomial function of the amount of resources invested in constructing that gate. (The threshold theorem might be used to show that the noise should go down at an exponential rate, but inverse polynomial should only be easier.)
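
      (To contrast the two rates: under the standard concatenated-code analysis, k levels of concatenation take a physical error rate p below the threshold p_c down to

      p_k = p_c \cdot (p/p_c)^{2^k},

      i.e., the effective error rate falls doubly exponentially in k at the price of a polynomial overhead per level. The hope expressed above is much weaker: only that per-gate noise can be made to scale like c/R^{\alpha} in the invested resources R, for some constants c and \alpha.)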

      1. I don’t know much about quantum computing, but the dismissal of error correcting codes as merely an engineering tool (in your second response below) rings hollow to me. My gut feeling is that error correction is a fundamental requirement of effective computing. I have two reasons.

        Firstly, error correction is one of the main reasons we digitize data and do not have analog computers. We have a variety of physical implementations of memory, and yet we have not been able to circumvent error correction. Secondly, we have one example of a natural computer (in the concrete sense that it stores and executes programs) that we more or less understand: DNA-based genomes. This computer, too, is digital and uses error correction. The main function of this computer is, of course, replication. Now, it’s true that most of us (?) do not try to mate our iPhones with high hopes, but I cannot think of computing as we know it without replication. How do you implement recursion otherwise?

        I think that to claim that error correction is merely an engineering tool is analogous to claiming that exponential blowup in resources is merely an engineering obstacle. At any fixed rate of error, as the number of basic elements (say, data bits) of your computing device grows, the chance of no error drops exponentially. I think a similar phenomenon happens regarding miniaturization: the stability of elements should drop exponentially with size. If you want to handle computations on large inputs, you need a device with many elements. You also need to pack them in the space that is accessible within the time frame of the computation, and this imposes some physical limits. So I think you need either high-rate error correction or fault-tolerant algorithms (which I think can be thought of as task-specific error correction) to perform computations. It’s not a coincidence that we needed both of Shannon’s ideas, switch-based Boolean algebra and error correction, to build computers.
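
        (To spell out the exponential claim in symbols: with n components, each failing independently with probability p,

        \Pr[\text{no error}] = (1-p)^n \approx e^{-pn},

        which vanishes exponentially in n for any fixed p > 0.)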

      2. It’s not really well defined to talk about whether a true mathematical theorem is necessary or not, so let me not try to argue whether Shannon’s theorem (which I take to be about the existence of good codes: ones that can correct a constant fraction of errors) is necessary for computing, or whether repetition codes or others would have been enough.

        All that I meant is that with enough resources one can manufacture a single component (e.g. gate) whose failure probability is arbitrarily close to zero.

        “Engineering” here is not a derogatory term, but it is in contrast to “no go” theorems such as the uncertainty principle, which tells you that some quantities can never be measured no matter how many resources you expend.

      3. Historically, two of the discoverers of the threshold theorem, Michael Ben-Or and Ray Laflamme, expected a negative result and regarded (unlike Boaz) a negative result as a way to debunk quantum computers.

        The rationale for this view is the following: if you regard the noisy quantum circuit as the correct physical model for computations allowed by quantum devices, and if for every fixed amount of noise the computational complexity allowed by the model is BPP, then this reflects on what you can expect from engineering efforts, and gives a strong reason to think that BPP is indeed the correct complexity class for noisy quantum computers.

      4. Gil, it is very interesting that Ben-Or and Laflamme thought that the Threshold Theorem is essential for quantum computing. I am not sure it makes mathematical sense, but if the threshold were 1/log n instead of a constant, would they consider that a death blow to quantum computing? (Of course, now that it has been proved, we know that the Threshold Theorem is essential for 1+1=2 🙂 )

        These researchers (and you) know much more about this than I do. My intuition is quite simple: suppose you came to a computer chip manufacturer and told them that instead of a $100 price tag they are allowed a budget of $1,000,000, and that instead of a chip that fits inside a pocket they can make it as big as a house. I would imagine that given these additional resources they would be able to make the chip more reliable, and I am not sure why things would be different for constructing qubits.

      5. Boaz, this is not a hypothetical question. Some time before the threshold theorem was proved, Peter Shor had made a crucial breakthrough (also) regarding fault-tolerant quantum computing and gave an FTQC scheme against 1/polylog noise.

    2. Boaz, just to make it clear, I do not think at all that the threshold theorem is a red herring. This was just part of the amusing conversation between A and B on planet X. The threshold theorem (both on our planet and on planet X) is extremely important, and it will be the basis for building quantum computers if they can be built, and also the basis for debunking quantum computers if they can be debunked.

      The threshold theorem shows that once high quality quantum computers are built for roughly 100-500 qubits, it will be possible in principle to use quantum codes to amplify this achievement and build quantum computers with an unlimited number of qubits. This is correct and important! But this interpretation of the theorem takes for granted that quantum computers with a few tens of qubits are feasible. There are, in my opinion, good theoretical reasons to think that this is incorrect, and, in any case, this is being tested in many experiments by many groups now and in the near future.

      No, Boaz, without the apparatus of quantum error-correction and the threshold theorem (either in the software or the hardware, e.g. via anyons) there is no hope of constructing quantum computers. The fact that quantum supremacy is easier to demonstrate than the necessary codes are to build gives a strong reason to think that the apparatus of quantum error-correction and the threshold theorem is also beyond reach.

      1. I think this is indeed the crux of our disagreement.

        I view quantum error correction as very analogous to classical error correction: an important engineering tool, but not the thing that makes the qualitative difference as to whether scalable communication or computation is possible.

        This has to do with viewing the “engineering form of noise” as not being a fundamental barrier but rather, like measurement inaccuracies, something that can be reduced at the expense of using more resources.

        A related issue is that I think of error correction, logical qubits, etc., as higher-level abstractions on top of the underlying physics. If there is a fundamental physical barrier to scalable quantum computing, then I’d think it should have an explanation in terms of this underlying physics and not in terms of these abstractions.

    3. One can make a finer division of the superiorita world, and, in my view, the three most relevant worlds are the following:

      Quantopia superiorita – The model of quantum circuits is the correct description of local quantum systems in nature, and therefore universal quantum computation is possible. Quantum systems, quantum information, and computation are analogous to classical systems, classical information, and computation. Quantum error correction is analogous to classical error correction: an important engineering tool, but not the thing that makes the qualitative difference as to whether scalable communication or computation is possible.
      (This represents Boaz’ beliefs from the comments above)

      Noiseland superiorita (quantum noise below the threshold) – Quantum systems are inherently noisy. Time-dependent quantum evolutions necessarily interact with the environment and are therefore noisy. The model of noisy quantum circuits is the correct description of local quantum systems in nature. Quantum error-correction shows that small islands representing noiseless quantum circuits can be created and also exist in nature. Hence quantum computation is possible. (This is perhaps the most common view among researchers in quantum information.)

      Noiseland sceptica (quantum noise above the threshold) – Quantum systems are inherently noisy. Time-dependent quantum evolutions necessarily interact with the environment. The model of noisy quantum circuits is the correct description of local quantum systems in nature. The noise level cannot be pushed below the level allowing quantum error-correction and quantum fault-tolerance. Hence quantum computation is not possible. (This is where I stand.)

      There are, of course, people who are skeptical about quantum computers for other reasons, like the young Boaz, who simply did not like physics.

      Quantopia superiorita suggests that “quantum supremacy” and “robust quantum information” will be present, and in fact ubiquitous, in the physical world, while under noiseland superiorita quantum supremacy represents rare islands inside a large “decoherence desert” (as Daniel Gottesman referred to it in the beautiful picture portrayed in this post). The difference between quantopia superiorita and noiseland superiorita is relevant to Scott’s hope that quantum computers promise a trillion-dollar industry via simulations. While additional computing power is always welcome, this idea is less promising in the noiseland superiorita scenario, where the usefulness of quantum computers for simulation is less clear.

  4. Nice essay. Let me suggest “Supremazzia” insofar as we speak of “quantum supremacy”. With two z’s it’s Corsican rather than Italian—and guess who came from Corsica? Or “Supremazia” or “Supremacia” which is Portuguese, Catalan, and Spanish (the last with accented i).
