## The Territory Around BQP

A commenter named Blake Stacey pointed me to a talk entitled The Territory Around BQP: Results and Open Problems, which was given at the Perimeter Institute this past Friday, and which I’d had no idea was available on streaming video. This talk was part of a fantastic workshop called Conceptual Foundations and Foils for Quantum Information Processing, which was about ways of changing the laws of quantum mechanics to get alternative theories that still make some sort of sense, and that might shed new light on the “tried-and-true original.” In this particular talk, the speaker discusses a large number of ways to make the complexity class BQP (Bounded-Error Quantum Polynomial-Time) “slightly” bigger or smaller. I’m embarrassed to admit that I watched this particular talk transfixed to the computer screen: I genuinely couldn’t predict how BQP was going to get mutilated next, and I looked forward to finding out.

Comment #1 May 16th, 2011 at 6:31 pm

Hi Scott,

Just a thought I had when viewing the first 5 minutes of your talk: is there a good quantum communication protocol for graph isomorphism? I.e., Alice gets a graph G, Bob gets a graph H, they want to know whether G and H are isomorphic, and they have quantum communication with shared EPR pairs. If there is such a protocol with polylogarithmic communication complexity, it would be some sort of indication that GI is easy in the quantum world, and the other way around as well. If it’s not known, then it seems like an intriguing question to study.

Comment #2 May 16th, 2011 at 9:40 pm

e.v.: Nice question! As a first step, who can tell us what’s known about its classical communication complexity?

Comment #3 May 17th, 2011 at 12:11 pm

In the classical world, there’s an O(1)-bit communication protocol using public randomness. Note that graph isomorphism partitions the set of graphs into equivalence classes (G and H are in the same equivalence class iff they are isomorphic).

The protocol is then just an EQUALITY test on the equivalence class.
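To make the reduction concrete, here is a toy sketch (an illustration added here, not part of the original comment; the function names and the repetition count k are invented). Each party brute-forces a canonical form of its graph, which costs nothing in the communication model since local computation is free, and the two canonical forms are then compared with the standard public-coin EQUALITY protocol:

```python
import itertools
import random

def canonical_form(adj):
    # Lexicographically smallest adjacency matrix over all vertex
    # relabelings: brute force and exponential-time, but communication
    # complexity charges nothing for local computation.
    n = len(adj)
    best = None
    for perm in itertools.permutations(range(n)):
        relabeled = tuple(tuple(adj[perm[i]][perm[j]] for j in range(n))
                          for i in range(n))
        if best is None or relabeled < best:
            best = relabeled
    return [b for row in best for b in row]  # flatten to a bit string

def public_coin_equality(x, y, k=20, rng=random):
    # Standard public-coin EQUALITY: in each of k rounds, both parties read
    # a shared random string r and Alice sends the single parity bit
    # <x, r> mod 2. If x != y, each round catches the difference with
    # probability 1/2, so the error probability is at most 2**-k.
    assert len(x) == len(y)
    for _ in range(k):
        r = [rng.randrange(2) for _ in x]  # shared public randomness
        if sum(a * b for a, b in zip(x, r)) % 2 != \
           sum(a * b for a, b in zip(y, r)) % 2:
            return False
    return True

def isomorphic_via_equality(G, H, k=20):
    # Graph isomorphism reduces to EQUALITY on canonical forms.
    return public_coin_equality(canonical_form(G), canonical_form(H), k)
```

Each round costs one bit of communication, so a constant k already gives error probability 2^-k: O(1) bits total, independent of the graph size.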

Comment #4 May 17th, 2011 at 12:55 pm

Joshua: DUHHHHHHH [bangs forehead on table]

Thank you. This problem is nothing more than a disguised version of EQUALITY.

Without public randomness, then, you can solve it with log(n) bits, while quantumly log(n) qubits are again necessary and sufficient.

Next step: can we find some less trivial variant of the problem?

(Actually, here’s a variant that occurs to me now: given her graph G, is there an n^{o(1)}-bit randomized message that Alice can generate in polynomial time, such that given Alice’s message together with his graph H, Bob can decide whether G and H are isomorphic with no computational complexity restriction? Conversely, what if we put a computational complexity restriction on Bob, but no restriction on Alice?)

Comment #5 May 17th, 2011 at 2:41 pm

Nice talk. But I did not really see the connection between the subclasses of BQP and modifications of quantum mechanics.

Do these various subclasses really correspond in a natural way to modifications of quantum mechanics?

By way of contrast, LOGSPACE is an interesting subclass of P but we do not really think of it as a modification of the laws of physics.

Comment #6 May 17th, 2011 at 4:11 pm

Jr: It’s a valid point. We can construct complexity classes by restricting BQP in various ways, but then there remains the question of whether there’s any plausible “picture of physical reality” that would give rise to those classes.

I think the case is stronger for some subclasses than others. For example:

- BQP with huge depolarizing noise corresponds to a picture of reality in which there’s some fundamental (gravitational?) decoherence source acting on every qubit, and no possible way to get rid of it.

- BQP with a restricted gate set corresponds to a picture of reality in which the physical Hamiltonian only allows those particular gates.

- The nonadaptive linear-optics model corresponds to a “universe composed entirely of free photons,” with no electrons or anything else to mediate the electromagnetic interaction.

- The separable mixed state model corresponds to … well, as I said, that one’s pretty hard to motivate, except by reference to the original question that gave rise to it (“is entanglement needed for quantum speedups?”)

As Dave Bacon and I discussed a while ago, one condition that you might want to impose on a complexity class C, in order for C to correspond to a “realistic physical model of computation,” is C^C = C. In other words, C with a C oracle should be no more powerful than C itself: the model should be “closed” under subroutines and recursion. Interestingly, P, BPP, BQP, and PSPACE all satisfy that requirement, but NP probably doesn’t satisfy it, and EXP definitely doesn’t.

Unfortunately, for many of the “foil classes” that I discussed in my talk, it’s not even clear what it means to feed those classes an oracle! For example, how do you query an oracle in the nonadaptive linear-optics model, without the oracle transformation itself boosting you up to full universal quantum computation?

Comment #7 May 21st, 2011 at 2:26 pm

Decent talk Scott, but I’d much rather see video of the lecture you gave at CMU. I heard from people who were there that it was electrifying. When will it be available?

Comment #8 May 23rd, 2011 at 6:26 am

First of two comments: Your talk astutely begins with the introductory remark that quantum mechanics looks rigid, in the sense that it is very difficult to modify it and still have a consistent physical universe. It was astute, yes, but too brief. I think more time could have been spent on a similar remark more closely related to your talk: it is very difficult to construct a consistent *computational* universe other than the commonly discussed examples, one of which is quantum computation. As you well know, a complexity class is a simplistic summary of an entire model of computation. A serious model of computation could be expressed as an entire range of complexity classes, which must eventually be self-consistent in various ways.

Some of the alternate models that you discuss hardly seem to point to any reasonable computational universe. A case in point is models with non-linear modifications of the Schrödinger equation. Yes, changes like that could lead to superluminal communication — that is how a physicist might argue that the model isn’t really consistent. But if the amplitudes of such a modified equation are still meant to lead to probabilities, then I have not seen that any such modification could even be computationally consistent.

Certainly the specific results and questions that you address in this type of talk are very interesting. But I have had the sense that in debating these alternate models, you’ve been challenging hospital patients to tennis matches because some people don’t realize that the patients are sick. Yes, you do mention briefly that the patients have physical diseases. But they also seem to have mental diseases, and the syndrome isn’t fully revealed by only looking at separate complexity classes.

Actually in comment number 6, you make a related remark, that it’s not even clear what it means to feed oracles to some of these classes. I would argue that any viable model of computation *must* include interaction with oracles, since after all computer programs can have plug-in subroutines. So you seem to be saying, the class is not sane, but let’s waive that requirement and challenge it anyway.

Comment #9 May 23rd, 2011 at 6:33 am

Comment two of two: Midway through your talk you have on the board the question of whether BQP = BPP^BQNC. So I would ask whether the dihedral hidden subgroup problem is evidence against it. Yes, the algorithm does not take polynomial time; it takes stretched-exponential time f(n) = exp(O(sqrt(n))), where n is the length of the output or the length of one oracle query. But let’s make it polynomial time, in fact linear time, in a variable k = f(n), by the trick of writing the input in a very verbose way. (In any case, the input is just the index to an infinite sequence of finite oracles; we can make the sequence of oracles grow slowly.) Then the fastest classical algorithm for DHSP is not polynomial time. And beyond a classical algorithm, I also do not know an algorithm in BPP^BQNC.

Comment #10 May 23rd, 2011 at 12:19 pm

Hi Greg,

Re your second comment, that’s an interesting example! I hadn’t considered it.

Re your first comment, since it’s not so much that you disagree with me as that you wished I’d stressed certain things more, here’s a point I’ll stress that you probably won’t disagree with. Sure, the foils of BQP that I talked about (separable-mixed-state QC, nonadaptive linear optics, QC with non-collapsing measurements…) are hospital patients with various severe physical and mental disabilities. But they’re the only tennis players I could round up who were even willing to enter the court with BQP! And sometimes they’ve even returned BQP’s volleys in interesting and nontrivial ways (i.e., they’ve led to new results and questions about standard quantum computation). So, since I was asked to speak at a “foils workshop,” what else should I have filled my talk with?

Or maybe you find this entire topic unworthy of a talk—but you do say that you find the specific results and questions interesting. So maybe it would’ve been fine if I’d changed the title to something like “In Search of a Foil for BQP”?

Comment #11 May 23rd, 2011 at 12:37 pm

On balance the topic clearly is worthy of a talk, but yeah, I would suggest a different emphasis. Or at least, a different emphasis is in my mind when I hear the talk. The talk opens with a dichotomy between physics and computer science, one that plays to the viewpoint that quantum mechanics is some computationally exploitable physics that (maybe) has been offered to computer scientists.

Another viewpoint is that quantum probability is a common mathematical basis for both quantum mechanics and quantum computation. Of course, from that viewpoint, if you wanted to replace quantum probability by something else, you would want something logically viable. After all, no one thinks of randomized computation as a resource magically created by physics, even though it can be exactly that, if you use a physical random number generator.

Really my position is similar to that of Jr. In more explicit terms, if I had to fill in for your talk and I were discussing, say, Tree-BQP, I might say this: “From the outset, Tree-BQP vs. BQP doesn’t look like a fair comparison. BQP comes from a reasonable computational model and a reasonable physical model. Whereas Tree-BQP isn’t part of a consistent computational universe or physical universe, as far as anyone knows. Which indeed could be a message of the following theorem…”

Comment #12 May 23rd, 2011 at 12:42 pm

To be even more specific: Dividing a Hilbert space into qubits or qudits is negotiable. For instance, if you have an atomic crystal, you could regard the position of each nucleus as a separate qudit. Or you could instead describe the same information with photon occupation numbers. Does Tree-BQP even equal itself if you shift between such descriptions?

Comment #13 May 23rd, 2011 at 1:25 pm

@Greg Kuperberg #8

>A serious model of computation could be expressed as an

>entire range of complexity classes which must eventually be

>self-consistent in various ways.

If I understand correctly, C^C = C is one of these ways. What are the other ways you’re thinking of?

Comment #14 May 23rd, 2011 at 2:33 pm

Greg: TreeBQP should be invariant under “local” transformations of the state (e.g., from qubits to ququarts and vice versa), but I don’t know if it’s likewise invariant under the “nonlocal” transformation from qubits to qutrits. (It might be; I just don’t see a proof right now.)

As I pointed out in the talk, quantum computing with separable mixed states isn’t known to be invariant under any of these transformations (even between real and complex amplitudes).

On the other hand, at least one of the models I talked about—the one-clean-qubit model—is surprisingly robust (among other things, it equals the 2-clean-qubit model, the 1-clean-qutrit model, and so on).

To play devil’s advocate, suppose BQP had been invented before quantum mechanics was discovered. Then I imagine someone would’ve said: “sure, that’s an interesting complexity class, but it obviously isn’t part of any consistent physical universe or computational universe, since you had to introduce this bizarre ‘measurement’ rule that isn’t part of the theory’s dynamical content…”

And yes, there would’ve been reasonable responses to such a person. All I’m saying is that, once you’ve decided to play this game of imagining how the “laws of efficient computation” could be other than we currently believe they are, you have to be willing to break something! And, if you always find yourself breaking too much, then maybe the end result really will be increased confidence that the laws couldn’t have been otherwise. But it’s not as if the effort that’s gone into thinking up BQP alternatives has been that enormous…

Comment #15 May 23rd, 2011 at 3:38 pm

Scott – Well you make a good point. And actually, when I look at the video for this version of your talk, I’m getting the feeling that I’m arguing against a straw man, to some extent. I remember an earlier version of this talk which was aimed much more directly at QC skeptics. The debate then, as you set it up in your talk, was somehow more off-the-wall, to a degree that I found confusing along the lines of what I’ve been saying. But this version seems to offer more sober reasons for comparing these classes, and as you say, you already have comments about how some of these models are contrived.

simple mind – Well, that’s a good question too. I would suppose that an equation like C^C = C is not enough, for one reason because you would want reasonable restrictions on what it should mean to exponentiate complexity classes, or in other words what it should mean to call an oracle. I don’t have any great ideas for such axioms, nor for what else should be required.

Anyway, Scott, one particular reasonable response to the devil’s advocate criticism of BQP is that unitary operations and measurements are the wrong definition of quantum probability/computing/mechanics to begin with! It’s the simplest definition, but it’s simple to the point of being simplistic. A better definition is that the gates or operators should be general quantum operations. If you do it this way, then measurement IS part of the theory’s dynamical content, and observers can also exist on the dynamical stage. Besides, the definition using unitary gates does not yield the correct definition of QSPACE(n^α) for any fixed α (unless you allow measurement-adapted quantum gates, I guess), because Stinespring’s theorem is not a space-efficient construction.

Comment #16 May 23rd, 2011 at 3:50 pm

Okay, here is a remark derived from a discussion with Gil Kalai. Even though in your talk “foils” evolved from “QC skepticism at all costs” into something more robust, this remark does illustrate how a desire to restrict QC can be undermined by closure properties.

Suppose that we allow QC circuits of mixed bits and qubits, together with qubit creation and measurement gates. If at certain time intervals, a quantum executioner comes along and destroys all of the qubits but leaves the bits, then you get (I think) BPP^BQNC, assuming that the executioner allows only a polylog amount of time between death sweeps.

But you could argue that that is too contrived or too unfair. Suppose instead that the executioner works asynchronously, so that there does not exist a quantum path through the circuit of more than polylog depth. Or heck, of no more than constant depth. Then, remark: Using quantum teleportation, you can get all of BQP! Teleportation is the quantum Twitter that keeps the resistance alive even though every rebel is soon executed.
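The teleportation remark can be illustrated with a small state-vector simulation (a sketch added here, not part of the original comment; the gate builders, index conventions, and seed are invented for illustration). A Bell measurement destroys the sender’s qubits, yet the two classical bits plus Pauli fixups let a fresh qubit carry the state forward, which is how a constant-depth quantum lifetime can still support deep computations:

```python
import numpy as np

# One-qubit gates; kron() strings them into an operator on all three qubits,
# with qubit 0 as the most significant bit of the state index.
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

def cnot(control, target, n=3):
    # Permutation matrix for CNOT on an n-qubit register.
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for basis in range(dim):
        bits = [(basis >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        U[sum(b << (n - 1 - q) for q, b in enumerate(bits)), basis] = 1.0
    return U

def teleport(psi, seed=0):
    # Teleport the state psi of qubit 0 onto qubit 2: Bell pair on qubits
    # 1 and 2, Bell measurement of qubits 0 and 1, classical Pauli fixups.
    rng = np.random.default_rng(seed)
    state = np.kron(np.asarray(psi, dtype=complex), [1.0, 0.0, 0.0, 0.0])
    state = cnot(1, 2) @ (kron(I2, H, I2) @ state)   # make the Bell pair
    state = kron(H, I2, I2) @ (cnot(0, 1) @ state)   # Bell-basis rotation
    probs = np.abs(state) ** 2
    # Sample a joint outcome, but only qubits 0 and 1 are measured: keep
    # qubit 2's conditional state given the classical bits (m0, m1).
    outcome = rng.choice(8, p=probs / probs.sum())
    m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
    q2 = state[[m0 * 4 + m1 * 2, m0 * 4 + m1 * 2 + 1]]
    q2 = q2 / np.linalg.norm(q2)
    if m1:                       # classical corrections: X^m1 then Z^m0
        q2 = X @ q2
    if m0:
        q2 = Z @ q2
    return q2                    # equals psi up to global phase
```

Note that every qubit here lives through only a constant-depth slice of the circuit before being measured, yet the state survives: the executioner’s sweeps can’t stop the hand-off.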

Comment #17 May 24th, 2011 at 1:29 am

One interesting question is what complexity class is most appropriate to describe the power of quantum computers themselves. Perhaps QSAMPLE (the class of probability distributions that a quantum computer can achieve) is more appropriate than BQP? There are some indications that QSAMPLE is more powerful than BQP, and perhaps it is more natural to regard nature in terms of feasible probability distributions rather than in terms of decision problems. (Both BQP and QSAMPLE involve putting some “classical” ingredient into the quantum process. Perhaps what quantum computers really can do should be expressed by an even larger class. You can simply consider the class QEVU of quantum evolutions which can be described by quantum computers.)

The question of which computational complexity classes hypothetically describe a “realistic physical model,” even in a very non-realistic sense, is indeed very interesting. A major difficulty is that relating physics to computational complexity is rather hard; physics seems much closer to quantum error-correction.

Another “computational class” (in a loose sense of the term) of interest to quantum skeptics is the following: consider the power of noisy quantum computers, with an arbitrarily small but positive level of noise, just after Shor’s algorithm was found (and after the Bernstein-Vazirani paper) but before quantum error-correction and fault-tolerance were discovered. Such “computers” do not represent an asymptotic exponential speedup, but for the problem of factoring large numbers, which is infeasible on classical computers, “all you need” is to lower the noise level below some fixed positive constant. This suggests that even before the “threshold theorem” (and certainly after it), the “first-order” reason for skepticism is that the noise level for quantum computers (which actually carry out complicated quantum evolutions) will scale up with the number of qubits.

Comment #18 May 26th, 2011 at 9:24 am

Helen Agee #7:

“Decent talk Scott, but I’d much rather see video of the lecture you gave at CMU.”

I just checked in with the CMU folks; they say that they’re working on it and the video will be available soon.

Comment #19 May 26th, 2011 at 4:09 pm

In 2008, Scott Aaronson said that D-Wave trying to line up customers was comically premature.

http://www.scottaaronson.com/blog/?p=306

In 2011, D-Wave announced the sale of their D-Wave One quantum computer to Lockheed Martin.

http://www.dwavesys.com/en/pressreleases.html#lm_2011

Lockheed Martin Corporation (NYSE: LMT) has entered into an agreement to purchase a quantum computing system from D-Wave Systems Inc.

Comment #20 May 26th, 2011 at 4:47 pm

Brian, I don’t see any contradiction. D-Wave hasn’t yet demonstrated the ability to do something of commercial value that couldn’t be done just as well with a classical computer. The fact that they got Lockheed Martin to give them money is no more relevant to the question of whether they can actually get quantum speedups than the fact that they got venture capitalists to give them money, or (say) the fact that Lockheed Martin gets the government to give it money for equipment the latter is never going to use. Indeed, this is a perfect example of the sort of changing the subject (from the technical to the sociological) that’s made many of us tear our hair out over the past four years!

As I wrote here last week, I was genuinely thrilled to see D-Wave finally publish a paper that starts to address the relevant questions, and I immediately gave credit where it was due. I hope D-Wave succeeds at scaling things up and demonstrating entanglement and other characteristic quantum effects. But it’s a long way from a quantum-mechanical temperature dependence in 8 qubits to anything commercially useful, and it’s an insult to people’s intelligence to expect them to take the former as evidence for the latter.

Comment #21 May 29th, 2011 at 11:42 am

Is this a possible foil for BQP? One version of quantum gravity relies upon the Wheeler-DeWitt equation, which is basically a Hamiltonian constraint at each spatial point. If you don’t like continua, try a lattice regularization instead. I don’t know if a lattice regularization is permissible, but it can give rise to a “fantasy world,” whether or not it corresponds to reality. At each point on the lattice, write down a local Hamiltonian constraint which depends only on its k-neighbors. Unlike a usual local Hamiltonian, which is a sum of local terms and can take on any eigenvalue, quantum gravity is different: we have to impose the Hamiltonian constraint, insisting that any solution is an eigenstate with eigenvalue 0 of the Hamiltonian constraint at every point. I think such models have been shown to be QMA-complete for the case where all the constraints commute with each other, but in quantum gravity, the Hamiltonian constraint at a node won’t commute with the Hamiltonian constraints of its neighbors. Time disappears, so I don’t know how you can define polynomial time, but do we get QMA, BQP, or something in between when restricted to physical observables?
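As a toy illustration of this constraint picture (a sketch added here, not part of the original comment; the two operators below are invented and are vastly simpler than any real lattice Wheeler-DeWitt constraint), one can compute the simultaneous null space of noncommuting “Hamiltonian constraints” directly:

```python
import numpy as np

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
I2 = np.eye(2)

def joint_null_space(constraints, tol=1e-9):
    # Stack all constraint operators and take the SVD; the right-singular
    # vectors with (numerically) zero singular value span the subspace
    # annihilated by every constraint simultaneously.
    stacked = np.vstack(constraints)
    _, s, vh = np.linalg.svd(stacked)
    rank = int(np.sum(s > tol))
    return vh[rank:].conj().T  # columns form an orthonormal basis

# Two noncommuting two-qubit "constraints" (invented for illustration):
H1 = np.kron(Z, I2) - np.kron(I2, Z)   # kernel: span{|00>, |11>}
H2 = np.kron(X, X) - np.eye(4)         # kernel: the XX = +1 eigenspace
basis = joint_null_space([H1, H2])
```

Here the two constraints fail to commute, yet a one-dimensional joint null space (the Bell state (|00⟩+|11⟩)/√2) survives; whether such constrained models give QMA, BQP, or something in between is exactly the open question raised above.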

Comment #22 June 16th, 2011 at 12:42 pm

http://arstechnica.com/science/news/2011/06/adiabatic-quantum-computers-may-get-speed-boost-by-adding-useless-qubits.ars

(says ‘almost’ there to NP-complete)