The Orion Quantum Computer Anti-Hype FAQ

Grudgingly offered for your reading pleasure, and in the vain hope of forestalling further questions.

Q: Thanks to D-Wave Systems — a startup company that’s been in the news lately for its soon-to-be-unveiled “Orion” quantum computer — is humanity now on the verge of being able to solve NP-complete problems in polynomial time?

A: No. We’re also not on the verge of being able to build perpetual-motion machines or travel faster than light.

Q: But couldn’t quantum computers try all possible solutions in parallel, and thereby solve NP-complete problems in a heartbeat?

A: Yes, if the heart in question were beating exponentially slowly.

Otherwise, no. Contrary to widespread misconception, a quantum computer could not “try all possible solutions in parallel” in the sense most people mean by this. In particular, while quantum computers would apparently provide dramatic speedups for a few “structured” problems (such as factoring integers and simulating physical systems), it’s conjectured that they couldn’t solve NP-complete problems in polynomial time.
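
For concreteness, here’s a minimal back-of-the-envelope sketch (purely illustrative, not tied to any particular hardware) of why “try everything in parallel” fails: measuring a uniform superposition over all N = 2^n candidates returns a random candidate, exactly like classical guessing, and Grover’s algorithm improves on brute-force search only quadratically.

    # Purely illustrative: the "parallelism" of a uniform superposition buys
    # nothing by itself, and Grover's algorithm gives only a quadratic speedup.
    import math

    n = 20          # bits in a candidate solution
    N = 2 ** n      # number of candidate solutions

    # "Try everything in parallel": each candidate has amplitude 1/sqrt(N),
    # so a measurement finds the (single) correct one with probability 1/N,
    # no better than one classical random guess.
    p_naive = (1 / math.sqrt(N)) ** 2
    print(f"Pr[one measurement finds the answer] = {p_naive:.2e}  (= 1/N)")

    # Expected oracle queries: classical brute force vs. Grover search.
    print(f"classical brute force: ~{N // 2} queries on average")
    print(f"Grover's algorithm:    ~{int(math.pi / 4 * math.sqrt(N))} queries")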

Q: But isn’t factoring an NP-complete problem?

A: Good heavens, no! While factoring is believed to be intractable for classical computers, it’s not NP-complete, unless some exceedingly unlikely things happen in complexity theory. Where did you get the idea that factoring was NP-complete? (Now I know how Richard Dawkins must feel when someone asks him to explain, again, how “life could have arisen by chance.”)

Q: How could the people at D-Wave not understand that quantum computers couldn’t solve NP-complete problems in polynomial time?

A: To his credit, Geordie Rose (the founder of D-Wave) does understand this; see here for example. And yet, essentially every article I’ve read about D-Wave gives readers exactly the opposite impression. The charitable explanation is that the D-Wave folks are being selectively quoted or misquoted by journalists seeking to out-doofus one another. If so, one hopes that D-Wave will try harder in the future to avoid misunderstandings.

Q: But even if it gave only polynomial speedups (as opposed to exponential ones), couldn’t the adiabatic quantum computer that D-Wave built still be useful for industrial optimization problems?

A: D-Wave’s current machine is said to have sixteen qubits. Even assuming it worked perfectly, with no decoherence or error, a sixteen-qubit quantum computer would be about as useful for industrial optimization problems as a roast-beef sandwich.

Q: But even if it wasn’t practically useful, wouldn’t a 16-qubit superconducting quantum computer still be a major scientific advance?

A: Yes, absolutely.

Q: So, can D-Wave be said to have achieved that goal?

A: As Dave Bacon pointed out earlier, it’s impossible to answer that question without knowing more about how their machine works. With sixteen qubits, a “working demo” doesn’t prove anything. The real questions are: how high are the fidelities, and what are the prospects for scalability?

Q: But clearly D-Wave isn’t going to give away its precious trade secrets just to satisfy some niggling academics! Short of providing technical specifics, what else could they do to make computer scientists take them seriously?

A: Produce the prime factors of

1847699703211741474306835620200164403018549
3386634101714717857749106516967111612498593
3768430543574458561606154457179405222971773
2524660960646946071249623720442022269756756
6873784275623895087646784409332851574965788
4341508847552829818672645133986336493190808
4671990431874381283363502795470282653297802
9349161558118810498449083195450098483937752
2725705257859194499387007369575568843693381
2779613089230392569695253261620823676490316
036551371447913932347169566988069.
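
(Verifying any claimed factorization is the easy part; producing it is the hard part. A minimal checker, sketched in Python with a probabilistic Miller-Rabin primality test and the claimed factors left as placeholders:)

    # Check a claimed factorization N = p * q with p, q (probably) prime.
    # The claimed factors p and q are placeholders to be filled in.
    import random

    def is_probable_prime(n, rounds=40):
        if n < 2:
            return False
        for small in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
            if n % small == 0:
                return n == small
        d, s = n - 1, 0
        while d % 2 == 0:          # write n - 1 = d * 2^s with d odd
            d //= 2
            s += 1
        for _ in range(rounds):    # Miller-Rabin with random bases
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def check_claim(N, p, q):
        return p * q == N and is_probable_prime(p) and is_probable_prime(q)

    # N = int("18476997032117414743068356202001644030185493...")  # number above
    # print(check_claim(N, p, q))   # p, q: whatever factors are produced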

Q: Alright, what else could they do?

A: Avoid talking like this:

The system we are currently deploying, which we call Trinity, is a capability-class supercomputer specifically designed to provide extremely rapid and accurate approximate answers to arbitrarily large NP-complete problems … Trinity has a front-end software interface, implemented in a combination of Java and C, that allows a user to easily state any NP-complete problem of interest. After such a problem has been stated the problem is compiled down to the machine language of the processors at the heart of the machine. These processors then provide an answer, which is shuttled back to the front end and provided to the user. This capability can of course be called remotely and/or as a subroutine of some other piece of software.

Or to translate: “Not only have we built a spaceship capable of reaching Pluto in a few hours, our spaceship also has tinted windows and deluxe leather seats!” If I were them, I’d focus more on the evidence for their core technological claims, given that those claims are very much what’s at issue.

Q: While Dave Bacon also expressed serious doubts about the Orion quantum computer, he seemed more enthusiastic than you are. Why?

A: Generous and optimistic by nature, Dave strives to give others the benefit of the doubt (as the Chinese restaurant placemat would put it). Furthermore, as Quantum Pontiff, he’s professionally obligated to love the quantum sinner and forgive the wayward quantum sheep. And these are all wonderful qualities to have. On the other hand, when the hype surrounding some topic crosses a certain threshold, arguably a pricklier approach becomes called for.

Q: If D-Wave fizzles out, will many journalists and policymakers then conclude that quantum computing is bunk?

A: It doesn’t seem unlikely.

Q: What would it take to get these people to recognize the most elementary distinctions?

A: That’s the question, isn’t it?

Update (2/13): Lawrence Ip, my friend from Berkeley who now works at Google, went to the D-Wave “launch” in person and kindly provided the following report.

I just came back from the D-Wave announcement. Unfortunately I had to leave at the start of the Q&A session.

Here’s what I took away from it (minus the marketing fluff):

– They claim to solve integer programming problems on their system. Geordie Rose was careful to explicitly say that they are only hoping to see a quadratic speedup. Herb Martin (the CEO) wasn’t quite as careful in his opening remarks but then he’s the “suit”. Geordie said that their current chip is not a universal QC (presumably because their space of Hamiltonians is limited) but with some work they expect to be able to make it universal.

– Geordie compared their approach to the efforts in academia, likening it to Celera versus the Human Genome Project. He said they were trying to get something that would scale fast and worry about the quality of the qubits later. He contrasted this with the academic community’s efforts to achieve fine control over qubits before scaling up. They say that they hope to reach 1024 qubits by the end of 2008.

– They demoed 3 “real-world” problems where they used their system as essentially a blackbox IP solver.
– Searching for protein matches. Given a protein, try to find the closest match in a protein database. They serially fed all the items from the database to the chip and asked it to score the match against the given protein. They said it was solving a maximum independent set problem.
– Finding the best seating arrangement at a wedding reception. Given constraints on which people can’t be seated together and who wants to sit together, find the optimal arrangement.
– Solving a Sudoku puzzle.

– At one point Geordie quoted you. He excerpted a paragraph from your SIGACT article (the one where you talk about generating Shakespeare’s 38th play) and mentioned your proposal that the inability to solve NP-hard problems efficiently be regarded as a physical law. As far as I can remember, the only other computer scientist he quoted was Gordon Moore, so you’re in good company. 🙂
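
For readers wondering how demos like the ones above end up on Ising-type hardware at all: the textbook route is to encode the problem as a quadratic binary optimization (QUBO), the kind of objective an adiabatic optimizer natively minimizes. A minimal sketch of the standard maximum-independent-set encoding (purely illustrative; this has nothing to do with D-Wave’s actual software):

    # Textbook QUBO encoding of Maximum Independent Set: minimize
    #   -sum_i x_i + penalty * sum_{(i,j) in E} x_i * x_j,   x_i in {0,1}.
    # Any black-box minimizer of quadratic binary objectives (quantum or
    # classical) can then be applied.  Brute force suffices for a toy graph.
    from itertools import product

    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # toy 4-vertex graph
    n, penalty = 4, 2.0                                # penalty > 1 enforces independence

    def qubo_energy(x):
        return -sum(x) + penalty * sum(x[i] * x[j] for i, j in edges)

    best = min(product((0, 1), repeat=n), key=qubo_energy)
    print("lowest-energy assignment:", best)   # -> (0, 1, 0, 1), i.e. the set {1, 3}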

65 Responses to “The Orion Quantum Computer Anti-Hype FAQ”

  1. Greg Kuperberg Says:

    If D-Wave fizzles out, will many journalists and policymakers then conclude that quantum computing is bunk?

    Actually, most journalists are already in a superposition (*) of both believing in quantum computing and being skeptical of it. Their thinking (**) is in a delocalized haze which will only decohere further with this irresponsible hype from D-Wave.

    (*) Technically speaking, a classical superposition.

    (**) Admittedly their thinking is roughly the same with respect to all unproven computer technology. It is particularly frustrating that they blur the difference between Moore’s Law and improved algorithms, much less expanded complexity classes.

    As I like to ask my son rhetorically: If you want to factor a large number, would you rather have the computers of 1970 and the algorithms of 2005, or vice versa?
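
    (To put a formula behind that rhetorical question: the best classical factoring algorithm known by 2005, the general number field sieve, runs in heuristic time

        $$ L_N \;=\; \exp\!\Big( \big(\tfrac{64}{9}\big)^{1/3} (1+o(1)) \,(\ln N)^{1/3} (\ln\ln N)^{2/3} \Big), $$

    which is sub-exponential in the bit-length of N; algorithmic gains of that kind dwarf any constant-factor hardware improvement once N is large.)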

  2. Lawrence Ip Says:

    Hey Scott, I’m planning to stop by their demo on Tuesday on the way to work (it’s just down the road from Google). I’ll try to take some notes and report back with a summary of what they say.

  3. Geordie Says:

    Greg:

    That’s a pretty strong opinion (“irresponsible hype”) to have without actually knowing anything about the effort or what we’ve achieved. I would suggest that presenting scathing reviews without any evidence is also irresponsible, in that your readers may assume you know something about our stuff, which you don’t.

    I have heard this sort of criticism before and I find it puzzling. I’m assuming you would prefer that these machines exist vs. not. If so, do you think they’re just going to magically appear? Someone has to actually build them…shouldn’t you be overjoyed that at least someone is actually trying?

  4. Chris Says:

    I believe DWave’s demonstration is meant to be a proof of principle demonstration only.

    “what else could they do to make computer scientists take them more seriously?”

    Although the 16 qubit chip may seem as interesting as a “roast beef sandwich”, it is only meant to whet computer scientists’ appetites until the real scaled-up sandwich is done.

    http://www.dicksbbq.com/images/photoswip/other/09-sandwich.jpg

    We’re all hungry…but I for one can be appeased with an appetizer until the real sandwich is done.

  5. James Holloway Says:

    This is most interesting…

    EE Times is hardly a metascience rag. Yet they wrote up a rather optimistic article (link below).

    http://www.eetimes.com/news/latest/showArticle.jhtml;jsessionid=ZBUDS4YYPMT1WQSNDLSCKHA?articleID=197004661

    Sometimes advances in computing can make “quantum” leaps…

    Sorry, I couldn’t resist.

    JimH

  6. Scott Says:

    Hey Scott, I’m planning to stop by their demo on Tuesday on the way to work (it’s just down the road from Google).

    Thanks so much, Lawrence! Hope you’re doing well.

  7. Scott Says:

    Although the 16 qubit chip may seem as interesting as a “roast beef sandwich”, it is only meant to whet computer scientists’ appetites until the real scaled-up sandwich is done.

    Chris: My point was that, if they’re only going to whet people’s appetites, then they have to let people into the kitchen to see for themselves whether anything bigger is cooking.

    When Chuang et al. used a liquid-NMR quantum computer to factor 15, it wasn’t a major result because of the answer (which I believe was 3×5), but because they told everyone exactly how they got it and what they learned along the way.

  8. Altman Says:

    Hi Scott – It’s possible to register online to review video footage of the event if you’re not on-site:

    ” On Thursday February 15 at 4:00 PM, we will post video of the event at the Computer History Museum available online for those who have completed the following form. “

  9. Altman Says:

    ( … aforementioned link )

  10. Scott Says:

    It’s possible to register online to review video footage of the event if you’re not on-site

    OMIGOD OMIGOD OMIGOD! And here I was, fretting that I’d be left out of this historic moment for humankind…

  11. Altman Says:

    Given that no referees will be present and no technical details of the experiment will be discussed, attendance would be of limited utility for verification in any case. Discussion of fidelity, 1/f noise, Rabi oscillations, T2, etc. has not yet been provided as support for any sort of unitary control over the system.

  12. Greg Kuperberg Says:

    That’s a pretty strong opinion (“irresponsible hype”) to have without actually knowing anything about the effort or what we’ve achieved.

    Hats off to you if you have done something great, but I know that you haven’t achieved this:

    The system we are currently deploying, which we call Trinity, is a capability-class supercomputer specifically designed to provide extremely rapid and accurate approximate answers to arbitrarily large NP-complete problems.

    How do I know? Because there is a theorem that extremely accurate approximate answers to arbitrary NP-hard problems are just as difficult as exact answers. The most that you can hope for is extremely accurate answers to a few lucky NP-hard problems, and only vaguely accurate answers to most NP-hard problems.
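
    (The theorem presumably being invoked is PCP-style hardness of approximation. For instance, Håstad’s theorem for MAX-3SAT says that for every $\varepsilon > 0$,

        $$ \text{approximating MAX-3SAT to within a factor } \tfrac{7}{8}+\varepsilon \text{ is NP-hard}, $$

    so a polynomial-time algorithm giving extremely accurate answers on all instances would already imply $\mathrm{P}=\mathrm{NP}$.)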

    You seem to be aware that BQP is unlikely to contain NP. However, if you’re going to hype your stuff, it is not good enough to step back from things that you have learned are impossible, but then promise everything else. If you don’t respect the huge gap between “I can do it myself” and “I haven’t learned that it’s impossible”, then that is indeed irresponsible.

  13. Dave Bacon Says:

    Did you just call me a Chinese restaurant placemat? 🙂 Mmmmm…..Sweet and Sour Pork.

  14. Matt Leifer Says:

    Scott, I think you are underestimating the value of roast beef sandwiches for industrial optimization. If all staff in a company, aside from vegetarians, were given free roast beef sandwiches for lunch it would almost certainly improve efficiency, because so many employees, especially management, don’t stop for a proper lunch these days. I’d be willing to bet that the improvement would be at least an order of magnitude greater than anything that they could obtain from a D-Wave computer.

  15. rrtucci Says:

    I really missed my calling. If I had gone into science journalism instead of physics, I’d be considered a demigod by now. I can see what my NYT headline for the Orion Demo story would have been:
    St. Valentine’s day Massacre. Aaronson wilts Rose with one breath.

  16. Domber’s Basecamp » Blog Archive » Quantum Computer to solve NP-hard problems? Says:

    […] Read more about the “The Orion Quantum Computer Anti-Hype FAQ” here. I quote some, especially for me and you, interesting information here: Q: But couldn’t quantum computers try all possible solutions in parallel, and thereby solve NP-complete problems in a heartbeat? […]

  17. Scott Aaronson on the “Orion” Quantum Computer « Growth Rate n lg n Says:

    […] Scott Aaronson on the “Orion” Quantum Computer I haven’t really been following the press hype around D-Wave and their claims to have developed a quantum computer. Quantum Computing isn’t really part of my bio, but Scott Aaronson has an excellent FAQ up on his web-page that waters down much of the hype. […]

  18. Neratin Says:

    Funny, news release is already there…

    http://www.dwavesys.com/index.php?page=news

    “VANCOUVER, B.C. or MT. VIEW, CA – February 13, 2007 – The world’s first commercially viable quantum computer was unveiled and demonstrated today in Silicon Valley by D-Wave Systems, Inc., a privately-held Canadian firm headquartered near Vancouver.”

  19. RB Says:

    Neratin, do you know what time (and time zone) the demonstration will happen?

  20. Neratin Says:

    I have no idea – but it’s 8 am in California now (9 time zones away from the place I sit), so I guess it will happen in the next 8 hours 😉

  21. The Quantum Pontiff » D-wave In D-news Says:

    […] Scott Aaronson called me a Chinese restaurant placemat in his The Orion Quantum Computer Anti-Hype FAQ. Doug Natelson, who gave an excellent talk here at UW a few weeks ago, poses three questions about the D-wave demo. Peter Rhode is every bit the skeptic and beats out Doug Natelson with four points. Ars-technica’s Chris Lee takes a shot at explaining adiabatic quantum computation and uses the word deathmatch here. You can find a bad quantum computing joke at the end of this blog post. I find this post amusing, if for nothing more than bringing politics into quantum computing. Coherence* remains the prettiest quantum computing website and has a choice Seth Lloyd comment “I’ll be a bit of a skeptic until I see what they have done. I’m happy these guys are doing it. But the proof of the pudding is in the eating.” […]

  22. matsu » Blog Archive » First Commercial Quantum Computer (?) Says:

    […] The Orion Quantum Computer Anti-Hype FAQ – Shtetl-Optimized […]

  23. Sujit Says:

    Hm, that’s frustrating. I registered at the link Altman provided above to view the video footage, and all I got was a generic message confirming that I’d registered and to please “visit again Tuesday for our new website.” It is Tuesday (and soon to be Wednesday, here in Philly!).

    Does anyone know how to view the video footage? (They really ought to put this up on YouTube…)

  24. Test riuscito ? Says:

    […] You can find some doubts from The Quantum Pontiff, an assistant professor in the Quantum Computing Theory Group at the University of Washington, and from Scott Aaronson of the University of Waterloo. […]

  25. Roger Davenport Says:

    Found some videos of the presentation on Youtube…

    http://www.youtube.com/profile_videos?user=LUXVIBES

    Enjoy!

  26. joe Says:

    And here it is again: in the latest press release from D-Wave, http://www.dwavesys.com/index.php?mact=News,cntnt01,detail,0&cntnt01articleid=4&cntnt01origid=15&cntnt01returnid=21, they claim:

    Quantum-computer technology can solve what is known as “NP-complete” problems. These are the problems where the sheer volume of complex data and variables prevent digital computers from achieving results in a reasonable amount of time.

    WTF.

  27. Scott Says:

    It sounds to me like the D-wave publicity people simply need to be hooked up to a mild electric current.

    “Quantum computer technology can solve what is known as NP-complete” — ZZZT — “certain NP-complete” — ZZZT — “can find approximate solutions to” — ZZZT — “alright, look, there’s this adiabatic algorithm, which can often get exponential” — ZZZT — “polynomial speedups…”

  28. Kurt Says:

    If D-Wave was a publicly traded company, they’d probably be a lot more careful about the kind of language they use. Right now they don’t have to worry about shareholder lawsuits. But if they don’t go out of business first, presumably at some point they will go public, and then they will be much more accountable for any ‘forward looking’ statements they make.

  29. The Quantum Pontiff » All D-wave, All the Time…P and S-wave’s Getting Jealous Says:

    […] Hoisted from DaOptimizer’s comments, YouTube videos of d-wave’s announcement. Also, Lieven Vandersypen (who factored 15, which sounds simple, but if you think the factoring experiment done on IBM’s NMR machine was simple, then you haven’t seen their pulse sequence!) weighs in on D-wave here. […]

  30. randform » Blog Archive » commercial quantum computers Says:

    […] For the readers’ convenience, Scott Aaronson of Shtetl-Optimized also hands out “The Orion Quantum Computer Anti-Hype FAQ”. […]

  31. Mamling Says:

    Physics and journalists are an impedance mismatch. Decades ago journalists found great difficulty writing about the 77K superconductors without mentioning maglev trains.

  32. vzn Says:

    hi scott/all. I think rose is pretty brilliant & I agree with his “quick-and-dirty” approach and that it may be superior to academic approaches, which seem to me to have stalled over the last few years. nothing like the brutal capitalistic market, and venture capitalist sharks, both breathing down your neck to produce quantifiable & even significant results.

    check out theory-edge for further discussion & links.

    http://groups.yahoo.com/group/theory-edge/

  33. Scott Says:

    vzn: I completely agree with you. When it comes to claims of having built scalable quantum computers, D-Wave leaves academics in the dust. It’s not that we don’t try — but in today’s fast-paced marketplace, trying to compete while weighed down by outdated concepts like “peer review,” “verifiability,” and “truth” is fighting with both hands tied behind your back.

  34. Ze Says:

    Scott: Nice work 🙂 (especially that last comment). I’ve got a computer science degree, and your blog and the Quantum Pontiff are both interesting 🙂. I may not be studying quantum computing (I’m more interested in machine learning and non-quantum algorithms, including heuristics) but it’s always interesting to see what’s going on within other areas of computer science.

  35. Geordie Says:

    Scott: I’m pretty sure I could take you even without your hands tied behind your back 🙂

    Simple question: Why are you so grumpy? We all love you out here in lotus land.

  36. Scott Says:

    Simple question: Why are you so grumpy?

    Geordie, probably the best way to answer that question is to quote from today’s Economist.

    On paper at least, quantum computers promise to reduce dramatically the time needed to solve a range of mathematical tasks known as NP-complete problems. One famous example is the travelling salesman problem–finding the shortest route between several cities. This is a puzzle that increases exponentially in complexity with the number of cities considered. The reason is that every possible permutation needs to be looked at in order to find the best.

    Quantum computers provide a neat shortcut to solving such problems. They do so by encoding all possible permutations in the form of a small number of “qubits”. In a normal computer, bits of digital information are either 0 or 1. In a quantum computer these normal bits are replaced by a “superposition” (the qubit) of both 0 and 1 that is unique to the ambiguous world of quantum mechanics … That means huge calculations can be done using a manageable number of qubits. In principle, by putting a set of entangled qubits into a suitably tuned magnetic field, the optimal solution to a given NP-complete problem can be found in one shot.

    I’ve spent pretty much my entire career trying to explain to people why the above picture of quantum computing is false — not trivially false, but deeply and profoundly false. The past few days have driven home how much impact it’s had. So, that’s one reason why I’m grumpy.

    A second reason is that, while you might be doing something interesting with superconducting qubits, it’s difficult to know without more details whether you’re even seeing a quantum effect at all.

    Alright, so let me ask you a question. You say you’re seeing a speedup in your experiments — but (1) how big of a speedup, and (2) do you mean compared to the best-known classical algorithms (like simulated annealing), or compared to brute-force search?
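
    (For anyone unfamiliar with the baseline in question: simulated annealing on an Ising/QUBO objective is only a few lines of code, which is part of why “faster than brute force” is such a weak claim. A minimal, untuned sketch, purely for illustration:)

        # Minimal simulated annealing for a QUBO: minimize sum_{i,j} Q[i,j] x_i x_j
        # over x in {0,1}^n.  This is the sort of cheap classical baseline that any
        # claimed speedup should be measured against.  Parameters are arbitrary.
        import math
        import random

        def anneal(Q, n, steps=20000, t_start=2.0, t_end=0.01):
            def energy(x):
                return sum(Q.get((i, j), 0.0) * x[i] * x[j]
                           for i in range(n) for j in range(n))
            x = [random.randint(0, 1) for _ in range(n)]
            e = energy(x)
            best_x, best_e = x[:], e
            for step in range(steps):
                t = t_start * (t_end / t_start) ** (step / steps)   # geometric cooling
                i = random.randrange(n)
                x[i] ^= 1                                           # propose a bit flip
                e_new = energy(x)
                if e_new <= e or random.random() < math.exp((e - e_new) / t):
                    e = e_new                                       # accept the move
                    if e < best_e:
                        best_x, best_e = x[:], e
                else:
                    x[i] ^= 1                                       # reject: flip back
            return best_x, best_e

        # toy instance: minimize -x0 - x1 - x2 + 2*x0*x1 + 2*x1*x2  (optimum: x0 = x2 = 1)
        Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1, (0, 1): 2, (1, 2): 2}
        print(anneal(Q, 3))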

  37. Barbara Terhal Says:

    I would like to add some new critical perspective on D-Wave’s implementation of quantum computation. D-Wave is using Josephson junction qubits to implement adiabatic quantum computation. The physical Hamiltonian describing the qubits is most likely one described in terms of generalized position and momentum, consisting of a potential term and a kinetic term in these variables. Such a Hamiltonian is an example of a stoquastic Hamiltonian (off-diagonal terms are non-positive in the standard (here x) basis), and thus the Orion computer implements stoquastic adiabatic computation. There is evidence that stoquastic adiabatic computation is not universal for quantum computation and is in fact weaker than it. The transverse Ising model (the one that is used to get to the ground state of the NP-complete problem) is also stoquastic.
    In fact, we at IBM thought about the idea of implementing adiabatic QC with Josephson junction flux qubits and realized that, in order to implement universal QC, one has to do more experimentally to get a Hamiltonian that is non-stoquastic.
    See quant-ph/0611021 and quant-ph/0606140 for more details.
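
    (For concreteness, the transverse-field Ising Hamiltonian used in this style of adiabatic optimization has the schematic form

        $$ H(s) \;=\; -A(s)\sum_i \sigma^x_i \;+\; B(s)\Big(\sum_i h_i\,\sigma^z_i \;+\; \sum_{i<j} J_{ij}\,\sigma^z_i\sigma^z_j\Big), $$

    and in the computational (z) basis the only off-diagonal matrix elements come from the $-A(s)\,\sigma^x_i$ terms, which are $\le 0$ whenever $A(s)\ge 0$. That non-positivity of the off-diagonal entries is exactly the stoquastic condition described above.)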

  38. Scott Says:

    Thanks, Barbara!

  39. Cody Says:

    No doubt, Barbara, that was an excellent response. I am vaguely versed enough in this field to understand what is going on here. As I understand it, to give one example, Peter Shor’s algorithm for factoring them large prime composites involves a few different notions:

    1) quantum computers can generate possible solutions to a given logic problem in parallel. These solutions occur with a certain probability that can be predicted using the Hamiltonian types of equations within the constraints of the specific type of quantum effect taking place.

    2) The algorithm for factoring large primes involves a step that takes advantage of a formula that is wave-like in nature to generate numbers quite near the prime numbers we want to discover.

    3) The quantum computing aspect makes this step linear, tractable, polynomial. Whereas without the quantum computing aspect, this step is non-polynomial.

    4) That said, there is a chance it -won’t- generate the right type of number here. However, the test of whether this is one of the right numbers is polynomial.

    5) There is still another set of steps to confirm and generate the actual primes, but they are also polynomial in nature.

    As you can see, I’m not 100% clear on this, and please correct what I have wrong. I have a mere bachelors in CS, enough math in that to be dangerous, a smattering of upper-division physics courses, and I’ve only digested a few books on quantum computing, and papers without the effort required to really follow the math.

    But what I’d like to see, if you can (check with your IP lawyers), is why D-Wave’s system would not be able to process Shor’s Algorithm.

  40. Scott Says:

    large prime composites … The algorithm for factoring large primes …

    Oops! 🙂

    Your account of Shor’s algorithm is, frankly, a bit confused. If you want to learn more about it, try here, pages 66-79.
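
    (For what it’s worth, the overall skeleton of Shor’s algorithm is a classical reduction from factoring to order finding; the single step a quantum computer speeds up is finding the order r. A minimal sketch with that step replaced by exponentially slow classical brute force, purely to show the structure; it assumes N is an odd composite that is not a prime power:)

        # Skeleton of Shor's algorithm: classical reduction from factoring to
        # order finding.  Only find_order() would be done quantumly (via the
        # quantum Fourier transform); here it is brute-forced for illustration.
        import math
        import random

        def find_order(a, N):
            # *** the quantum step *** (exponentially slow classical stand-in)
            r, x = 1, a % N
            while x != 1:
                x = (x * a) % N
                r += 1
            return r

        def shor_factor(N):
            # assumes N is an odd composite and not a prime power
            while True:
                a = random.randrange(2, N)
                g = math.gcd(a, N)
                if g > 1:
                    return g               # lucky guess: a shares a factor with N
                r = find_order(a, N)
                if r % 2 == 1:
                    continue               # need an even order
                y = pow(a, r // 2, N)
                if y == N - 1:
                    continue               # a^(r/2) = -1 (mod N) is no help
                return math.gcd(y - 1, N)  # guaranteed nontrivial at this point

        print(shor_factor(15))             # prints 3 or 5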

  41. Barbara Terhal Says:

    Cody: I don’t know of any way to cast Shor’s factoring algorithm as a stoquastic adiabatic computation, it would be quite nice if this were possible (if one tries to accomplish this via the `standard’ circuit-to-Hamiltonian construction then the Fourier transform gives rise to non-stoquastic terms). One can cast Grover’s search algorithm as a stoquastic adiabatic computation, see quant-ph/0206003.

  42. Dave Chapman Says:

    Ah, yes:

    >If D-Wave fizzles out, will many journalists and policymakers then conclude that quantum computing is bunk?

    One sign that a given research field is starting to heat up is that we get public announcements which are, ahem, controversial. I was doing research in the field of amorphous silicon solar cells in 1980-1982, and we had our fair share of people who published things which were very, very hard to duplicate.

    Amorphous silicon solar cells eventually turned out to be for real, and the industry currently employs a few thousand people.

    I would say that D-Wave’s behavior is disturbing, but not unexpected. It’s too soon to say whether cryogenic quantum computers are real, or if they’re useful.

  43. Scott Says:

    Dave: Thanks for the historical perspective!

  44. vzn Says:

    hi scott, you write:

    ===
    vzn: I completely agree with you. When it comes to claims of having built scalable quantum computers, D-Wave leaves academics in the dust. It’s not that we don’t try — but in today’s fast-paced marketplace, trying to compete while weighed down by outdated concepts like “peer review,” “verifiability,” and “truth” is fighting with both hands tied behind your back.
    ===

    I guess that is … sarcasm? envy? what? dwave has a fundamentally different goal than academics. to some degree, academics primarily exist to write papers & develop theory. dwave exists to get a return on investment by selling a product. quite a contrast, eh? I am not saying they will succeed– I agree with comments that it’s very early to invest in qm computing with actual cash. I think qm computing will succeed, but it’s early.

    I set the bar low– if there is any qm behavior at all in their circuit, I would agree that they have produced a quantum computer.

    the computer theoreticians and qm theorists are saying, “we don’t know if they actually have a computer” even as it sits there and does nontrivial calculations, almost certainly [imho] from at least “some” quantum effect. the creator geordie rose has a PhD, I believe; he is not a flimflammer. the scientists are measuring this by scientific measures such as peer review, when arguably, that is not a relevant measurement.

    scott, I like your FAQ, but I suspect you will have to rapidly revise it as new info becomes available and consensus emerges that dwave has really harnessed a qm effect [maybe not the exact theoretical embodiment of the mythical “qubit” of theoretical papers] & computed with it.

  45. Scott Says:

    scott, I like your FAQ, but I suspect you will have to rapidly revise it as new info becomes available and consensus emerges that dwave has really harnessed a qm effect

    I guess we’ll tunnel across that barrier when we come to it! 🙂

  46. dotbenjamin Says:

    “I set the bar low– if there is any qm behavior at all in [the] circuit, I would agree that [it is] a quantum computer.”

    Really? I guess I’m using one right now, then. 😉

  47. vzn Says:

    funny, but no, obviously the massive billions of dollars spent on fabrication plants for existing chips are designed specifically to REMOVE, to the greatest extremes, quantum effects from the circuit. ie DIGITAL!=QM. zero or one only. every design goal of existing chip design is centered around getting that clean ‘0’ or clean ‘1’ without any noise. if DWave has a fundamentally non-digital element in their design, which is as they assert, and they are computing with it, it seems fair to classify that as quantum computing.

    but maybe scott would like to now write a paper– what constitutes “true” quantum computing? scott just told me that academics are interested in truth whereas I guess those outside of academia are not, wink

    ahem, I agree the computer is basically independently unverified at this point, however, and that it hasn’t been peer reviewed in the stuffy, centuries-old sense of elizabethan science societies, but isn’t 400 people at a demonstration in silicon valley, in some kind of sense, “peer review”?

    suppose this event really is heralded in the history books as basically the 1st commercial demonstration of quantum computing. scott, would you like them to quote your FAQ exactly as it’s written? :p

  48. Scott Says:

    OK, how’s this for a working definition?

    A true quantum computer is any device that exploits quantum physics to solve some computational problem (possibly an oracle problem) asymptotically faster than that problem can be solved by a classical computer. Here “asymptotically” means “as the device is scaled up in size in a straightforward way.”

    Sure, I’m an arrogant Elizabethan academic in my ivory tower, but I’ll accept that D-Wave has created a quantum computer in the above sense as soon as I see convincing evidence. Am I wrong to ask for evidence? Am I wrong to think the burden of proof lies with them, not with me? Wouldn’t it actually be disrespectful to D-Wave not to hold them to the same evidentiary standard to which I’d hold anyone else?

    Any time I say anything, obviously I run the risk that what I’ve said will look foolish in the light of history. What you neglected to mention is that the same applies to you.

  49. vzn Says:

    ===
    A true quantum computer is any device that exploits quantum physics to solve some computational problem (possibly an oracle problem) asymptotically faster than that problem can be solved by a classical computer. Here “asymptotically” means “as the device is scaled up in size in a straightforward way.”
    ===
    I disagree with the definition but feel that this is a highly commendable exercise. for the scientists who are “skeptical”, what exactly constitutes quantum computing? would they know one when they saw it? how do you measure or quantify whether it is doing “true” quantum computing? what is “true” quantum computing?

    maybe we can at least hammer out some kind of definition of a qm computer that everyone, skeptics and believers alike, can agree on. [ps I’m not kidding scott– I would LOVE to read a paper by you defining what truly constitutes quantum computing]

    my disagreement is that there is nothing intrinsic to qm computation that requires that it run faster than classical/conventional approaches. of course we DESIRE that, but isn’t that an unnecessary part of the definition?
    and what about “oracle”? isn’t that completely irrelevant to quantum computing? can you compute an oracle problem with a classical computer? no? then why should a qm computer solve it also?

    I will ponder this some more & revisit the site & try to hone my own defn of qm computing, because I admit, it is not a clearcut thing, it is slippery, esp given that apparently to date, none has been created by some standards.

    imho, chuang has already created a working qm computer on the academic side, the NMR one that factors 15, I would give him that historical credit.

  50. Scott Says:

    vzn: I’m glad we’re actually addressing some concrete issues now.

    The reason I asked for an asymptotic speedup in my definition is that that’s what D-Wave itself explicitly wants (and claims?) to achieve!

    Obviously, there are lots of reasons to study systems of interacting qubits that have nothing to do with computational complexity speedups: QKD, Bell inequalities, etc. But if not for the possibility of asymptotic speedups, I don’t know why anyone would study quantum computing.

    As for oracle problems: letting them count for an asymptotic speedup only makes my definition less stringent, not more! See, I’m just trying to help D-Wave out a bit by lowering the bar…

  51. Shtetl-Optimized » Blog Archive » Speaking truth to parallelism Says:

    […] Update (2/16): If you read the comments, Geordie Rose responds to me, and I respond to his response.  Also see the comments on my earlier D-Wave post for more entertaining and ongoing debate. […]

  52. Luxvibes Says:

    I filmed this event and put it up; should I post the links, etc.?

  53. jhl Says:

    It seems that the disconnect here has two components. First, the eternal problem of marketing hype muddying practical discussion. Second, the question of expectations as to what constitutes a meaningful breakthrough in quantum computing.

    As to the first, well, it’s annoying but it’s more or less the way of the world. This was obviously a publicity event aimed at generating VC, and the character of the presentation reflects that. That’s not to say that they aren’t doing real work, or won’t have a useful product. The marketing hype has little to do one way or another with whether or not this method will scale up to anything useful. I am somewhat optimistic, in that all the comments I’ve read from Rose are far less expansive and more grounded in reality.

    To the second, clearly this isn’t the Holy Grail of qm computation. However, that’s a long way from saying that they won’t produce something useful from this line of research. If two years from now they’ve got a working Quantum Coprocessor (look, I coined a new term 😉 ) that can make a meaningful (if not asymptotic) speedup in business or scientific computing tasks while costing less than simply building a bigger conventional cluster, then they will have succeeded. There are a few significant “ifs” there, and Rose has acknowledged them. It’ll either scale up well or it won’t, and they won’t know for sure until they get there.

    Time will tell.

  54. Scott Says:

    Luxvibes: Sure!

  55. Fox Diller Says:

    Here, I took some pictures of the yawnable but interesting demo they had in Vancouver. Sorry for the lighting, I didn’t want the flash to upset the comptuer.

  56. Fox Diller Says:

    http://www.flickr.com/photos/cstudio/sets/72157594538334165/

    Oops, I can’t spell Computer, and post a link.

  57. vzn Says:

    a few more thoughts: I agree this is not a full-fledged quantum computer, it’s clearly a TOY quantum computer, giving dwave the benefit of the doubt for the moment that it has a quantum effect/component. that would be obvious to anyone who is familiar with the field. but it is also implicit in their claims– they are talking about making the system available for experimentation by researchers.

    sure, it would be great if they emphasized to reporters that it’s a TOY quantum computer, but it also interferes with the marketing. look, I hate marketing hype as much as the next guy. but it’s “business as usual” for silicon valley. it’s more a product launch than a scientific event, but has scientific significance if the claims are fully confirmed by researchers from the field, which is still pending.

    let’s not change the bar however, or move the target. at this point in time, even a toy quantum computer is a theoretical and practical scientific breakthrough. it’s an “upset”: a commercial firm that seems to be farther along than the academic labs (from what I can tell as of this nanosec), eg NIST, who are spending millions of govt dollars to materialize QM computer designs that possibly emphasize (maybe now in retrospect, slightly misguidedly) precise control over qubits instead of “quick-and-dirty”.

    I continue to disagree with scott’s demand for inherent scalability of any quantum computer. we could possibly have limited-use qm computers that have scaling problems in the future. I agree scalability is an absolutely foremost concern for the designers, and the whole issue of practicality of qm computing hinges on scalability. is there such a thing as an “impractical” computer? well yeah, when computers were 1st invented, they were unwieldy and largely impractical, but they were still called COMPUTERS. qm computing will follow exactly the same format! no surprise there.

  58. vzn Says:

    ps, re quantum computing, I think the word to emphasize informally is “spintronics” which has been bounced around the media a bit. something that seems to take advantage of electron or molecular spin rather than total electron quantity. maybe there are some qm effects that are not exactly “spintronics” but it’s a useful word for the media. maybe there would be a way to define “spintronics” more exactly.

    as I was thinking about it, there are some definitions of qm computing that could possibly be so broad as to include analog computing. I’d say a good definition would exclude analog computing somehow.

  59. vzn Says:

    more thoughts: arguably, the vacuum tube is not a scalable computer component, esp in retrospect & compared with silicon, yet it was the basis for the birth of modern computing. arguably, all computer components/technologies start out as unscalable. designs become “possible” before they become “practical” and “practical” before “scalable”. does dwave have a SCALABLE qm technology? yes, they are claiming that. I would say that is as-yet undemonstrated & remains to be seen.

    the fact that they used conventional chip fabrication techniques suggests/supports that their design is indeed scalable. moreover the planar NSEW connection topology [seen in photos of the chip] is definitely scalable in theory.

    this is a shot of the NSEW (north south east west) topology. I have not seen this discussed in a qm paper, does anyone have a reference? it’s quite ingenious imho.
    http://dwave.wordpress.com/files/2007/02/20070110_d-wave_orion_processor.JPG

    notice they are apparently NOT trying to move qubits around the chip (adiabatic approach), which is the NIST approach, and possibly far more complex & requiring much better control.

    in contrast, this is the NIST paper on ion trapping/shuttling which is arguably the leading/foremost/farthest-along academic approach to qm computing, & which (I understand) has been funded with millions of dollars of govt research money thru NIST. precise ion trapping is demonstrated with long precedent, but (as I understand) ion shuttling is still largely unimplemented so far.

    http://tf.nist.gov/general/pdf/1566.pdf

  60. Luxvibes Says:

    sorry about the delay. I have even more footage and an ad hoc interview with the CEO, Geordie, and 2 other engineers.

    http://video.google.com/videoplay?docid=8033280380582597891&hl=en

    and

    http://video.google.com/videoplay?docid=-6877849178932916346&hl=en

  61. ovolko Says:

    “quantum computers couldn’t solve NP-complete problems in polynomial time”

    It was quite surprising to me. Is there any proof of that?

  62. vzn Says:

    this site does a small disservice in attempting to set the record straight, actually muddying it. it is an OPEN PROBLEM in complexity theory exactly to what degree quantum computers can solve problems faster than classical approaches. it is not fully proven either way.

    it is CONJECTURED [by most complexity theorists/experts] that qm computing is no more powerful than classical computing w.r.t. the P=?NP problem. scott’s writing on this site says “conjectured” in one place but loses the distinction in two places, at the top of the page and in talking about dwave systems. the statement,
    ==
    Quantum computers cannot solve NP-complete
    problems in polynomial time.
    ==
    is an assertion that is not yet proven or disproven. ie if it were false [& qm computers could solve NP problems in P time], that would be very surprising, but it would not directly contradict any current knowledge.

  63. Scott Says:

    Is there any proof of that?

    No, but see my next post…

  64. roland Says:

    a shot in the dark: next post’s title will be

    ten reasons to believe: NP not in BQP

  65. Richard Gill Says:

    “I set the bar low– if there is any qm behavior at all in [the] circuit, I would agree that [it is] a quantum computer.”

    I want to know if there is Quantum Entanglement between ALL the 16 qubits and if the answer to the computation would have been wrong without it. Then I would be impressed. Then it would be an engineered quantum computer, in my opinion; I would even say: the first one ever… (factoring 15 with NMR doesn’t count since this was about 10^23 quantum computers, each on their own completely useless. And I’m not much interested in less than 10 qubits anyway). (And I don’t count the universe as an engineered quantum computer)