Collaborative Refutation

At least eight people—journalists, colleagues, blog readers—have now asked my opinion of a recent paper by Ross Anderson and Robert Brady, entitled “Why quantum computing is hard and quantum cryptography is not provably secure.”  Where to begin?

  1. Based on a “soliton” model—which seems to be almost a local-hidden-variable model, though not quite—the paper advances the prediction that quantum computation will never be possible with more than 3 or 4 qubits.  (Where “3 or 4” are not just convenient small numbers, but actually arise from the geometry of spacetime.)  I wonder: before uploading their paper, did the authors check whether their prediction was, y’know, already falsified?  How do they reconcile their proposal with (for example) the 8-qubit entanglement observed by Haffner et al. with trapped ions—not to mention the famous experiments with superconducting Josephson junctions, buckyballs, and so forth that have demonstrated the reality of entanglement among many thousands of particles (albeit not yet in a “controllable” form)?
  2. The paper also predicts that, even with 3 qubits, general entanglement will only be possible if the qubits are not collinear; with 4 qubits, general entanglement will only be possible if the qubits are not coplanar.  Are the authors aware that, in ion-trap experiments (like those of David Wineland that recently won the Nobel Prize), the qubits generally are arranged in a line?  See for example this paper, whose abstract reads in part: “Here we experimentally demonstrate quantum error correction using three beryllium atomic-ion qubits confined to a linear, multi-zone trap.”
  3. Finally, the paper argues that, because entanglement might not be a real phenomenon, the security of quantum key distribution remains an open question.  Again: are the authors aware that the most practical QKD schemes, like BB84, never use entanglement at all?  And that therefore, even if the paper’s quasi-local-hidden-variable model were viable (which it’s not), it still wouldn’t justify the claim in the title that “…quantum cryptography is not provably secure”?

Yeah, this paper is pretty uninformed even by the usual standards of attempted quantum-mechanics-overthrowings.  Let me now offer three more general thoughts.

First thought: it’s ironic that I’m increasingly seeing eye-to-eye with Lubos Motl—who once called me “the most corrupt piece of moral trash”—in his rantings against the world’s “anti-quantum-mechanical crackpots.”  Let me put it this way: David Deutsch, Chris Fuchs, Sheldon Goldstein, and Roger Penrose hold views about quantum mechanics that are diametrically opposed to one another’s.  Yet each of these very different physicists has earned my admiration, because each, in his own way, is trying to listen to whatever quantum mechanics is saying about how the world works.  However, there are also people all of whose “thoughts” about quantum mechanics are motivated by the urge to plug their ears and shut out whatever quantum mechanics is saying—to show how whatever naïve ideas they had before learning QM might still be right, and how all the experiments of the last century that seem to indicate otherwise might still be wiggled around.  Like monarchists or segregationists, these people have been consistently on the losing side of history for generations—so it’s surprising, to someone like me, that they continue to show up totally unfazed and itching for battle, like the knight from Monty Python and the Holy Grail with his arms and legs hacked off.  (“Bell’s Theorem?  Just a flesh wound!”)

Like any physical theory, of course quantum mechanics might someday be superseded by an even deeper theory.  If and when that happens, it will rank alongside Newton’s apple, Einstein’s elevator, and the discovery of QM itself among the great turning points in the history of physics.  But it’s crucial to understand that that’s not what we’re discussing here.  Here we’re discussing the possibility that quantum mechanics is wrong, not for some deep reason, but for a trivial reason that was somehow overlooked since the 1920s—that there’s some simple classical model that would make everyone exclaim,  “oh!  well, I guess that whole framework of exponentially-large Hilbert space was completely superfluous, then.  why did anyone ever imagine it was needed?”  And the probability of that is comparable to the probability that the Moon is made of Gruyère.  If you’re a Bayesian with a sane prior, stuff like this shouldn’t even register.

Second thought: this paper illustrates, better than any other I’ve seen, how despite appearances, the “quantum computing will clearly be practical in a few years!” camp and the “quantum computing is clearly impossible!” camp aren’t actually opposed to each other.  Instead, they’re simply two sides of the same coin.  Anderson and Brady start from the “puzzling” fact that, despite what they call “the investment of tremendous funding resources worldwide” over the last decade, quantum computing still hasn’t progressed beyond a few qubits, and propose to overthrow quantum mechanics as a way to resolve the puzzle.  To me, this is like arguing in 1835 that, since Charles Babbage still hasn’t succeeded in building a scalable classical computer, we need to rewrite the laws of physics in order to explain why classical computing is impossible.  I.e., it’s a form of argument that only makes sense if you’ve adopted what one might call the “Hype Axiom”: the axiom that any technology that’s possible sometime in the future, must in fact be possible within the next few years.

Third thought: it’s worth noting that, if (for example) you found Michel Dyakonov’s arguments against QC (discussed on this blog a month ago) persuasive, then you shouldn’t find Anderson’s and Brady’s persuasive, and vice versa.  Dyakonov agrees that scalable QC will never work, but he ridicules the idea that we’d need to modify quantum mechanics itself to explain why.  Anderson and Brady, by contrast, are so eager to modify QM that they don’t mind contradicting a mountain of existing experiments.  Indeed, the question occurs to me of whether there’s any pair of quantum computing skeptics whose arguments for why QC can’t work are compatible with one another’s.  (Maybe Alicki and Dyakonov?)

But enough of this.  The truth is that, at this point in my life, I find it infinitely more interesting to watch my two-week-old daughter Lily, as she discovers the wonderful world of shapes, colors, sounds, and smells, than to watch Anderson and Brady, as they fail to discover the wonderful world of many-particle quantum mechanics.  So I’m issuing an appeal to the quantum computing and information community.  Please, in the comments section of this post, explain what you thought of the Anderson-Brady paper.  Don’t leave me alone to respond to this stuff; I don’t have the time or the energy.  If you get quantum probability, then stand up and be measured!

179 Responses to “Collaborative Refutation”

  1. Mateus Araújo Says:

    I think the issue is that most of us don’t have eight people nagging us to comment on obviously wrong papers, so we just do what you probably want to do: ignore them.

  2. Ashley Says:

    I’m not sure if I have anything further than the arguments you’ve made already to say about the main message of the paper. But here’s a comment on a more trivial point: at the end of Section 1, the authors claim that “Researchers are now starting to wonder whether geometry affects entanglement and coherence; the first workshop on this topic was held last year”.

    I attended the workshop they cite, which was emphatically not about this issue. As can be seen from the schedule, the workshop was concerned with mathematical questions to do with the geometry of quantum states in the standard Hilbert space picture of quantum mechanics, rather than any physical issues concerning whether the underlying qubits were arranged in a linear trap, etc.

  3. Scott Says:

    Mateus and Ashley: Thanks!!

  4. Douglas Knight Says:

    it’s ironic that I’m increasingly seeing eye-to-eye with Lubos Motl—who once called me “the most corrupt piece of moral trash”

    From Lubos Motl, that’s high praise. Are there even 100 living people he has described in such positive terms? So it’s not at all ironic that that would bias you to agree with him.

  5. Alex Says:

    Just ignore them. I have to admit I have great fun watching you debunk Joy Christians and the likes, but in the end, it is a futile and time-consuming task. It’s best just to ignore them and enjoy our little treasures. They’re so much more interesting!

  6. Anon J. Mouse Says:

    Unfortunately, I think it’s an example of this phenomenon:

    http://www.smbc-comics.com/index.php?db=comics&id=2556

    Ross Anderson is quite well known in computer security, mainly for his work on banking security. In fact, if you hadn’t written this piece I probably would have been the ninth person to ask you about it — it was clearly wrong, but from someone without a prior history of crankiness.

    It’s always disappointing when someone you respect does something silly like this.

  7. Scott Says:

    Anon J. Mouse #6: Thanks! I intentionally didn’t look up who Anderson and Brady were before writing the post, since I didn’t want that to bias me. But, yes, the amount of respectful attention this obviously-wrong paper seemed to be getting did surprise me.

    Incidentally, I just found Anderson’s blog, which includes a comment by Jonathan Oppenheim making substantially the same points as in my post.

  8. Greg Kuperberg Says:

    There is another point to make about scientific revolutions such as Newton’s laws or quantum mechanics. Yes, they can be supplanted or revised, but it’s not as if there is any turning back. The new revolution is almost always even less palatable to the old guard than the old one.

    (OTOH I do not understand the statement that BB84 does not use entanglement. I guess that there are non-entanglement versions, but it looks like the original BB84 does use entangled Bell pairs.)

  9. Scott Says:

    Greg #8: No, BB84 just requires sending individual, unentangled qubits in one of the four states |0⟩, |1⟩, |+⟩, or |-⟩. (Indeed, the lack of any need for entanglement is one of the main reasons why the protocol is already practical today.) Interestingly, I understand that many of the security proofs for BB84 introduce entanglement as a formal convenience, but the entanglement never appears in the actual protocol itself.
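
    For concreteness, here’s a toy classical simulation of the prepare-and-measure structure (just the sifting logic — no eavesdropper, no channel noise, no error correction or privacy amplification, and obviously not a security argument). The only quantum fact it uses is that measuring a qubit in the wrong basis gives a uniformly random outcome; nothing is ever entangled with anything else:

        import random

        def bb84_sift(n=1000, seed=0):
            rng = random.Random(seed)
            alice_bits  = [rng.randint(0, 1) for _ in range(n)]
            alice_bases = [rng.choice('ZX') for _ in range(n)]   # Z basis {|0>,|1>}, X basis {|+>,|->}
            bob_bases   = [rng.choice('ZX') for _ in range(n)]

            bob_bits = []
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases):
                # Same basis: Bob recovers Alice's bit.  Wrong basis: the outcome is uniformly random.
                bob_bits.append(bit if ab == bb else rng.randint(0, 1))

            # Sifting: keep only the positions where the bases happened to agree (about half of them).
            key_a = [a for a, x, y in zip(alice_bits, alice_bases, bob_bases) if x == y]
            key_b = [b for b, x, y in zip(bob_bits,  alice_bases, bob_bases) if x == y]
            return key_a, key_b

        key_a, key_b = bb84_sift()
        print(len(key_a), "sifted bits; keys agree:", key_a == key_b)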

  10. Greg Kuperberg Says:

    How about that! I skimmed the link and totally misread it. Of course you’re right.

  11. Yury Says:

    “Like monarchists or segregationists, these people have been consistently on the losing side of history for generations…”

    Unfortunately, I am not sure that monarchists or segregationists have been consistently on the losing side… At least, those who are against democracy and human rights often win — just look at Ancient Greece, Ancient Rome, independent Italian City-States, many democratic European governments in the first half of the 20th century (which lost to dictatorships), or modern Russia (which is now much less democratic than it used to be 15 years ago). Also, I’m afraid that science is now losing to pseudoscience. Most people in the world believe that “evolution is just a [wrong] theory,” that the “Big Bang never happened,” that homeopathy works, and that biofields exist.

    It is surprising, however, that this paper attracted that much attention. I try to ignore articles that are based on clearly wrong assumptions like Quantum Mechanics or General Relativity is wrong, evolution never happened etc. Why do people pay attention to such papers 🙁 ?

    Thanks for an interesting post!

  12. Scott Says:

    Yury #11: Good point! I should have clarified that monarchists, segregationists, and anti-QM folks have been consistently on the intellectually losing side. But they can win plenty of shallow “victories,” just as con-men or street thugs can.

  13. Luboš Motl Says:

    LOL, Scott. I am sure your thinking engines are good enough to see eye-to-eye with me.

    By remembering my unflattering quote, you’re also showing some sense of history and long-term memory.

    http://motls.blogspot.com/2006/12/academia-and-scientific-integrity.html?m=1

    More than 6 years ago, the unflattering words were revealed because of your decision to write anything (including any lie) about quantum gravity and string theory that someone pays you for.

    I don’t know whether this decision to be bought still holds. If it does, my assessment of course still holds as well whether or not we see eye-to-eye with one another.

  14. Scott Says:

    Lubos says he’s sure my “thinking engines” are good enough to see eye-to-eye with him! Callooh! Callay! This might be the single greatest compliment I’ve ever received.

    And Lubos, in return for your generous compliment, I have some good news. As a result of major life changes—getting married, having a baby, etc.—I have abandoned my previous materialistic, money-grubbing ways. I’m now strictly a man of principle. And as such, no amount of money could ever induce me to abandon my total, principled commitment to Loop Quantum Gravity.

    OK, OK, I’m kidding about the last part. In fact, I have a much better appreciation now for the achievements of string theory than I did back in 2006, partly due to a meeting in Florence where Brian Greene spent 4 hours explaining them to me and others. I came away genuinely impressed, convinced that string theory and especially AdS/CFT are unequivocally a step forward in our understanding of the universe, even though we have a great deal more to learn. I’m not ready to say that alternative ideas like LQG are garbage and have nothing worthwhile to contribute, let alone that global warming is a sham, but maybe Lubosification is a process that will happen to me one step at a time. 🙂

  15. Greg Kuperberg Says:

    I suspect that LQG isn’t absolute garbage either, and I also cannot compare it to string theory on any direct authority either. But speaking as an external observer, it really does look like LQG is (or maybe was) a Nader-like quest to compete with string theory. I.e., a quest in which simply being invited into the debate is the first major goal, even if it has no good consequence or even negative consequences for the actual outcome of the debate.

    Of course, also speaking as an external observer, global warming looks like anything but a sham, in fact global warming denial looks like a sham.

  16. Scott Says:

    Greg #15: My understanding of string theory is that it’s “what you’d inevitably come up with” if you took quantum field theory and perturbation theory as your fundamental starting points, then tried to tame divergences by replacing the point particles by extended objects. When you do that, you get some wonderful things that weren’t explicitly put in (e.g., the graviton), but also various aspects that seem to require ugly kludges to make them consistent with observed reality. Meanwhile, my understanding of LQG is that it’s “what you’d inevitably come up with” if you took GR and its demand for background-independence as your fundamental starting points, and tried to create a quantum theory satisfying that demand while leaving aside the details of particle physics. When you do that, you get something wonderful that wasn’t explicitly put in (spacetime discreteness), but also various aspects that seem to require ugly kludges to make them consistent with observed reality.

    If I’m right, then despite their incompatibility both with each other and (probably) with the ultimate truth, neither string theory nor LQG is nearly as “arbitrary” as they might seem to an outsider. If we had to pick one of the two that’s had more technical successes, that would be string theory, but that doesn’t mean LQG has had no technical successes.

  17. Greg Kuperberg Says:

    Scott – Except with one colossal difference: One is conjectured to be mathematically viable, and the other one is conjectured not to be.

  18. Greg Kuperberg Says:

    To be more precise, superstring theory satisfies a formidable array of mathematically rigorous consistency checks. It is either entirely or very nearly rigorously defined as a perturbative model of quantum gravity (in 9+1 dimensions). There are a ton of technical mathematical successes.

    Whereas with LQG, I’m not sure that there really are any technical successes. Some consistency checks have been claimed/published, but there are arguments that they are all superficial. I have also heard that renormalization theory speaks against the viability of a macroscopic limit of LQG, although I don’t know a whole lot about it.

  19. Ajit R. Jadhav Says:

    Yesterday, when I first saw this post, not a single comment had yet made an appearance. The only thing I could think of, by way of a reply, was the following old one which I happened to remember, wanted to write down as my answer, but, somehow, didn’t (I decided to wait for other comments to appear, first). Anyway, the joke goes:

    Masochist to sadist: Oh, please, please, hit me!
    Sadist to masochist: No, I won’t!

    Ok, I will try to read that paper. … Yeah, in a way, I have known of that dancing droplet thingie since the time it came out, and wasn’t impressed much (if at all) by it. … Anyway, Scott’s point #2 and #3 seem to be right on.

    Ajit
    [E&OE]

  20. Mkatkov Says:

    Scott: …
    Lev Tolstoy: All happy families are similar, whereas every unhappy family is unhappy in its own way.

    Charles Babbage: A different kind of physics was discovered (and was actually necessary) before classical computers became practical.

  21. Quantum Cowboy Says:

    I have to admit, I was pretty shocked to see this paper, and it makes you wonder about this guy. I get crackpots occasionally emailing me with their theory of how Bell’s theorem is wrong, or relativity is wrong, or quantum computation/crypto is wrong. But rarely do you find a claim of all three! Plus the mention of Bohmian mechanics and black holes in a quantum paper pretty much ticks five of the seven crackpot boxes. The only thing that’s missing is that it be written in several different fonts, and start with an introduction about how the author is a misunderstood genius and that mainstream scientists are too conservative to understand the truth of his theory, but that just like Einstein, he’ll be proven right in the end.

    People like Ron Rivest were hyping up this thing. Amazing!

  22. Henning Dekant Says:

    Scott #16: a very nice summation of the String and LQG efforts. I will probably have to quote this at some point.

    As to the paper, I believe there is something intrinsically sinister going on when people try to approach QM from a topological angle. Apparently it makes people get all wobbly in the head. How else to explain Joy Christian and now these apparently respectable fellows, pushing these rather strange ideas?

  23. Ajit R. Jadhav Says:

    Oops. In my above reply, please take it as #1 and #2 (of Scott’s points). Those two seem stronger points than #3 to me, with #2 being the strongest. (Above, I just got the point numbers wrong). … Sorry about that.

    … BTW, whenever any new view or theory to resolve the quantum riddles is put forth, I invariably end up wondering how a simple computational model simulating the essential physics of it might look like. Ditto here.

    The simulation being presented here, if it can be called that, is physical. That, in part, is a problem: with this kind of a model, the authors don’t have to spell out in detail how the boundary/initial conditions are to be handled. Writing a C++ program would force them to be explicit about all such details.

    If such a program is presented, then our (my!) task will become that much easier: the only remaining task will be to examine the points on which the conventional (say the Copenhagen) view and the new view differ.

    By its nature, a C++ simulation will have to capture the new concepts (objects/classes), and make them work in a new way (methods/algorithms). This nature of the computational simulation forces one to be explicit about the theoretical content.

    In contrast, what is presented as a physical simulation need not be so directly concerned with presenting the new theoretical ideas in their completeness; it could get away with being rather suggestive.

    Ajit
    [E&OE]

  24. Luboš Motl Says:

    To be sure that you don’t miss my comments on your Lubošification, see

    http://motls.blogspot.com/2013/02/lubosification-of-scott-aaronson-is.html?m=1

  25. Tyler Says:

    Although I still agree with the issues raised here on the Ross Anderson and Robert Brady paper (i.e., I think quantum computing and quantum crypto will be viable in the future), I don’t think their argument requires that BB84 use entanglement.

    If I understand correctly (I only did a quick read, so that’s a big “if”), they are against even the Bell inequalities. This would of course allow for a local hidden variable approach which would destroy quantum information, as it would reduce it to classical information. If this were the case, then it may be possible to use some crazy measuring device to measure the signals in the BB84 protocol in any basis, without disturbing the original signal.

    Now, I am pretty sure that the Bell inequalities do hold true, and so quantum crypto is still secure. I don’t really understand how they are claiming there is a flaw in the logic of the Bell inequalities. I just thought in the interest of fairness that point should be brought up.

  26. Scott Says:

    Tyler #25: Fine, but that would require a completely different argument than the one they actually made!

    If they said: we think QM is completely wrong, ergo Eve can violate the No-Cloning Theorem, ergo she can break BB84, then their starting premise might still be bollocks, but at least steps 2 and 3 would follow logically.

    Instead they said: “As the experiments done to test the Bell inequalities have failed to rule out a classical hidden-variable theory of quantum mechanics such as the soliton model, the security case for quantum cryptography based on EPR pairs has not been made.” Nowhere in the article do they indicate any awareness that the main QKD schemes are not based on EPR pairs, and indeed the paper title says “…quantum cryptography is not provably secure,” with no mention of EPR pairs. So this seemed like a case of straightforward confusion.
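
    (For readers who haven’t seen why cloning fails, here is the one-line calculation: if a single unitary U cloned both basis states, U|0⟩|0⟩ = |0⟩|0⟩ and U|1⟩|0⟩ = |1⟩|1⟩, then by linearity it would have to send |+⟩|0⟩ = (|0⟩+|1⟩)|0⟩/√2 to the entangled state (|0⟩|0⟩+|1⟩|1⟩)/√2, which is not the desired clone |+⟩|+⟩ = (|00⟩+|01⟩+|10⟩+|11⟩)/2. That calculation is the No-Cloning Theorem, and it’s the intuition behind BB84’s security — no EPR pairs required.)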

  27. David Speyer Says:

    What’s striking to me, as a non-physicist, is that they never discuss why the Bell experiments obtained the correlations predicted by orthodox quantum mechanics. They explain a loophole by which a local theory could be consistent with the experimental results. But (on a quick skim) I see no recognition of how amazing it is that this local theory manages to recreate the precise predictions of the nonlocal theory.

  28. Robert Brady Says:

    I am afraid Haffner doesn’t report 8 qubits that could be used in quantum computation. Most of the ions are in the unexcited (ground) state, but the usual procedure in many-body theory is to subtract out the ground state. This can also be understood in terms of the modes of oscillation of multiple coupled oscillators, referred to in the paper by Ross and myself, where most or all of the oscillators act coherently as a single entity corresponding to the ground state.

    A similar effect occurs in superconductors, where I did theoretical and experimental work as a fellow of Trinity College alongside Brian Josephson. The phase difference across a Josephson junction between two bulk superconductors can represent at most one qubit. This is true even though the superconductor contains very many entangled electrons. Brian’s original work describing this phase transition is available here. You can of course analyse the same system as if it contained a large number of qubits, one for each pair of electrons, as long as you remember they are not independent. But nobody considers counting them individually, as if there were millions of qubits for quantum computation, because they are all correlated with a single phase and act as a single entity.

  29. Scott Says:

    Robert Brady #28: Your response reminds me of the creationists who state categorically that “there are no missing links.” Then when people say, “What about Australopithecus? What about Homo erectus? etc.,” they reply, “well, that one’s really a human. That one’s really an ape. Neither one is transitional between the two.” Since they get to make up the rules as they go along, there’s no way they can ever be proven wrong.

    In a similar way, you claim that there’s no evidence for entangled states of more than 3 or 4 qubits. Then people immediately respond, what about this experiment? What about that one? In each case, you have some a posteriori reason why that experiment doesn’t count: yes there are hundreds of particles in a cat state, but they’re not behaving independently so they don’t count as “qubits.” And what about, say, the “cluster-like” quantum states discussed in this paper, which involve many-particle entanglement not in a collective cat-like degree of freedom? I assume you have some other reason why those don’t count.

    What you need, and don’t have (as far as I’ve seen), is a theory that would explain a priori what sorts of many-particle entanglement would count.

  30. Ross Anderson Says:

    Scott, when we exchanged email privately before you made this blog post, I asked you to point me to any experimental paper that challenged the soliton theory of the electron. Rather than continuing that discussion in a civilised fashion, you chose to post this rant instead, inviting your followers to be abusive. But as Robert points out, the paper you cite does not report eight qubits at all. I’ve had similar conversations by email with other scientists who’ve privately pointed out other papers claiming multiple qubits; but most have already been attacked by other workers in the field, as here.

    I’m saddened that your response to Robert’s post was simply abusive. Anyone who’s interested in a substantive discussion of this issue may find it on Light Blue Touchpaper, and more generally at the Emergent Quantum Mechanics workshop.

  31. Joe Fitzsimons Says:

    I’m a little confused about the focus on that particular eight qubit experiment. There are a number of eight qubit experiments out there, particularly in quantum optics (see for example Jian-Wei Pan’s experiments).

  32. Scott Says:

    Joe #31: Thanks! I mentioned the Haffner et al. experiment only because it was the first >4-qubit experiment not in liquid NMR that popped into my head. You’re right that I also could’ve mentioned optical experiments, but I’m pretty sure it wouldn’t have mattered. Anderson and Brady are playing the game where first someone else proposes a many-qubit experiment—any such experiment—then they think up a creative reason why it doesn’t contradict their model. They’ve left the realm of Popperian falsifiability.

  33. Luboš Motl Says:

    Dear Scott,

    I don’t believe it’s possible for a well-defined model to avoid falsification in this way. At most, they may have a vague template of a model superimposed on tons of wishful thinking that looks compatible with their not-too-comprehensive consistency checks.

    However, it’s totally obvious that if one proposes any particular model that is “qualitatively inequivalent” to proper quantum mechanics, it has to contradict some experiments that are well-known and easy to do. In fact, one may eliminate whole, almost complete classes of these QM-inequivalent models.

    For example, if they admit that all possible states of 2 electrons are described correctly by quantum mechanics (including their entanglement), and one imposes any kind of mild locality or Lorentz symmetry which is tested in many ways as well, it’s clear that the laws for 2 electrons may only be extended in a unique way to an arbitrary number of electrons simply because any subset of 2 electrons in the larger set has to agree with quantum mechanics.

    It’s also silly to say that we can’t study any states with more than 4 entangled qubits in Nature. Take any molecule such as benzene – I could pick pretty much any other molecule but I want to be specific. It has a hexagon of carbon atoms. Each carbon atom has 4 valence electrons to share; for each carbon atom, 1 of these 4 is attached to a hydrogen atom – organized to a hexagon radially attached to the carbon hexagon.

    This leaves 3 free valence electrons for each carbon atom. Each carbon atom is connected to its two neighboring carbon atoms – the bond is “double” to one neighbor and “single” to the other. In total, we have a configuration of at least 6 qubits here. The low-lying states of benzene distinguish two states, worth 1 qubit – because the double and single bonds have to alternate and there are just two options. Indeed, the sum and difference of these two alternating arrangements give two energy levels of the molecule that we may test.
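
    To be concrete, the textbook two-level toy model this corresponds to (a standard illustration, nothing specific to the sonon paper) looks like this: call the two alternating bond arrangements |K1⟩ and |K2⟩, and let t > 0 be the tunneling amplitude between them. In this basis the effective Hamiltonian is H = E_0 − t σ_x, whose eigenstates are the superpositions (|K1⟩ ± |K2⟩)/√2 with energies E_0 ∓ t. The observable splitting 2t cannot even be written down without allowing superpositions of the two classical bond patterns.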

    There’s one qubit visible at low energies except that any model that denies that these are built from a larger number of electrons/qubits that may be organized/entangled in many other ways a priori will contradict the locality to the extent that it will really be incompatible with the atomic theory of matter itself! 😉 It’s nothing else than the atomic theory that implies that the properties of molecules are constructed of – and subsets of – properties of the collections of atoms from which the molecules are built.

    There just can’t possibly be any model that would get the right low-energy levels of the benzene molecule while it would deny the a priori existence of many electrons storing an arbitrary number of qubits of quantum information a priori. If someone thinks that their model can describe such things, he should write a paper about the description of the molecule in a “completely different way” than QM.

    It’s not just the benzene molecule. It’s any molecule. It’s any system with many particles. Condensed matter physics gives a whole new perspective on this issue. The people who don’t use the regular multiparticle quantum mechanics and/or quantum field theory (or its upgrade, string theory, and all these three frameworks are really equivalent when it comes to the analysis of these low-energy states) are really abandoning modern physics. To claim that they have an alternative, they have to start from scratch and they have to offer their – totally different – explanation for every single observation in modern science.

    It won’t be enough to describe one particle or one force because they are apparently messing with the very way how composite systems and interactions are constructed out of the smaller ones, too. So they have to separately check whether larger, composite systems according to their theory behave in agreement with experiments, too. And of course that the answer is a resounding No.

    What Dr Anderson and Dr Brady do is typical pseudoscience in which one decides that the right theory must be destroyed and declared wrong and they construct an alternative except that they set much lower standards for the alternative and don’t even try to check whether the alternative is capable of describing at least 1% of the things that the right theory is able to describe, at least the elementary things.

    Cheers
    LM

  34. Greg Kuperberg Says:

    Since Anderson requests a serious discussion instead of ridicule, here is one: Reference [36] in Anderson-Brady reports a violation of the Bell-CHSH inequality under quite strict conditions, and the rebuttal given to that finding is incomprehensible.
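
    For anyone who wants to see the gap spelled out rather than just asserted, here is a toy check (an illustration, nothing to do with the specifics of [36]): brute-forcing over every deterministic local strategy shows the CHSH combination can never exceed 2, while the standard quantum prediction for a singlet pair at the usual measurement angles is 2√2 ≈ 2.83, which is what the experiments keep observing.

        from itertools import product
        from math import cos, pi, sqrt

        # Deterministic local strategies: Alice outputs a0 or a1 (each in {-1,+1}) depending on
        # her setting, Bob outputs b0 or b1.  The CHSH combination is bounded by 2 for all of them.
        classical_max = max(abs(a0*b0 - a0*b1 + a1*b0 + a1*b1)
                            for a0, a1, b0, b1 in product([-1, 1], repeat=4))

        def E(a, b):                     # singlet-state correlation at analyzer angles a, b
            return -cos(a - b)

        A0, A1, B0, B1 = 0, pi/2, pi/4, 3*pi/4
        quantum_value = abs(E(A0, B0) - E(A0, B1) + E(A1, B0) + E(A1, B1))

        print("classical bound:", classical_max)              # 2
        print("quantum value:  ", quantum_value, 2*sqrt(2))   # 2.828... in both columns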

  35. Slava Kashcheyevs Says:

    Brady seems to be simply misguided about the basics of quantum physics. At behest of my quantum computation colleague I’ve posted a bit more detailed critique:

    http://seeking-mind.blogspot.com/2013/02/no-threat-to-quantum-cryptography-at.html

  36. Andris Says:

    I asked a physicist colleague (Vyacheslavs Kashcheyevs – the best quantum theorist in our country) for his opinion on this paper. Here are his comments:
    http://seeking-mind.blogspot.com/2013/02/no-threat-to-quantum-cryptography-at.html
    The conclusion is the same as Scott’s, but Vyacheslavs also points out that Anderson & Brady’s ideas are refuted not only by quantum computing experiments with more than 3 qubits, but also by a large number of quantum physical phenomena that were tested long before quantum computing was invented.

  37. John Says:

    It would be much cooler to promote good papers on your blog, and let the bad ones molder. Nobody cares about this paper. (And please, leave poor Lubos alone!)

  38. Scott Says:

    John #37: When I have promoted good papers on this blog, people have accused me of unseemly “hype.” That is, when they commented at all — I didn’t get nearly as many entertaining reactions as when I’ve ripped into bad papers!

    But, OK, point taken.

  39. Scott Says:

    Lubos #33: I agree that I was being overly generous to Anderson and Brady when I used the word “model” to describe their ideas. That they don’t have a well-defined model is precisely what lets them wiggle free of whatever experiments people bring to their attention.

    On the other hand, I respectfully disagree with the following statement of yours:

      it’s totally obvious that if one proposes any particular model that is “qualitatively inequivalent” to proper quantum mechanics, it has to contradict some experiments that are well-known and easy to do.

    No, it’s logically possible that someday, someone will invent a theory that agrees with QM on all experiments that have already been done or can be easily done with current technology, but that disagrees with QM on the states that arise (for example) in large-scale implementations of Shor’s factoring algorithm. I have no idea what such a theory would look like—and even if someone constructed one, I’d give long odds against its being true, unless it was somehow even more elegant and mathematically compelling than QM. And certainly, one would need to work thousands of times harder than Anderson and Brady are working to construct such a theory that gave sensible results on current experiments. But I do regard the problem of ruling out such theories—what I once called Sure/Shor separators—as a wonderful scientific project, and indeed, as one of the main intellectual reasons to try to build scalable quantum computers. It’s sort of like proving P≠NP or the Riemann Hypothesis—other examples where “reasonable people already agree on the answer,” and yet we stand to learn a great deal from the journey.

  40. Bram Cohen Says:

    Does this paper deny that Bell inequality violations happen in the real world? Claiming that there isn’t any experimental evidence for that is… a bit of a stretch.

  41. Robert Brady Says:

    Scott #29. The test is very specific. Can you do a quantum computation representing numbers greater than 16?

    For example, the factors of 21 are 3 and 7, both of which are less than this limit, and so this computation can be performed, unlike a factorisation of, say, the product of two primes greater than 100.

    The experiments discussed here do not attempt to do any computation. Instead they report individual correlated or entangled items, which you call ‘qubits’ as if they could be used in a computation. In my field of many-body theory there are millions upon millions of similar entangled entities, for example the entangled spins in a ferromagnet or the electron pairs in a superconductor. They are not called qubits because you cannot do a quantum computation with them. To the contrary, they are called a ground state and are usually subtracted out.

    I do not think it is creationism to point out, with references, why you cannot do a computation with these entities, nor is our paper unfalsifiable for the same reason.

  42. Greg Kuperberg Says:

    Bram – The tone of the paper is, yes, there have been Bell violation experiments, but the experiments have loopholes. The authors don’t really believe quantum probability.

    (As I said, I can’t make sense of their objection to reference [36].)

  43. Scott Says:

    Robert Brady #41: I see. 21 doesn’t count since, although it’s greater than 16, its prime factors are not. Can I have you on record that, if someone uses Shor’s algorithm to factor 51 into 3×17, that will change your mind? (Or will it still not count since, when written in binary, 51 consists only of 1’s and 0’s?)
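
    (For anyone who wants to see what such a factorization would actually involve, here is the classical skeleton of Shor’s algorithm — a sketch only, with the order-finding step, the one part a quantum computer actually speeds up, done by brute force. That’s fine for 21 or 51 and hopeless for numbers of cryptographic size, which is of course the entire point of the algorithm.)

        from math import gcd
        import random

        def order(a, N):
            # Smallest r > 0 with a^r = 1 (mod N); requires gcd(a, N) = 1.
            r, x = 1, a % N
            while x != 1:
                x = (x * a) % N
                r += 1
            return r

        def shor_skeleton(N, seed=1):
            rng = random.Random(seed)
            while True:
                a = rng.randrange(2, N)
                g = gcd(a, N)
                if g > 1:
                    return g, N // g                # lucky guess: a already shares a factor with N
                r = order(a, N)                     # <-- the step a quantum computer does quickly
                if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
                    p = gcd(pow(a, r // 2, N) - 1, N)
                    if 1 < p < N:
                        return p, N // p

        print(shor_skeleton(21))   # a factorization of 21, e.g. (3, 7)
        print(shor_skeleton(51))   # a factorization of 51, e.g. (3, 17)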

    Regarding many-body experiments: “usability for computation” is not some magical pixie-dust that infuses certain physical systems while leaving all the others untouched. When you have a large entangled quantum state of hundreds or thousands of particles, the burden is on the QM skeptics—i.e., the scientific radicals—to explain exactly how to account for that state and its evolution within their classical model. It’s not on the people who accept standard QM—i.e., the “conservatives”—to demonstrate that, in principle (if not yet in practice), you could do all the things with the state that standard QM says you could do.

  44. John Sidles Says:

    The wrangling with regard to the feasibility (or not) of fault-tolerant quantum computing has much to learn (as it seems to me) from the wrangling with regard to the reality (or not) of anthropogenic climate-change:

    • Little is gained when the strongest enthusiasts confront the weakest skepticism, and

    • Little is gained when the strongest skeptics confront the weakest enthusiasm.

    Conclusion  The skepticism of the Anderson/Brady preprint is insufficiently strong (in its mathematical physics) to justify the opprobrium that quantum computing enthusiasts are heaping upon it.

    What would be deeply thrilling (to me) would be an arxiv preprint that expressed skepticism of quantum computing comparably forceful to Edsger Dijkstra’s skeptical computer science essay Go To Statement Considered Harmful … perhaps along the lines of Hilbert Space Considered Harmful.

    The essay Hilbert Space Considered Harmful would replace Dijkstra’s dictum GOTO non fingo with an analogous C^n non fingo, that is, a demonstration that for very many (all?) real-world dynamical systems, the effective state-space is very much smaller than a Hilbert space.

    This would set the stage for faith in the absolute physical reality of Hilbert space, not to be overthrown in a radical revolution, but rather to fade gracefully into irrelevance, as more powerful mathematical methods of dynamical simulation replace it … rather like the moderating effect that democracy exerts upon Europe’s twelve monarchies!

  45. Lou Scheffer Says:

    Lubos #33 and Scott #39,

    I agree with Scott that Lubos’ statement

      it’s totally obvious that if one proposes any particular model that is “qualitatively inequivalent” to proper quantum mechanics, it has to contradict some experiments that are well-known and easy to do.

    is not at all clear. This is *exactly* what happened with general relativity. It’s qualitatively different from Newtonian physics: it agreed with all the well-known results, but predicts very different results in the strong-field, high-speed regimes. (The one existing test at the time, the precession of the perihelion of Mercury, was not at all an easy test. There is a classical explanation as well, involving the J2 moment of the Sun, which is hard to rule out since the Sun does not rotate as a solid body.)

  46. Raoul Ohio Says:

    Slightly related: Does all probability follow from QT?

    http://www.sciencedaily.com/releases/2013/02/130205151450.htm

  47. Luboš Motl Says:

    Dear Scott #39, you write:

    “No, it’s logically possible that someday, someone will invent a theory that agrees with QM on all experiments that have already been done or can be easily done with current technology, but that disagrees with QM on the states that arise (for example) in large-scale implementations of Shor’s factoring algorithm…”

    Sure, it’s also logically possible that someday, a creationist theory explaining all the data like evolution and many more will be proposed, too. It’s logically possible that Elvis Presley will send us greetings from the Moon, saying that he has enjoyed it over there.

    Logically possible is almost everything in science. A different question is whether it’s at least likely enough so that you would expect such a thing occur once per age of the Universe if it occurred every second. And in this sense, all the three things I mentioned are de facto impossible.

    I understand that you consider a quantum computer running Shor’s or a different algorithm to be something in between the Earth and Heaven – after all, we don’t have it yet. My suspicion is that you also want to say that its possible existence is disputable because this makes your field look like unsettled, ongoing research near the frontiers of physics knowledge – which it actually isn’t, because in reality, quantum computation is just an advanced engineering application of rudimentary physics insights of the 1920s and 1930s.

    However, if you have an objective reason why you think that a quantum computer running Shor’s algorithm is “extreme” so that QM could fail in it, well, I am certain that you won’t be able to define any physically natural – non-spiritual – quantity according to which it would be extreme. It may have 100 or 1,000 qubits and this may look large relatively to the devices that have already been constructed.

    However, this number of qubits/particles is tiny relatively to the numbers in other physics systems where quantum mechanics has been tested and confirmed. Take a piece of metal, it has 10^{26} atoms and a ground state. The ground state is a particular (and rather generic, from an a priori viewpoint) linear superposition of tensor products of states of many electrons (and other particles) and it has the properties dictated by quantum mechanics.

    Because the ground state of this system – and many other, very different systems – is just a “rather random” linear combination of the basis vectors you may obtain from 10^{26} qubits or any other large number, it is pretty much inevitable that 1) all the other basis vectors in the multi-qubit Hilbert space are allowed states, and 2) one can make arbitrary superpositions, because if one seemingly random combination (one that differs only by a low energy) is allowed and always works, then by statistics that pretty much implies that all the others work as well.

    So by checking several (well, millions of) different systems, we have verified that the required Hilbert space indeed grows exponentially with the number of degrees of freedom and that all superpositions are allowed (the superposition postulate holds) because some “random ones” are always allowed. It’s implausible you will find a loophole, although we would have to define a “loophole” rigorously to be able to rigorously decide whether such a class of theories may already be fully falsified. (The tail that will remain “unfalsified” will be written as equations of QM plus some tiny corrections, and it will be possible to show that a viable, non-falsified theory is just built as an unnatural mutation of quantum mechanics.)

    Incidentally, if we talk about the “a priori” restrictions on the Hilbert space – the kinematic Hilbert space, so to say – that’s something that folks in quantum gravity know very well. Quantum gravity *does* invalidate the locality in the strict sense (it’s needed for the Hawking radiation to restore the information about the initial state, even though it’s apparently coming from a causally disconnected region). The qubits in region A and qubits in region B, when too dense, aren’t quite independent from each other. One may only achieve those configurations for which the object isn’t “too much mass concentrated within the Schwarzschild radius”. If there’s too much mass/energy in a region, it collapses into a black hole and the quantum computer (and its usual description) breaks down (and is crushed a second later).

    But just like we know that this nonlocality and refusal to acknowledge complete independence of various regions exists, we also know that it’s very far and its effect on mundane, low-energy, low-density, quantum-computing-like experiments is negligible. The restriction only occurs when the density of information approaches 1 bit per Planck area (if we measure the surface of the region which is relevant because quantum gravity is holographic) which is 10^{-70} square meters. If you tried to impose restrictions on the Hilbert space of a local quantum field theory that would start to operate at much larger areas, it would be equivalent to claiming that G, Newton’s constant, is much larger i.e. gravity is much stronger. Because the black holes clearly saturate the entropy and they would be restricted, you would reduce your idea about the black hole entropies and it would become impossible for some stars to collapse into black holes – because such a process would disagree with the second law of thermodynamics, and so on.

    We have actually lots of other physics reasons and arguments – not just playing with individual qubits at the “fundamental level” – to be sure that our description is right even for heavily multi-body systems. Thermodynamics really allows us to work with entropy and we do measure entropy experimentally. A certain entropy simply does mean that there exist exponentially many mutually orthogonal (classically mutually exclusive) states in the Hilbert/phase space. This may be verified by thermal experiments. To ban most of the microstates would mean to seriously reduce the heat capacity of the object. But we just know that the heat capacity of a proposed design for a quantum computer can’t be much lower than predicted by quantum mechanics. It’s just a piece of matter and similar matter has been subjected to tons of thermal experiments since the 19th century. When cooled near absolute zero, only some degrees of freedom survive but the entropy will still be extensive – it’s been tested with a great accuracy – and in the quantum framework, it means that the dimension of the Hilbert space grows exponentially with the size.
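
    In formulas, this is just Boltzmann’s relation, nothing exotic: a measured extensive entropy S = N·s_0 means the number of mutually orthogonal microstates is Ω = e^{S/k_B} = e^{N·s_0/k_B}, i.e. the dimension of the relevant Hilbert space is at least 2^{cN} with c = s_0/(k_B ln 2) fixed by the thermal data.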

    Whether people will be able to overcome all the technical problems with QC – and whether there is a mistake in the proofs that QC is immune against various types of real-world inaccuracies and noise – may be an open question. But the fact that somewhat larger systems with 100 or 1,000 qubits obey the postulates of quantum mechanics as safely as systems with 1 or 2 or 3 or 4 qubits is something I am ready to bet my head upon (as in guillotine) simply because I know that we have verified QM on systems with a small number of particles as well as much larger numbers of particles. QM isn’t just a theory of small systems; it’s a theory of all systems and it’s essential whenever the classical limit is unjustifiable.

    I think that if you work 1,000 times harder than Brady and Anderson, you will not get a working, inequivalent, non-quantum description of the phenomena for which quantum mechanics seemed to work. Instead, if your work is impartial, you will end up seeing what I already see now, namely that such an alternative theory just can’t work. Well, maybe you have to work 10,000 times harder than Brady and Anderson, but the result – a clear understanding of these basic conceptual issues – is a prize that deserves and justifies this 10,000-times-harder work. 😉

    All the best
    Lubos

  48. Scott Says:

    Luboš #47: As you might recall, I’ve offered $100,000 for a convincing argument that scalable QC is impossible in the physical world, and I don’t have that kind of money to throw around casually. If you like, my offer corresponds, not to my assigning the speculation in my comment an 0.0000000000001% probability (like Elvis on the Moon), but certainly to my assigning it an 0.1% probability or less. I won’t go much lower than that, simply because, just because we can’t think of a “non-spiritual” way to separate the states arising in Shor’s algorithm from the many-particle states arising in current experiments, and even have what look like good arguments against that (e.g., your entropy argument, which of course assumes QM), doesn’t tell me there isn’t such a separator that would be obvious to physicists of future generations. Sure, it would require a deformation of QM unlike anything we’ve seen, but I don’t see that it takes us into “Elvis lives” territory. And also, I’m pretty sure an examination of the history of physics would show that discoveries people previously would’ve assigned an 0.0000000000001% probability, have happened at least 0.1% of the time. 🙂

    But even if—as you and I strongly predict—the effort to build a scalable QC “merely” results in actually building a scalable QC, I still say we would’ve learned something important. And not merely because of the intellectual inadequacies of the people who had smidgens of doubt. By analogy, if someone proved the Riemann Hypothesis, I wouldn’t say that person had “merely” achieved the same level of certitude as a physicist who’d long ago told all his friends that he was “morally certain” of the hypothesis’ truth, with no proof! Having a proof is qualitatively different.

  49. Nex Says:

    Very interesting paper. This soliton model is something I would like to see thoroughly explored, as I am always interested in classical models of QM behavior. I certainly agree that the inability to derive QM from classical physics might simply be a failure of imagination.

    As for Bell inequalities, the paper does not deny them, but claims another loophole due to the fact that solitons in this model propagate in a common density-wave background, so to speak (if I am getting it right). But I agree they should explain in much more detail how this loophole works and why it lets them circumvent the Weihs and Salart experiments.

  50. Scott Says:

    John Sidles #44: The reason I commented on this paper was simply that lots of people asked me to!

    In general, though, I disagree with the idea that in any intellectual dispute, we have a moral obligation to respond exclusively to our opponents’ best arguments, and to let their dumbest, most egregiously-wrong arguments pass completely without comment. Yes, if we care about truth we’d better respond to the former. But puncturing the latter can also be a useful service to the public—besides being easy and fun! 🙂

  51. Ajit R. Jadhav Says:

    Scott #43:

    Robert #41 makes two points. Let me deal with each, separately.

    1. I don’t know anything towards settling that point about “16” limit.

    My hunch would be that the authors would be wrong in prescribing such a limit. However, since I don’t understand either their theory or TCS in general to the sufficient extent (and that’s why I was insisting on his supplying a C++ program), I am unable to see if the authors have an intuitively unbelievable but rigorously provable result.

    For an example of such a theorem (intuitively unbelievable but actually provable), I can cite the case of Huygens’ principle supposedly not working out in 2D. Completely unbelievable, but “true.” (http://www.physicsforums.com/showthread.php?t=148787)

    BTW, if you ask me, IMO, that theorem is based on a wrong way of understanding Huygens’ principle. But, yes, if you grant the mathematicians’ definition of what constitutes that principle, then, sure, the proof, by itself, is valid. It’s just that those definitions themselves are different from what people understand on common-sense physical grounds. Those definitions themselves are only tenuously understandable. Further, they (and the 2D limit) can be made completely irrelevant in a physically sound and simpler view of Huygens’ principle.

    The point is: since I don’t understand the authors (A+B), I would allow them (at least for the time being) the possibility that they could be on to something similar: something that is only tenuously understandable, and is a minor quirk of theory that doesn’t at all matter in practice (just the way I can always apply Huygens’ principle also in 2D—following another definition of the principle: a local definition).

    2. However, when Robert (#41) comes to this: “They are not called qubits because you cannot do a quantum computation with them,” he does have an absolutely valid point. He also points out the reason why.

    This point of his remains valid, even if the objection that Slava (#35) raises, concerning the absence of an explicit model for entanglement, also remains relevant! The authors (A+B) need to respond to that.

    Enough, for the time being.

    Ajit
    [E&OE]

  52. Slava Kashcheyevs Says:

    Ajit #51:

    The deal breaker is that classical models, however beautiful, have no capacity to approach any problem that “orthodox” quantum mechanics solves using the notion of entangled states (aka many-particle superpositions), regardless of interpretation.

    The point of my blog post is that this set of problems, inaccessible to A+B, covers an overwhelming majority of the situations to which physicists have ever applied the quantum formalism. Quantum computation is just a modern, tiny subset. The fluid-mechanics approach to de Broglie waves has been dead since the 1920s and has no bearing on the limitations of QIP.

    Scott #50:
    If the 3 points in your original post are what you have identified as “opponents’ best arguments”, I’d counter that these are not even arguments, just unsubstantiated statements. In my physicist’s view, their best argument is Brady’s explicit model, so I attacked it (for the same social reason as yours – people have asked me about it).

  53. Scott Says:

    Slava #52: Thanks for sharing!

      If the 3 points in your original post is what you have identified as “opponents’ best arguments”…

    No, I wasn’t suggesting anything of the kind! I’m not sure that I know how to identify the “best” arguments for why QC must be impossible, any more than I know how to identify the “best” arguments of the creationists or 9/11 truthers. On the other hand, in many previous posts on this blog (most recently here), I’ve addressed anti-QC arguments that at least weren’t in direct contradiction with existing experiments, and that I found noticeably more informed and interesting than this one.

  54. John Sidles Says:

    Scott (#50 upon #44) “In general, though, I disagree with the idea that in any intellectual dispute, we have a moral obligation to respond exclusively to our opponents’ best arguments, and to let their dumbest, most egregiously-wrong arguments pass completely without comment.”

    LOL … Scott, isn’t it the case that post #50 knocks down a strawman that no one has ever advocated?

    To articulate the “Monarchy” critique (of #44) more completely, it commonly happens that FTQC enthusiasts defend the “House of Hilbert monarchy” by arguing that the sole alternative to the House of Hilbert is a nonviable anarchy of theories that are physically incomplete and/or mathematically immature and/or just plain wrong.

    However, the text associated to Wikipedia’s amusing gallery of European monarchs suggests a third alternative for the future of quantum dynamical STEM studies:

    “Most of the monarchies in Europe are constitutional monarchies, which means that the monarch does not influence the politics of the state: either the monarch is legally prohibited from doing so, or the monarch does not utilise the political powers vested in the office by convention.”

    When we reflect upon the evolution of the quantum dynamical literature, we perceive that during the early 20th century, the House of Hilbert Space reigned over science with all the uniting vigor of Wilhelm I.

    Nowadays however, the STEM-wise influence of the House of Hilbert is greatly diminished, as practical dynamical computations increasingly employ mathematical frameworks that do not extend naturally to unitary evolution upon state-spaces of any dimensionality.

    In consequence, the role in science of today’s House of Hilbert is evolving to be less evocative of the intimidating authoritarianism of Kaiser Wilhelm I, and more evocative of the comforting and popular — but nowadays largely ceremonial — role of Beatrix, Queen of the Netherlands.

    Summary  The House of Hilbert formally reigns over the STEM enterprise, but in practice doesn’t. FTQC enthusiasts envision the restoration of the House of Hilbert as an absolute STEM monarchy … yet that restoration is about as likely to happen as Queen Beatrix of the Netherlands seizing the reins of power.

    Conclusion  The Aaronson $100,000 wager is fiscally safe (both now and in the foreseeable future). But operationally, the Aaronson wager is lost already (largely at present and increasingly in the foreseeable future).

  55. Luboš Motl Says:

    Lou #45, I disagree that general relativity is a “qualitatively different” explanation of the gravitational force than Newton’s theory. The full exact theory starts with more advanced, more geometric principles but when one focuses on the actual gravitational force in the contexts previously described by Newton’s theory, it’s strikingly obvious that general relativity is just a deformation of Newton’s theory. It may be reorganized as Newton’s theory plus corrections that go to zero in the “c to infinity” and “G to zero” double limit. That’s what I don’t call a qualitatively different explanation of the force.

    On the other hand, things like refuting the very existence of the exponentially many states and their arbitrary superpositions *is* a qualitative denial of the basics of quantum mechanics; it would be a qualitatively different theory. Of course, *within* quantum theory, one may deform existing theories – their Hamiltonians – by adding new fields or other degrees of freedom and new interaction terms to the Hamiltonian etc. But that’s not a deformation of the postulates of quantum mechanics, which have to stay completely constant, because any nonlinear or other deformation of the postulates of quantum mechanics would lead to a logically inconsistent theory, e.g. one in which P(A or B) isn’t equal to P(A)+P(B)-P(A and B).

    Cheers
    LM

  56. Scott Says:

    John Sidles #54:

      The Aaronson $100,000 wager is fiscally safe (both now and in the foreseeable future). But operationally, the Aaronson wager is lost already (largely at present and increasingly in the foreseeable future).

    I don’t know what the hell that means. I’ll tell you what: you can have all of my “operational” money, if I can have just half of your “fiscal” money!

  57. Ross Anderson Says:

    Nex #49: thanks; this is exactly our intention – to see how far a classical model of QM can be taken. It was really surprising to get a decent model of the electron; what more can we do?

    Ajit #51 and Slava #35: in the sonon model, two particles are entangled if their \chi waves are phase coherent.

    Slava #52 and Lubos #47: You are right to say that a lot more work is needed before mainstream physicists will accept the sonon model as an explanation for QM. Lubos, you say we need to do 1000 times more work. For reference, Robert spent 50% of his time last year working on this and I spent perhaps 5%, so call it half a year. Spending 500 person-years on classical models of QM would cost $50m and would presumably need a DARPA BAA spread over a dozen universities for five years. I can’t see us making that sale just yet. Slava, I fully agree that Robert’s model is our best argument; you want us to extend it to cover the standard model, the exchange interaction, the gyromagnetic ratio, superconductivity and much else. Again, this is a lot to ask at this stage. But would you be prepared to take sonon theory more seriously if we came up with further non-trivial results, such as on superconductivity or the weak interaction?

  58. Ajit R. Jadhav Says:

    Slava #52:

    >> “The deal breaker is that classical models, however beautiful, have no capacity…”

    Yeah… I did appreciate that point though I didn’t jot it down explicitly. … But…

    There are times when one doesn’t want to be hair-splitting.

    It’s obvious that the moment you say: the classical, you immediately forgo: the quantum-mechanical. By definition. Sticking to the definitions, this part is very obvious. The point isn’t that.

    The point is this: Suppose someone puts forth a new view of QM that (ultimately mistakenly) is advertised as being “classical,” and suppose it’s not a “complete” theory addressing all the postulated aspects of QM. If this new view nevertheless departs from what the term classical strictly means and demands by enough to make it interesting, then, in passing judgments, should we be making appeals to definitions? I think not. … And, in fact, I think that in the post at your own blog, you, too, actually did not.

    So, the deal-breaker isn’t that they describe it as a “classical” model; the deal-breaker seems to me to be that their description is not (even near-sufficiently let alone “completely”) quantum mechanical.

    I use the C++ program as a heuristic device. I mean it in the sense that: the basic argument has come from theory—a simulation cannot be a substitute for a theory. Yet, its enormous utility in ensuring “specific-ness” and “completeness” of description should be obvious.

    I mean, suppose that you already had this program for A+B’s theory (made available by them). Wouldn’t it then be so very easy to ask them to identify the line of code where they began dealing with, say, the entanglement (within a self-advertised “classical” framework)? Or, with the superpositions? Or, with the QM “collapse” (per the Copenhagen interpretation)? That’s what I meant by “specific-ness.” Programs compel you to be specific.

    As to “completeness,” a (good) simulation would help prevent all the futile discussion based on those afterthoughts—it would force the programmer to fully incorporate the QM aspects, simply because their absence would be so directly noticeable. I would expect a QM simulation to be capable of addressing at least the single-particle double-slit interference situation, if not also the Bell inequalities, the delayed-choice eraser, etc. The double-slit interference would make for enough of “completeness,” in practice. Including a fairly comprehensive indication of the handling of the auxiliary data.

    Of course, to be fully satisfactory, the program documentation would also have to show how its design and implementation fully address a decent postulatory description of QM, say as is found in any standard UG text on the mainstream QM (e.g. Eisberg & Resnick/Griffiths/Gasiorowicz/etc). The modeling situation itself may be elementary and only an example (as in the double-slit interference). But the simulation has to show how the program at least implicitly implements, for one specific application case, the entire set of the QM postulates—and in what sense.
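    To make the double-slit “completeness” benchmark above concrete, here is a minimal sketch (in Python rather than the C++ I keep asking for, and for the textbook mainstream QM rather than for A+B’s model, so purely illustrative) of the kind of single-particle computation such a simulation would, at the very least, have to reproduce. The grid, the packet widths, and the exact free evolution in k-space are my own choices, not anything taken from the A+B paper:

```python
import numpy as np

# natural units and a 1-D grid (all numbers here are illustrative choices)
hbar = m = 1.0
N, L = 8192, 1000.0
dx = L / N
x = -L/2 + dx*np.arange(N)
k = 2*np.pi*np.fft.fftfreq(N, d=dx)          # angular wavenumbers of the FFT grid

# "two slits" = a coherent superposition of two Gaussian wavepackets
sep, sigma = 40.0, 1.0                       # slit separation and width of each slit's packet
psi0 = (np.exp(-(x + sep/2)**2/(4*sigma**2)) +
        np.exp(-(x - sep/2)**2/(4*sigma**2)))
psi0 = psi0/np.sqrt(np.sum(np.abs(psi0)**2)*dx)

# free Schrodinger evolution to the far-field "screen", done exactly in k-space
t = 200.0
psi_t = np.fft.ifft(np.fft.fft(psi0)*np.exp(-1j*hbar*k**2*t/(2*m)))
prob = np.abs(psi_t)**2                      # |psi(x,t)|^2: the interference pattern

# locate the bright fringes and compare their spacing with 2*pi*hbar*t/(m*sep)
mid = prob[1:-1]
is_peak = (mid > prob[:-2]) & (mid > prob[2:]) & (mid > 0.1*prob.max())
peaks = x[1:-1][is_peak]
print("measured fringe spacing :", float(np.median(np.diff(peaks))))
print("far-field prediction    :", 2*np.pi*hbar*t/(m*sep))
```

    A proposal like A+B’s would have to reproduce at least this pattern from its own premises, line by identifiable line, before we even reach entanglement.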

    BTW, if it’s a new view of QM, it would have to differ in some sense from the mainstream QM, just the way SR deviates in some ridiculously small but still in principle quantifiable sense from Newtonian mechanics, even at “everyday” low speeds. The simulation will have to identify how—in what kind of limit its description converges to the mainstream postulatory QM (which is an unsatisfactory and broken description, IMO).

    Alright. Maybe this reply has become too long and boring. Just wanted to jot down what I meant, what it is that I am usually looking for, and why.

    * * *

    Enough for now. Will check back tomorrow.

    Ajit
    [E&OE]

  59. Robert Brady Says:

    Scott #43 I hope it is clear why we claim it is an order of magnitude harder to produce numbers greater than 16 using Shor’s algorithm. You suggest a quantum computation that is required to calculate the number 3. This would not be a contradiction because 3 is less than 16.

    On the second part of your response (and thank you for your input Ajit #51)

    1. As a graduate student I learnt many-body theory, and I am sure we share the experience. But your response seems to suggest you think there is something wrong with this theory. If so, what precisely is wrong with it?

    2. It is the usual procedure in many-body theory to treat unexcited entangled items as a ground state. Do you disagree with this procedure?

    3. The ground state acts as a single entity. Brian Josephson shows this explicitly for superconductors in the reference cited. Do you think there is something wrong with that?

    4. A single entity can encode at most one qubit for the purpose of computation. What is wrong with that?

  60. Greg Kuperberg Says:

      this is exactly our intention – to see how far a classical model of QM can be taken.

    It can’t be taken past the Bell inequalities, I can tell you that.

      It was really surprising to get a decent model of the electron; what more can we do?

    What more can you do? A decent model of two electrons. That’s the hard part.

  61. Scott Says:

    Robert Brady #59: I see, so both prime factors need to be greater than 16 in order to satisfy you. We should wait for the use of Shor’s algorithm to factor 323, then?

    Regarding your four-part syllogism, what’s wrong with it is that the only thing that justifies calculations that treat the ground state as a “single entity” is the existence of a more fundamental theory, according to which there are actually far more degrees of freedom there than just one qubit (but they’re behaving collectively). Treating a many-body approximation as if it gave you the fundamental degrees of freedom, while ignoring the degrees of freedom of the very theory (QM) that the approximation rests on top of, is like building a house on air. You don’t get to do that without first laying more solid foundations.

  62. Robert Brady Says:

    Greg #60, Scott (introduction) and others: regarding Bell’s inequality, does section 5 of this paper provide the information you require? I am afraid it does assume familiarity with Cramer’s transactional model and with Mead’s model, and of why they are consistent with Bell’s inequality experiments.

    Regarding the analogue of two entangled electrons, does Figure 5 of the same link, and the surrounding text, give you the information you require? If not I would be pleased to provide more.

  63. Bram Cohen Says:

    Brady #62: Are you claiming that there’s a classical mechanism which can give results which violate the Bell inequality?

  64. Robert Brady Says:

    Scott #61 Yes, both prime factors need to be greater than 16 since it is necessary to exclude calculations that might be done with fewer computational qubits.

    Have I understood you correctly? In the introduction to this blog, you comment approvingly on the Josephson effect. I don’t want to put words into your mouth, but you now seem to be saying that Brian’s original thesis ignores “the existence of a more fundamental theory, according to which there are actually far more degrees of freedom.”

    Can you elaborate?

  65. Greg Kuperberg Says:

    Robert – Of course it’s not satisfactory. Drawing a figure of two electrons is not the same as modeling two electrons with an equation. Listing a few citations to other people’s older models of quantum mechanics is also not the same as you giving a model of two electrons.

    You’ve made a complete muddle of the issue of Bell inequality violations. This is really key, because the entire topic of quantum computing is a gargantuan extension of Bell violations. In one paper there is an indecipherable claim that a quite strict Bell violation experiment (reference [36]) has loopholes. In another paper there is a figure and there are citations, but there is no equation for two electrons that either allows or prohibits Bell violations. One of the citations is to Carver Mead, whose model is non-local and explicitly allows superluminal Bell violations, thus contradicting the need to claim loopholes in the other paper.

  66. Bram Cohen Says:

    Lubos #47: The difference with the other scientific theories you consider is that in the case of quantum computation there’s another very large piece of evidence – specifically, the Extended Church-Turing Thesis – arguing for the other side. (The thesis says that any reasonable model of physics can simulate any other reasonable model of physics with only polynomial slowdown.) Granted, the Extended Church-Turing Thesis is more of a high-level concept than a low-level one, but it’s so extraordinarily robust that anything which violates it must be viewed with extreme skepticism.

    Of course, when two fundamental scientific theories contradict each other that’s fertile ground for coming up with and performing experiments which force the issue. The entire field of quantum gravity is dedicated to exactly this sort of program, and there the two theories don’t even contradict each other, just don’t merge nicely. It can be hard to even design the experiment in some cases, and Aaronson has done the best job so far of proposing such an experiment, which has in fact actually been done, and thus far quantum computation has held up admirably. I wouldn’t say that it’s scaled up enough that I’m convinced that it won’t run out of juice eventually – there are, for example, analog classical mechanisms which in principle are able to do arbitrary calculations and work okay at a small scale but break down when scaled up – but I view this line of research as extremely important.

  67. Greg Kuperberg Says:

    Bram – Except that the polynomial Church-Turing thesis isn’t evidence, it’s theory. And it’s a theory with accumulating evidence against it.

  68. Robert Brady Says:

    Bram Cohen #63 Yes. Bell proposed experiments that test for either non-local or non-causal processes. Cramer’s transactional model exploits the non-causal element, which is also present in the solutions to Euler’s equation because it is time reversal symmetric. See section 5 of this paper for references and a discussion which is unfortunately rather brief and addressed to those familiar with these papers — perhaps it might be expanded upon.

    From an aesthetic point of view it would be preferable not to have to use the time reversal symmetry. It might be possible to do this, as explored in the joint paper with Ross.

  69. Scott Says:

    Robert #64: Fine, I’ll elaborate. If you accept QM, then the states created in these superconducting experiments are basically cat states, something like (|0⟩^n + |1⟩^n)/√2 where n is the number of electrons (which could be in the billions). Now, you point out correctly that, while the underlying Hilbert space might have 2^n dimensions, the Hilbert space relevant to these particular experiments is merely a 2-dimensional subspace, the one spanned by |0⟩^n and |1⟩^n. And you therefore declare yourself satisfied that “only one qubit” is there.

    Unfortunately, that’s not even the start of the beginning of an acceptable answer. As Greg #65 stressed, you don’t get to replace the precise predictions of QM by slippery verbal reasons-why-you’re-not-yet-proven-wrong that change from one experiment to the next. Instead, you need to replace QM by an alternate mathematical theory that

    (1) also describes anything that could possibly happen to a many-particle quantum system (not just one particular thing),

    (2) agrees with all experiments that have already been done, but

    (3) unlike QM, does not require an exponentially-large Hilbert space.

    The reason many people here are getting exasperated with you is that you seem to have no inkling of what would actually be involved in constructing such an alternate theory.
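    To make this concrete, here is a toy numerical illustration (standard QM, with purely illustrative parameters, and nothing taken from the actual superconducting experiments): if you measure every qubit of the n-qubit cat state separately in the X basis, the outcomes land only on the 2^(n-1) even-parity strings, whereas the incoherent mixture of |0⟩^n and |1⟩^n spreads uniformly over all 2^n strings. Even this one extra experiment already needs the tensor-product structure that the “single entity” picture throws away.

```python
import numpy as np

n = 6                                         # number of qubits; the state-space has 2^n = 64 dims
dim = 2**n
cat = np.zeros(dim, dtype=complex)
cat[0] = cat[-1] = 1/np.sqrt(2)               # (|0...0> + |1...1>)/sqrt(2)

H1 = np.array([[1, 1], [1, -1]], dtype=float)/np.sqrt(2)   # single-qubit Hadamard (Z -> X basis)
Hn = np.array([[1.0]])
for _ in range(n):
    Hn = np.kron(Hn, H1)                      # Hadamard on every qubit

probs = np.abs(Hn @ cat)**2                   # X-basis outcome probabilities for the cat state
odd = sum(p for i, p in enumerate(probs) if bin(i).count("1") % 2 == 1)
print("cat state, odd-parity probability        :", round(odd, 12))       # -> 0.0

# the 50/50 incoherent mixture of |0...0> and |1...1> gives the uniform distribution instead
mix_probs = 0.5*np.abs(Hn[:, 0])**2 + 0.5*np.abs(Hn[:, -1])**2
odd_mix = sum(p for i, p in enumerate(mix_probs) if bin(i).count("1") % 2 == 1)
print("classical mixture, odd-parity probability:", round(odd_mix, 12))   # -> 0.5
```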

  70. John Sidles Says:

    In multiple comments, Shtetl Optimized readers have expressed skepticism that the mathematical framework of the Anderson/Brady preprint is adequate to build a viable theory of quantum dynamics upon it (and personally I share that skepticism).

    And yet, reasonable grounds exist to extend that same mathematically-grounded skepticism to orthodox FTQC. As was previously noted:

    Scott Aaronson #56

    John Sidles #54: “The Aaronson $100,000 wager is fiscally safe (both now and in the foreseeable future). But operationally, the Aaronson wager is lost already (largely at present and increasingly in the foreseeable future).”

    “I don’t know what the hell that means. I’ll tell you what: you can have all of my ‘operational’ money, if I can have just half of your ‘fiscal’ money!”

    Hmmm … to address Scott’s concerns, let’s make explicit the metaphorical argument (of #54), by focusing not upon “mere” money, but rather upon rational investments of every researcher’s most precious resources (students especially): time, attention, and imagination!

    Commonly students learn undergraduate-level quantum mechanics from texts that include (everyone has their own favorite list) Feynman’s Lectures, Dirac’s Principles of Quantum Mechanics, Gottfried’s Quantum Mechanics, Landau’s Quantum Mechanics: Non-Relativistic Theory, and Nielsen and Chuang’s Quantum Computation and Quantum Information (there are hundreds more).

    To pursue serious research, starting at the graduate level, an entirely new set of textbooks enters the picture, one that includes (again, everyone has their own favorites, and the following texts all are highly-ranked Amazon.com best-sellers) Spivak’s Calculus on Manifolds: A Modern Approach to Classical Theorems of Advanced Calculus, Frankel’s The Geometry of Physics, Nakahara’s Geometry, Topology and Physics, Lee’s Introduction to Smooth Manifolds, Nash’s Topology and Geometry for Physicists, and Zee’s Quantum Field Theory in a Nutshell.

    What is striking about this second list is how sparse references are to Hilbert space (in the narrow sense of Feynman/Dirac etc.). Slowly the realization dawns at the graduate level: “I’ve studied these texts in the wrong order! It’s better to study Spivak first, not last!” And indeed this modern mathematical sentiment is vigorously espoused by Amazon reviewers:

    When you are in college, the standard calculus courses will teach you the material useful to engineers and physicists. You must pretty much forget the material in these courses and start over. That’s where you need Spivak’s “Calculus on Manifolds”. Spivak knows you learned calculus and quantum physics the wrong way and devotes the first three chapters to setting things right.

    When quantum dynamics is appreciated through the lens of the second reading-list, it becomes apparent that rather little (if any?) of modern quantum research depends essentially upon the absolute existence of an exponential-dimension Hilbert/Dirac state-space. It is necessary only that the dynamical state-space be effectively Hilbert/Dirac … and the set of dynamical manifolds having this property is vast (as the second reading-list is at pains to inform students).

    In view of this burgeoning literature, and the increasing desirability (for students) of mastering the requisite mathematics before tackling the quantum physics, it is not unreasonable to foresee that (to paraphrase Einstein):

    “Hilbert space by itself, and unitary evolution by itself, are doomed to fade away into mere ideals — to which Nature herself may not entirely aspire — and only a symplectic union of the two will preserve an independent reality.”

    Indeed, the literature of the most recent decade amply documents that the 21st century’s inexorable supplanting from center-stage of the 20th century’s cherished Hilbert space already is well underway … and thus in the decades to come, a return of (our still-cherished) Hilbert space to center-stage of quantum dynamical research is about as plausible — and about as desirable too! — as (the still-cherished) Queen Beatrix reigning as absolute monarch of the Netherlands.

  71. Slava Kashcheyevs Says:

    Ross #57:

    No. Your use of \chi wave corresponds to treating the 2 sonons as bosons in the same quantum state – a coherent two-particle condensate which is a product state with no entanglement.

    Forget about all the complex stuff. Just show the community how two electrons form a singlet and a triplet. No big deal, just a little element towards a minimally realistic approximation of helium / positronium.

    Unfortunately, there is no way you or anybody else can make it with a single-fluid hydrodynamic model. One \chi for all is not what many-body quantum physics is about.

    And if Robert’s model is your best argument, then there is no argument.

  72. Slava Kashcheyevs Says:

    BTW, as Joe has rightly pointed out on my blog, there is no way to tell whether sonons are bosons or fermions. End of story.

  73. Robert Brady Says:

    Many – so many comments, particularly on spin symmetry, entanglement, and Bell’s inequality, to which I will respond shortly.

    Scott #69 Thank you! I now understand. I was indeed referring to what you call the “underlying” theory beneath the measured entangled states. I had wrongly assumed you were too.

    The “underlying” theory is described in Brian Josephson’s thesis. On page 18 he introduces the operator exp(i N \theta) which connects states with N and N-2 electrons in a bulk superconductor. Its value is S = exp(2 i \theta) where \theta turns out to be the phase observed in a Josephson junction. In this way, Brian reduces the large number of electrons down to a single parameter – the phase – which is measured in a Josephson junction. This is the parameter which underlies the correlated phenomena you refer to.

    At this underlying (Josephson) level, the phase can code for at most one computational bit, even though it encompasses a very large number of electrons in the ground state.

    Are we moving towards agreement, at least on this issue? If so, that would be some progress on the primary subject of this blog!

  74. Scott Says:

    Robert #73: No, we are not moving toward agreement. You keep talking about the effective theory of one specific collective phenomenon, and I keep trying to get you to focus on the only general theory we know (QM) from which that effective theory can be derived — a theory that implies the existence of vastly more degrees of freedom in the system, which could be probed by some other experiment if not by this specific one. At this point, I basically throw up my hands: I’ve explained it to you as clearly as I know how. Maybe someone else can take a crack at explaining it.

  75. John Sidles Says:

    Robert Brady #73, in the fifty years since the 1962 Josephson thesis that you cite, a tremendous amount has been learned regarding antisymmetrized quantum dynamical state-spaces (per arXiv:math/0005202, arXiv:math/0208166, and arXiv:1110.6367 for example).

    Without in the least laying claim to an expert level of personal expertise, it seems (to me) reasonable to anticipate that substantial advances in our physical understanding of quantum dynamics will at least refer to these substantial (and ongoing) advances in our mathematical understanding of quantum dynamical state-spaces (in all of their variously symmetric/antisymmetric/asymmetric varieties).

    Perhaps Shtetl Optimized readers can identify fundamental advances in quantum physics understanding that were not accompanied by, authorized by, and/or catalysts of fundamental advances in mathematical understanding?

  76. Robert Brady Says:

    Slava Kashcheyevs #71 #72
    Greg Kuperberg #65
    and others interested in spin symmetry and the standard model.

    Much more could indeed be done on particle symmetries. This is what we know for the R11 sonon.

    (a) It has spin-half (Fermi) symmetry, as can be seen from the Pauli matrices after equation 8 here.

    (b) Their lowest energy state has spins opposed, as can be seen from the interaction energy (equation 9) and the symmetry of the spherical Bessel function discussed thereafter.

    The R10 sonon has a lower order symmetry (since n=0), and the higher families of sonons presumably have more complex symmetries. The details might be an area for further investigation.

  77. Raoul Ohio Says:

    Discussion of Blogs, Comments, etc, from Sci Am:

    http://blogs.scientificamerican.com/a-blog-around-the-clock/2013/01/28/commenting-threads-good-bad-or-not-at-all/?WT_mc_id=SA_WR_20130206

    SO fits into this picture.

  78. Greg Kuperberg Says:

    Robert – I am not all that super interested in spin symmetry or the standard model. All I said was that you don’t have a credible discussion of Bell inequality violations. Some discussion, yes, but nothing that hangs together at all.

  79. Bram Cohen Says:

    Brady #68: What section 5 makes clear is that you’re proposing a classical system. The problem then is that Bell’s theorem isn’t a guideline or principle, it’s a theorem, so no amount of pointing to complex mathematical machinery in references is going to get people to read them, because you might as well be claiming to have found a way to trisect an angle with a compass and straightedge.

  80. Anonymous Says:

    Hey, everyone, it’s hopeless. You are not going to convince Brady of anything. He obviously has no understanding of the vast range of physical phenomena that cannot be explained without quantum mechanics, and how well tested and tightly constrained our current physical models actually are.

  81. Bram Cohen Says:

    Greg #67: You could say that the second law of thermodynamics is ‘just a theory’ as well; that doesn’t stop people who claim to have violations of it from rightfully being dismissed as cranks out of hand. Granted, the second law seems *more* fundamental than ECT, but that’s like saying helium is less common in the universe than hydrogen. Note that I’m not dismissing the evidence against ECT, just expressing far more skepticism than some others have that QM will continue to hold up as the experiments progress.

  82. wolfgang Says:

    I just read this paper and I found it quite amazing:

    First there is the reference to a video clip with Morgan Freeman (quite interesting how the drops bounce around on the vibrating plate), then they mention de Broglie-Bohm, an interpretation which is completely equivalent to Copenhagen (and others), at least in the non-relativistic case, and finally they present a ‘soliton model’ of the electron, which reminds me of Lord Kelvin’s vortices-in-the-ether proposal of the 19th century (but I think L.K. made more sense).
    This is then used to make an argument about quantum computers.

    If this is physics in the 21st century then I want the 20th century back…

  83. Scott Says:

    wolfgang #82:

      If this is physics in the 21st century then I want the 20th century back…

    No, this isn’t “physics in the 21st century”—it’s just two guys trying to overturn modern physics, far from the first or the last. Of course there’s a selection effect; stuff that’s actually representative of “physics in the 21st century” is less likely to lead to emails asking me for comment or to an annoyed blog post like this one. 🙂

  84. Greg Kuperberg Says:

    Bram – At this point the second law of thermodynamics is almost a mathematical theorem rather than a separate physical theory. That’s different, that’s something that you would already believe even if it weren’t tested. In any case it has been confirmed many times, except in regimes (such as cosmology) where it doesn’t apply without modification.

    There is no theorem supporting the polynomial Church-Turing thesis within the current laws of physics. On the contrary, quantum probability is true and within quantum probability, the polynomial Church-Turing thesis is close to disproven rather than proven. So you shouldn’t believe the polynomial Church-Turing thesis, for the same reasons that you should believe the second law of thermodynamics.

  85. Slava Kashcheyevs Says:

    Robert Brady #76

    Sure, I have understood from your paper the claim that a sonon has spin 1/2 and that two sonons couple anti-ferromagnetically. This does not bring you any closer to constructing a singlet or demonstrating the exchange symmetry (boson/fermion/anyon).

  86. Robert Brady Says:

    John #75 Good comment. See the 50 year update conference.

    Scott #74 Oh yes we do agree! 🙂 Fortunately, you do not need to rewrite Brian’s thesis in order to analyse quantum computing using Josephson junctions. If the Josephson phase changes by 2 \pi around a loop, this is called a flux quantum. Tunnelling of flux quanta was first observed by John Clarke (see conference link above). In fact, these quanta appear to behave like individual quantum mechanical entities. The analysis you describe can be applied to them and the usual results of quantum computing follow.

    Each individual flux quantum is a collective phenomenon. I hope you will agree it does not contain millions upon millions of computational qubits — unless you think Josephson’s thesis doesn’t apply to these experiments, in which case please specify!

  87. Robert Brady Says:

    Bram Cohen #79 I understand your question. Let me describe why the motion of sonons is consistent with Bell’s analysis.

    The coherent motion of sonons at low velocity obeys equation 11, and its trajectory obeys equations 12 and 13. Equation 11 is the same as the Schrodinger equation, and the probability of a trajectory reaching (x, t) is | \psi(x,t) |^2 (which follows from (12) and (13) — the paper reproduces Bohm’s reasoning). These are the same equations on which Bell’s analysis is based, and therefore it would be surprising if the motion of sonons were inconsistent with it. I do not think this is controversial, but please tell me if it is not clear.

    I think the debate is about how to interpret the consequences, which seem to be counter-intuitive. Cramer’s transactional interpretation of quantum mechanics is only one of the possible ways in the literature to interpret it; it was not intended to be a proof.

  88. Robert Brady Says:

    Slava Kashcheyevs #85 There is obviously a lot of detail required in these areas in order to satisfy you.

    I think your question, or at least similar questions that are relevant to particle physicists, is answered in the extensive literature on the subject. As you will know, the same compressible inviscid fluid is studied in the field of analogue gravity — see Barcelo’s review article. See in particular Volovik’s book regarding particle symmetries and the standard model.

    Happy reading!

  89. Robert Brady Says:

    Wolfgang #82 Thank you. There are some similarities, as you suggest. However, sonons are irrotational, unlike vortex atoms. You may want to look at the online talks at the recent conference on tightly knotted and linked systems, which in some respects are the successors to Lord Kelvin’s vortex atoms.

  90. Lou Scheffer Says:

    Lubos #55,

    You say: ‘I disagree that general relativity is a “qualitatively different” explanation of the gravitational force than Newton’s theory.’

    Of course you are entitled to your own opinion of ‘how different’ two theories are, but I suspect you are the only person on the planet with this view. Newton’s is an action-at-a-distance theory in a flat space-time with no mechanism. GR provides the equivalent of Newtonian forces via shortest paths in curved space-time. It was so weird at the time that only a few people even understood how it might work. For example, Planck said, of combining gravity with SR:

    “As an older friend, I must advise you against it, for, in the first place you will not succeed, and even if you succeed, no one will believe you.”

    Hermann Weyl, another rather sharp guy, said of GR:

    “It is as if a wall which separated us from the truth has collapsed. Wider expanses and greater depths are now exposed to the searching eye of knowledge, regions of which we had not even a pre-sentiment.”

    A typical (I believe) modern view is that of Ashtekar:

    “Space-time is not an inert entity. It acts on matter and can be acted upon. […] There are no longer any spectators in the cosmic dance, nor a backdrop on which things happen. The stage itself joins the troupe of actors. This is a profound paradigm shift [that]… shook the very foundations of natural philosophy. It has taken decades for physicists to come to grips with the numerous ramifications of this shift and philosophers to come to terms with the new vision of reality that grew out of it.”

    I’d be very surprised, but impressed and interested, if you can find similar support for your proposition that GR and Newtonian gravity are qualitatively similar.

  91. John Sidles Says:

    Greg Kuperberg (#84) says: “The second law of thermodynamics is almost a mathematical theorem rather than a separate physical theory.”

    The word “almost” is smile-inducing because it calls to mind so much STEM history:

    • The Earth is almost flat.

    • Malaria is almost invariably associated to the bad night air of swampy regions.

    • The Parallel Postulate is almost self-evident (and/or almost a mathematical theorem).

    • Planck’s radiation law almost follows from Boltzmannian statistical mechanics.

    Recent work such as — to cite one article of many — Derezinski, De Roeck, and Maes “Fluctuations of quantum currents and unravelings of master equations” (2007, arXiv:cond-mat/0703594) is exemplary of contemporary efforts to close the “almost” gap to which Greg Kuperberg’s comment #84 refers. The Anderson/Brady preprint is (as it seems to me) relatively less sophisticated, less successful, and therefore (arguably) less promising in regard to further progress.

  92. Bram Cohen Says:

    Brady #87: That is not at all clear. Could you answer simply whether your model allows non-local phenomena?

  93. Ajit R. Jadhav Says:

    I think this thread has by now got numerous high-quality comments.

    At this point, I have to raise a few questions:

    To A+B’s critics/detractors:

    Is the real point of contention the very idea that what A+B propose is claimed to be a classical model?

    To A+B:

    1. You make reference to Cramer’s interpretation. As the nine formulations paper states, Cramer’s interpretation quantitatively “makes no predictions that differ from those of conventional quantum mechanics.”

    In the mainstream QM, there is instantaneous action-at-a-distance (IAD)—i.e., IAD, as distinguished from mere entanglement. For instance, in the Copenhagen interpretation, the wave-function collapse requires IAD. Inasmuch as your theory produces results that are quantitatively identical to the mainstream QM theory, your theory also involves/entails IAD. Am I correct?

    2. Why must the fluid be compressible? Does it have a deep but not very obvious relevance in imparting the specifically quantum-mechanical character to your theory?

    Finally, to wrap up, here’s a suggestion to A+B:

    I have touched upon this point above, but wish to highlight it again, separately. I think it would help your cause if you explicitly establish how your theoretical constructs correspond to or lead to the postulates of the mainstream QM, esp. the nonrelativistic QM. In deference to the 80/20 rule, personally, I would suggest writing an article that’s accessible to someone who hasn’t read anything beyond the first half of Quantum Chemistry by McQuarrie. More sophisticated accounts could then address the remaining 20% (or even just 2%!) of the objections/queries.

    Ajit
    [E&OE]

  94. Greg Kuperberg Says:

      Equation 11 is the same as the Schrodinger equation… These are the same equations on which Bell’s analysis is based… I do not think this is controversial, but please tell me if it is not clear.

    Actually, it’s beyond controversial. Equation 11 is the *single-particle* Schrödinger equation. In order to violate Bell’s inequalities, you need the *multi-particle* Schrödinger equation. If you don’t have that, then the entire discussion is nonsense.

    Besides, you clearly are arguing in the alternative. In your comments here you accept Bell violations, but in your other paper you dismiss them as the result of loopholes.

  95. Slava Kashcheyevs Says:

    Robert #88

    It may look very complicated to you, requiring “a lot of details”, but there is hardly a more elementary exercise in two-particle quantum mechanics than constructing and classifying symmetric and anti-symmetric states. This is where sonons fail hopelessly.

    My point was not to request that you re-do all of physics, but to point out the overwhelming amount of evidence contradicting your model.

    But I’ve said enough. I’m not challenging anything (except the relevance of your sonon model to real particles), so the burden of proof is not on me. Have fun.

  96. Luboš Motl Says:

    Lou #90, I am not silly so be sure that I know all the differences between GR and Newton’s theory, too. But my point is that GR doesn’t restrict the data describing Newton’s gravitational forces. On the contrary, it adds some new degrees of freedom – the metric tensor which may sustain gravitational waves even in the absence of sources – which is made necessary by the fact that the force in GR has to obey the cosmic speed limit, the speed of light.

    But what is discussed here is a qualitatively different theory that would *steal* something from quantum mechanics. Clearly, the tensor-product-like exponentially growing Hilbert space (with all the complex linear superpositions allowed) seems too large and complicated to the authors discussed in this thread. So they want something “simpler” really in the sense that it subtracts the number of possible states.

    My comments about GR’s being a deformation of Newton’s theory were just an example of my broader claim that there doesn’t exist a single precedent in physics in which a working theory would be superseded by a qualitatively different one that reduces the “space of states” of the older theory. This just won’t happen. Using the words relevant here, it won’t happen that quantum computers will be made impossible because a hypothetical better, future theory will prohibit the entanglement or arbitrary superpositions with many qubits.

    Such a hypothetical evolution is indefensible, contradicts all known laws of quantum mechanics, has no historical precedent, and is only motivated by certain people’s limited intellectual abilities because these people just find QM too complicated and its Hilbert space too large. But it will never get smaller.

  97. Slava Kashcheyevs Says:

    Greg #94

    “If you don’t have that, then the entire discussion is nonsense.”

    Yes, they don’t have that (but do not seem to realize it) and the entire discussion is nonsense.

  98. Robert Brady Says:

    Bram Cohen #92 Yes. As you would expect for a model consistent with Bell, the sonon energy is delocalised and the processes are not necessarily causal. See for example the delocalisation of the energy in the spin-correlated |ud> + |du> state in #14 here.

  99. Anonymous Says:

    Greg #94: It’s even worse than you say. NO form of the Schrodinger equation (single-particle, multi-particle, whatever) is needed to show that QM violates the Bell inequalities. (The only thing one needs to say about time evolution is that the spin state does not change as the particles fly to the detectors.) Rather, it’s the general structure of the spin state, together with the Born rule, that leads to the violation of the Bell inequalities. What Bell showed is that this general structure of the spin state CANNOT be reproduced by ANY theory of classical probabilities that does not have instantaneous action-at-a-distance.

    In 1985, David Mermin wrote a fantastic article for Physics Today, “Is the moon there when nobody looks?”, which gives a beautifully clear explanation of Bell. It’s behind the paywall at Physics Today, but a google search turns up several places where it is freely available.
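    For anyone who wants to see the numbers, here is a small self-contained sketch of exactly the structure described above (no time evolution anywhere, just the two-qubit singlet spin state plus the Born rule, with the standard CHSH angle choices; it is only an illustration, not a substitute for Mermin’s article): the quantum CHSH combination reaches 2√2, while a brute-force check over every deterministic local assignment of outcomes never exceeds 2.

```python
import numpy as np
from itertools import product

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
def spin(theta):                       # spin measurement along angle theta in the x-z plane
    return np.cos(theta)*sz + np.sin(theta)*sx

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)   # (|01> - |10>)/sqrt(2)
def E(a, b):                           # quantum correlation <A(a) (x) B(b)> via the Born rule
    return np.real(singlet.conj() @ np.kron(spin(a), spin(b)) @ singlet)

a, a2, b, b2 = 0.0, np.pi/2, np.pi/4, 3*np.pi/4      # the usual optimal CHSH angles
S_quantum = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# classical (local deterministic) bound: maximize |S| over all +/-1 assignments
S_classical = max(abs(A1*B1 - A1*B2 + A2*B1 + A2*B2)
                  for A1, A2, B1, B2 in product([-1, 1], repeat=4))

print("quantum |S|  :", abs(S_quantum))   # 2*sqrt(2) ~ 2.828
print("classical max:", S_classical)      # 2
```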

  100. Anonymous Says:

    And, in many experimental tests of Bell, it’s PHOTONS, not ELECTRONS, that are used.

  101. John Sidles Says:

    Luboš Motl broadly claims “There doesn’t exist a single precedent in physics in which a working theory would be superseded by a qualitatively different one that reduces the “space of states” of the older theory.”

    The assertion is incorrect: the four-letter “GCAT” hereditary state-space of DNA — as broadly foreseen by von Neumann in a 1946 letter to Norbert Wiener — is a tight restriction of the (foggily envisioned) larger-dimension protein-template hereditary state-space that was espoused in the 1930s and 40s by luminaries like Linus Pauling.

  102. RobvS Says:

    Luboš Motl #96
    “Using the words relevant here, it won’t happen that quantum computers will be made impossible because a hypothetical better, future theory will prohibit the entanglement or arbitrary superpositions with many qubits.”

    I like the idea of doing a trick like Eric Verlinde did with gravity: making the “old” theory a statistical average (coarse graining) of the “new” theory. This approach not only accommodates entanglement; the holographic principle in fact demands massive amounts of entanglement.

    But I like that approach better mostly because my (lack of) math skills make it impossible for me to understand (super) string theory. Some thermodynamics I do understand.

  103. Robert Brady Says:

    Ajit #93 Thank you.

    Yes, the emergence of quantum motion from completely classical motion might well seem unintuitive, even after you have seen the videos of Couder’s experiments.

    No IAD – Instantaneous action at a distance is impossible in Couder’s experiments, even though they faithfully reproduce tunnelling, double-slit diffraction etc. Likewise, on our model, the probability a trajectory passes through (x, t) is just |\psi(x,t)|^2 and so there is no need for any wavefunction collapse, instantaneous or otherwise.

    The fluid must be compressible for the very ordinary reason that the speed of sound is theoretically infinite in an incompressible fluid.

    Thanks for the suggestion of a simple paper.

  104. Robert Brady Says:

    Greg #94 and Slava #95 Many thanks. I accept I did not provide an explicit spin superposition and show how it is measured.

    This is now here for the |ud> + |du> spin states of two R11 sonons. I hope it is clear how to do the others from this example.

    I am afraid R11 sonons are spin-half and we are not ready to publish with spin-1.

  105. Anonymous Says:

    In #103 we see that Dr. Brady (as predicted) still just doesn’t get it. (Trying html tags, let’s see if they work.) Reproducing some aspects of quantum phenomena with a local classical model is certainly possible, but reproducing all aspects is certainly not. That’s what Bell proved. But it was strongly suspected long before Bell, since all attempts to construct such a model had failed.

  106. Greg Kuperberg Says:

    Anonymous #99 – Certainly the two-particle Schrödinger equation, with or without spin, does violate Bell’s inequality and other Bell-type inequalities. And certainly one single particle cannot violate any such inequalities in any straightforward way. It also doesn’t matter whether you use photons or electrons. Bell violations are a pervasive phenomenon of quantum probability as it applies to almost any type of joint quantum state.

  107. Scott Says:

    Greg #106: Actually, you can perfectly well violate a Bell inequality with just a single particle, in the “entangled” occupation-number state |0⟩|1⟩+|1⟩|0⟩! It requires a more subtle measurement, but apparently it’s even been demonstrated experimentally. This is a point that I was long confused about myself, but see for example this delightful paper by van Enk.

  108. John Sidles Says:

    On further reflection, Luboš Motl’s comment (#96) provides us with that valuable entity, a Great Truth (namely, a Truth whose opposite also is a Truth):

    Luboš Motl broadly claims “There doesn’t exist a single precedent in physics in which a working theory would be superseded by a qualitatively different one that reduces the “space of states” of the older theory.”

    In addition to the genetic example (of #101), we have also:

    A mathematical example  The restriction of elliptic curves to finite fields yields (along with much elegant mathematics) elliptic curve cryptography.

    A condensed matter/field theory example  Ken Wilson’s renormalization group method systematically replaces (many) microscopic degrees of freedom with (fewer) macroscopic degrees of freedom, so as to usefully make physical sense of (i) phase transitions and (ii) the divergences of field theory.

    Thus we appreciate the dual aspects of …

    Luboš Motl’s Great Truth  In the 21st century the 20th century’s Dirac/Hilbert quantum dynamics foreseeably will — or foreseeably won’t! — “be superseded by a qualitatively different dynamics that reduces the ‘space of states’.”

  109. Anonymous Says:

    Greg #99: We appear to have a semantics problem. By “the Schrodinger equation”, I meant just the time-evolution equation of QM. Violations of Bell in QM are not dependent on how the system evolves in time. As best I can tell, what you mean by “the Schrodinger equation” includes the entire superstructure of QM (states, observables, Born rule, etc.). I was trying to distinguish this superstructure from the Schrodinger equation itself.

  110. Anonymous Says:

    Oops, I meant Greg #106.

  111. Greg Kuperberg Says:

    Scott – Almost, but I don’t think that you actually can. You can certainly create that state which is entangled in an occupation number basis. However, you will only see non-locality if you apply measurements that have a chance of creating a second particle. So, no dice I think.

    In any case, certainly the actual Bell violations are done with two particles in an entangled state, even if you could in principle measure two boxes with entangled occupation.

  112. Scott Says:

    Greg #111: This paper by Babichev, Appel, and Lvovsky claims to have actually achieved an experimental Bell violation (subject to the usual detection loophole), using a single delocalized photon as the sole entangled resource. (Yes, I’m sure the measurements involve additional particles both on Alice’s end and on Bob’s end, but so what?)

  113. Greg Kuperberg Says:

    Scott – This is interesting enough that I must keep quiet until I understand it better. 🙂

  114. Ajit R. Jadhav Says:

    Robert #103:

    Oh, you are welcome! But…

    Let me wrap up, somewhat at a length. (I will sure check back for comments and all, but as far as I am concerned, the wrap up for this thread seems to be fast approaching.)

    1. I do think that a part of the problem lies with the way you (+Ross) have written the paper—it covers too much territory, too fast.

    For an astonishing prediction setting concrete limits on the number of coherent qubits, the prior discussion is so sparse as to be almost absent. I was interested in the 3D case, and so did a word search on “four” in your A+B paper. The only places it appears are the abstract and the conclusions! That’s rather like the Copenhagen quantum—it’s there only when measured, at emission (abstract) and absorption (conclusion). … I also dare suggest that you once again check your logic. Chances are very extremely bright^{bright} that the result holds only under a restricted set of auxiliary conditions.

    2. BTW, you said (#103) no IAD, but you still didn’t quite directly clarify if your theory makes predictions that are quantitatively identical to those of the mainstream QM theory, or not. (By mainstream QM, I mean any of the nine+ interpretations or treatments in (students+Dan Styer)’s paper.)

    The reason I insist on this part is that I myself have had a preliminary (conference) paper on a new approach to QM (of only photons, so far); my approach in principle leads to a quantitatively different prediction (though I don’t know except in broad outlines how to work out its detailed maths).

    3. Coming back to your research: A simpler paper is certainly needed, but also a paper that at least addresses all the stages of the quantum evolution in a simple example case, if not also presenting a working C++ simulation for it.

    4. Also, I would suggest: In that paper, please make a clean break from Couder et al’s work. It simply confuses people.

    It’s obvious that Couder’s work does not reproduce all aspects of QM. Even if we assume a simplest model of the universe consisting of just electrons + photons, if the dancing droplets are taken to be electrons, then since the waves induced by the droplets are the force-carriers in this model, they should represent photons. However, in the Couder model, such “photons” are not quantized—they are not localized in space, as the real photons in a single-photon-at-a-time diffraction experiment would show. Naturally, the Couder model is insufficient in terms of how much of a quantum character it can fake. (And that is apart from the very simple question that had struck me as soon as I read about it the very first time around 2010, the same time it got covered in the MIT News: Who/what vibrates the universe, especially in 3D? My other question was: In 3D, how precisely does a droplet induce waves?)

    Now, yours is a different model. It is a “purely” mathematical model, not fully realizable in a classical experiment. The classical fluid isn’t inviscid. Qua a mathematical model, it would be possible to overcome the limitations of the Couder model in it. If so, why make a reference to the Couder model at all?

    5. Finally, I sense that I might have other issues about your sonon model. I mean some deeply physical issues (not mathematical); e.g., things like: the existence of the singularity at the sonon surface, i.e. the very existence of a sharp boundary surface. The Aristotelian law of the excluded middle entails that a physical theory cannot carry singularities; they can only be projected (i.e. imagined) mathematical entities/features, without any physical existence. Or, as Roger Schlafly’s blog highlights: “natura non facit saltus.”

    And, I would seek a detailed picture of the interaction of an “electron-type” (i.e. the Fermi/matter particle) sonon with a “photon-type” (i.e. the Bose/force-carrier) sonon—including, whether, and if yes, precisely how an electron-sonon absorbs a photon-sonon, what the pair physically looks like after the absorption; what makes the electron-sonon emit the photon-sonon, etc.

    6. To (finally!) wind up:

    If in theory you take a clean departure from Couder’s model (it can continue to be a part of a motivation section, but little more), supply the correspondence with the postulates, and then if you could also supply a comprehensive account (ideally, with a C++ program) of an elementary but complete case (e.g. double-slit diffraction), apart from addressing issues like the above, I would be very, very happy to read it. And I am sure many others would, too. So, kindly keep us posted.

    Best,

    Ajit
    [E&OE]

  115. Scott Says:

    Luboš #96:

      My comments about GR’s being a deformation of Newton’s theory were just an example of my broader claim that there doesn’t exist a single precedent in physics in which a working theory would be superseded by a qualitatively different one that reduces the “space of states” of the older theory. This just won’t happen.

    Why isn’t the holographic principle, which reduces the naïvely infinite-dimensional Hilbert space of QFT to the finite-dimensional Hilbert space of quantum gravity, a counterexample to the above claim?

  116. Luboš Motl Says:

    Dear Scott #115,

    the holographic principle isn’t an example because, as you correctly said, the Hilbert space of quantum gravity is infinite-dimensional only “naively”, not according to a working theory.

    There hasn’t ever been a working, internally consistent theory of quantum gravity that would have an infinite-dimensional Hilbert space even for finite regions. This is an inconsistent assumption and this situation is different from quantum mechanics of 6 qubits which is an internally consistent – and experimentally tested – theory in physics.

    Of course that from some broader viewpoint, namely if you allow some somewhat inconsistent theories to the mix, the holographic principle *is* an example of exactly what I say has no examples. But as I already discussed above, it’s an example that happened and could happen only in some very extreme regime and it had physical consequences.

    The claims of the type “quantum computers aren’t allowed by the right laws of physics” are, on the contrary, claims about a completely non-extreme, low-energy physics that has been tested indefinitely so one can’t find any meaningful inequality that would separate the regime in which QM works as tested and in which it would be replaced by a “smaller” theory.

    Cheers
    LM

  117. Simon J.D. Phoenix Says:

    One can observe violations of the mathematical inequality we call the Bell inequality with single particles – and this can be used to furnish a single-particle QKD scheme (the correlations existing between state preparation and measurement). The eavesdropping test then amounts to determining whether or not the inequality is violated.

    I think it was Boole who showed that if we have 3 random variables A, B and C, then the joint probability P(A,B,C) that correctly reproduces the marginals P(A,B) etc can only be constructed if the marginals satisfy what we call the Bell inequality today.

    Any proposed classical model of entanglement must therefore be able to reproduce this ‘non-existence property’ for P(A,B,C) for certain choices of A, B, and C and that’s before we add non-locality into the mix.
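    To make Boole’s observation concrete, here is a small sketch (my own toy example, using scipy’s linear-programming routine; the angle choices are mine) that asks the existence question directly: given pairwise correlations among three ±1 variables, is there any joint distribution P(A,B,C) reproducing them? For singlet correlations at “classical-compatible” angles the linear program is feasible; at Bell-violating angles it is infeasible, i.e. the ‘non-existence property’ above.

```python
import numpy as np
from itertools import product
from scipy.optimize import linprog

def joint_exists(c_ab, c_bc, c_ac):
    """Is there a distribution over (A,B,C) in {+-1}^3 with these pairwise correlations?"""
    assignments = list(product([-1, 1], repeat=3))       # the 8 deterministic assignments
    A_eq = [[1]*8,                                       # probabilities sum to 1
            [a*b for a, b, c in assignments],            # <AB> = c_ab
            [b*c for a, b, c in assignments],            # <BC> = c_bc
            [a*c for a, b, c in assignments]]            # <AC> = c_ac
    b_eq = [1.0, c_ab, c_bc, c_ac]
    res = linprog(c=[0.0]*8, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)]*8)
    return res.success                                   # feasible <=> a joint P(A,B,C) exists

def correlations(a, b, c):
    """Pairwise correlations among the three local values implied by singlet statistics."""
    E = lambda x, y: -np.cos(x - y)          # singlet prediction for a joint measurement
    return (-E(a, b), -E(b, c), -E(a, c))    # minus sign: the second particle is anti-aligned

print(joint_exists(*correlations(0.0, np.pi/2, np.pi)))        # True: classical-compatible angles
print(joint_exists(*correlations(0.0, np.pi/3, 2*np.pi/3)))    # False: Bell-violating angles
```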

    I was surprised by the A + B paper – Ross is a well-known, and respected, figure in the security community and he’s done some really cool stuff. Readers of this blog will understand what I mean when I implore Ross not to become a Christian!

  118. Lou Scheffer Says:

    Lubos #96 says:

    “There doesn’t exist a single precedent in physics in which a working theory would be superseded by a qualitatively different one that reduces the “space of states” of the older theory.”

    Even special relativity and QM could be counterexamples. In Newtonian mechanics, the velocity could be anything; in SR it’s limited to the subset less than the speed of light. In classical mechanics, a harmonic oscillator can have any energy, but in QM only a discrete subset is available, reducing the state space from an uncountable infinity to a countable one.

  119. T H Ray Says:

    Scott #115

    I don’t want to comment on the merits of the Anderson-Brady paper in a site dedicated to its perfunctory dismissal. Where the philosophy of science is concerned, however, Lubos has quite valid arguments.

    Even though the holographic principle is not a true theory, it would not be a counterexample to his claim. Just as relativity subsumes Newtonian physics in the limit, holography constrains the physics of quantum mechanics to the limit of physical manifolds. In this respect, at least, it extends relativity in the same context that relativity extended Newtonian mechanics.

    Point is, as Lubos implies, there is a backward-forward relation between all physical theories and principles; none exist in a thought-vacuum. This may be different way of saying that there is something rather than nothing, but it isn’t trivial.

    (Hope you can stay out of the weather today in Boston.)

    Tom

  120. Luboš Motl Says:

    Dear Lou #118,

    nope, the same problem with the range of validity affects your other examples, too. One may define the regime of validity of non-relativistic mechanics to be one in which the speeds are smaller than a limit, c, and the regime of validity of classical physics to be one in which the angular momentum or action or (delta x)*(delta p) are much greater than Planck’s constant.

    The point is that these old theories weren’t established as theories for all regimes, however extreme they are. So non-relativistic mechanics fails at high velocities, classical physics fails at tiny angular momentum, actions, or attempted tiny uncertainties of momentum and position, the local theories of gravity fail when one tries to compress too much information (like black hole entropy) into a small region.

    But the new theories always confirm the state space in the relevant approximation. That’s different than the claim here because 5 qubits isn’t extreme in any sense, yet those folks want to claim that basic QM becomes invalid. There isn’t any quantity such as speed, angular momentum, action, products of uncertainties, entropy density per area or anything else that would be extreme in mundane low-energy systems with 5 qubits, so if one claims that QM is wrong, he’s claiming that it’s giving totally wrong predictions everywhere which it clearly doesn’t.

    Don’t tell me you don’t understand what I am saying. In all your conventional examples, the newer theory almost perfectly confirms the older theory’s description of all mundane experiments one may do in the labs. This case is claimed to be different because even the mundane things are claimed to be wrong in the old theory – QM.

    Cheers
    LM

  121. John Sidles Says:

    Luboš Motl proposes another Great Truth:

    Luboš Motl proposes (#116) “Claims of the type ‘quantum computers aren’t allowed by the right laws of physics’ are claims about a completely non-extreme, low-energy physics that has (or has not?) been tested indefinitely.”

    LOL  we appreciate the duality of Luboš’ Great Truth when we reflect (as one example) upon the persistent confusion and controversy — both experimental and theoretical — regarding the quantum Third Law, and in particular quant-ph/0703152 illustrates how subtle these issues can be.

    When relativistic gauge field theory enters (as it always does in designing practical experiments) the “non-extreme, low-energy physics” becomes even more subtle. Example, what obstructions have (so far) prevented the theoretical literature from reliably assessing the feasibility of scalable Aaronson/Arkhipov n-photon source/detector systems?

    Conclusion  Introductory quantum texts — like Feynman’s Lectures and Nielsen and Chuang’s Quantum Computation and Quantum Information for example — commonly skirt certain “completely non-extreme, low-energy physics” theoretical issues … but 21st century experimentalists and engineers are not permitted this luxury!

    That is why numerous creative & insightful articles are continuing to extend our still-immature understanding of these “completely non-extreme” quantum physics topics. A long journey toward understanding awaits us … which is good!

  122. Lou Scheffer Says:

    Lubos #120,

    There are two different statements here. The first is that a new theory is bogus if it cannot reproduce the well-known results from the previous theory. On this we agree completely.

    The second is that you can tell that a theory is bogus if it reduces the state space of the previous theory. This I do not believe since it is entirely possible that the problem with the old theory is that the state space was too big (bigger than reality). This is exactly what happened with QM – the old theory, with the old state space, gave results that contradicted experiment (there was no ultraviolet catastrophe, and atoms did not radiate until they collapsed). By reducing the state space to quantized values these problems were fixed. Importantly, this restriction did not screw up previous well-verified results using macroscopic objects, which were shown to be a limit of the new theory.

    I am in no way defending the new QM theory discussed here – we both agree it’s bogus. However, it’s not bogus just because the new state space is smaller – it’s bogus because it contradicts existing experiments.

  123. Luboš Motl Says:

    Dear Lou #122,

    I believe that I have already clarified the statement that one cannot reduce the state space of the previous theory. I am talking about the state space for a particular situation – such as an 8-qubit experiment in a low-energy lab considered here.

    In all the historical examples, the space of states was preserved and/or “infinitesimally” deformed or extended by things that are invisible in the everyday situation, and so on. In this 8-qubit case, it’s claimed that the space of states has to be something qualitatively different, which *does* violate the known observations, because the known observations imply the laws of physics that inevitably hold for the 8-qubit situation as well – simply because there’s no conceivable variable that would become more extreme in the 8-qubit case and that would invalidate QM in this context while preserving its experimentally tested success in the well-known contexts.

    LM

  124. John Sidles Says:

    Great Truth (version III)

    Luboš Motl now asserts (#123): “In all the historical examples, the space of states [of new dynamical physics?] was preserved and/or ‘infinitesimally’ deformed or extended by things that are invisible in the everyday situation, and so on.”

    Even by a generous interpretation, it’s hard (for me) to extract useful lessons from Luboš’ most recent assertion.

    The evolution of the concept of entropy provides an instructive case history. In classical physics entropy is a well-posed geometric entity: the logarithm of the symplectic volume of a level-set. And in quantum physics entropy is given as a well-posed algebraic entity: von Neumann’s logarithmic trace. Yet it’s far from self-evident (to me) that the latter entropy is an “infinitesimally deformed” (in Luboš’ phrase) version of the former entropy.

    Q1  Are there any thermodynamical textbooks that even attempt a formal mathematical demonstration that these two definitions of entropy are (for practical purposes) equivalent?

    Q2  Are there any texts that provide even a qualitative explanation of why this question hasn’t been easy to answer rigorously?

    ———–

    Conclusion  One lesson of history (as it seems to me) is that Lou Scheffer’s post #122 provides solid common sense guidance. Thank you, Lou, for that excellent post!

  125. Luboš Motl Says:

    Dear John #124,

    the reason why you don’t understand that von Neumann entropy is just the quantum deformed version of the log of the volume in the phase space is that you don’t understand basic physics.

    The logarithm of the volume in the phase space is just the Shannon entropy from a statistical distribution that is uniform over the volume (and normalized) and the von Neumann entropy is nothing else than the Shannon entropy in which the probability distribution has been uplifted to an operator, just like everything in quantum mechanics. At any rate, the generalization is totally straightforward because the eigenvalues of the density matrix play exactly the same role as the individual values of the classical probability distribution on the phase space.

    In both cases, the logarithmic formulae are multiplied by Boltzmann’s constant k, in order to get the entropy that was first extracted in the thermodynamic limit and that has values of order one in situations with macroscopic numbers of degrees of freedom.
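
    To make the correspondence concrete, here is a minimal numerical sketch (assuming NumPy; the particular mixed state is an arbitrary illustrative choice, and Boltzmann’s constant is set to 1): the von Neumann entropy −Tr(ρ log ρ) equals the Shannon entropy of the density matrix’s eigenvalues, and for a maximally mixed state on d levels it reduces to log d, the quantum counterpart of “log of the occupied phase-space volume.”

    ```python
    import numpy as np

    def von_neumann_entropy(rho):
        """-Tr(rho log rho), computed from the eigenvalues of rho (k = 1)."""
        eigs = np.linalg.eigvalsh(rho)
        eigs = eigs[eigs > 1e-12]                  # drop numerical zeros
        return float(-np.sum(eigs * np.log(eigs)))

    def shannon_entropy(p):
        """-sum_i p_i log p_i for a classical probability distribution p."""
        p = np.asarray(p, dtype=float)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log(p)))

    # An arbitrary mixed state of one qubit: equal mixture of |0><0| and |+><+|
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    rho = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.outer(plus, plus)

    print(von_neumann_entropy(rho))                       # quantum entropy of rho
    print(shannon_entropy(np.linalg.eigvalsh(rho)))       # same number, via the eigenvalues
    print(von_neumann_entropy(np.eye(4) / 4), np.log(4))  # maximally mixed state: log d
    ```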

    If you don’t understand that all these formulae for entropy are really the same, you should repeat your basic undergraduate courses of statistical mechanics and thermodynamics instead of pretending that you are participating in discussions about cutting-edge physics.

    Cheers
    LM

  126. Robert Brady Says:

    Wrap-up — Thank you all for your comments. It has been stimulating to interact on a site dedicated to a collective refutation of our papers on quantum computing and the irrotational motion of a compressible inviscid fluid.

    My key takeaway: in order to convince the quantum computing community, we need to analyse the symmetries and interactions of two-body and many-body sonon spins, and show explicitly that they obey the statistics in Bell’s original paper. It is not enough simply to observe that the relevant equations are the same as those of Cramer’s and Mead’s models, and conclude that they therefore have this property.

    And yes, it is reasonable for this community to ask for further progress in that direction. Time for us to roll up our sleeves — if others do not get there before us.

    Thank you again

  127. Hal Swyers Says:

    “This is exactly what happened with QM – the old theory, with the old state space, gave results that contradicted experiment (there was no ultraviolet catastrophe, and atoms did not radiate until they collapsed). By reducing the state space to quantized values these problems were fixed”

    Hold your horses, compadre. This is pretty far off the mark. I think it’s pretty clear to anyone that the state space of observables hasn’t really reduced at all. I can arbitrarily boost any reference frame and make any portion of the state space accessible. What QM did was show that there are certain stable time-independent solutions which have a somewhat privileged status. I would argue that there is a limit on our sampling of state space, but state space has shrunk in any sense.

  128. Scott Says:

    Hal Swyers #127: Yes, it’s an ironic feature of QM that it shrunk the “effective” state space of the orbiting electrons, only by dramatically expanding the “true” state space!

  129. Bram Cohen Says:

    Has anyone else noticed that Brady is using this as an opportunity to link to his papers as many times as possible to boost his SEO?

  130. John Sidles Says:

    Luboš Motl says: “If you don’t understand that all these formulae for entropy [Boltzmann, von Neumann, Shannon] are really the same, you should repeat your basic undergraduate courses of statistical mechanics and thermodynamics.”

    Thank you for your generous advice, Luboš Motl, which exhibits a commendable verbal vigor! Now let us consider its foundations in mathematics and physics.

    As was mentioned earlier (#70), it is regrettably true that a great many natural dynamical systems are excluded from “undergraduate courses of statistical mechanics and thermodynamics.” E.g., when considering the classical-to-quantum pushforward it is natural to ask questions like: What is the Boltzmann/von Neumann/Shannon entropy of a (classical) ideal gas of Chaplygin sleighs interacting by weak potentials? How can such systems be quantized most naturally?

    The dual set of quantum-to-classical pullback questions is similarly rich, and here — as one example among hundreds — it is a fun exercise to read the titles of Guifre Vidal’s recent articles as provisional answers to wonderful questions: A real space decoupling transformation for quantum many-body systems (2012, arXiv:1205.0639); and Entanglement renormalization and gauge symmetry (2010, arXiv:1007.4145); and Infinite time-evolving block decimation algorithm beyond unitary evolution (2010, PRB v78p155117). Obviously very many more such recent articles could be cited!

    The thrust of comment (#70) is that the 20th century undergraduate mathematical curriculum leaves 21st century physics and engineering students ill-prepared to appreciate the burgeoning literature on classical-to-quantum pushforwards, and the dual literature of quantum-to-classical pullbacks … not to mention the literature whose dynamical state-spaces are more-than-Newton/less-than-Hilbert (e.g. the tensor-product state-spaces cited above).

    What are the wonderful questions, to which the above-mentioned articles (and hundreds more) are providing us with enticing (but provisional and incomplete) answers? Aye, lasses and laddies, now *that’s* a central question for the 21st century STEM enterprise!

    Proposition  It may eventuate that the state-space of string theory — whatever that state-space may be! — is not the state-space of Nature. As for whether Nature’s state-space permits fault-tolerant quantum computing (or does not permit it) this issue too is wholly undecided at the present time. Yet in all eventualities, the transformative advances in the capabilities of practical systems engineering — classical, quantum, and the emerging hybridized methods — that are associated to the mathematical naturality of the pushforward/pullback methods of string theory, and the mathematical naturality of the informatic methods of quantum information theory, are sufficient already to more-than-justify society’s investment in stringy quantum informatic arcanæ.

  131. Hal Swyers Says:

    Scott #128 No doubt! Certainly QM is better endowed than CM, it just like to flaunt the fact! I think future generations will look back and laugh at those advocating for smaller spaces and ask, “Well, how then would we make the cancellations?”

  132. Amir Safavi-Naeini Says:

    I think “Bell’s Theorem? ‘Tis but a flesh wound!” should be a new category on your blog. (though it would overlap pretty strongly with “Rage Against Doofosity”..)

  133. Hal Swyers Says:

    couple of minor corrections…
    in #127
    “but state space has shrunk in any sense.”
    is
    “but state space has not shrunk in any sense.”

    in #131
    “it just like to flaunt the fact!”
    is
    “it just doesn’t like to flaunt the fact!”

  134. Greg Kuperberg Says:

    In order to convince the quantum computing community, we need to analyse the symmetries and interactions of two-body and many-body sonon spins, and show explicitly they obey the statistics in Bell’s original paper.

    Yes, especially if you also retract claims that experimental Bell inequality violations are illusory. Then you would convince the community that you’re not trying to contradict quantum mechanics. You wouldn’t convince anyone that quantum computers can’t work.

  135. chorasimilarity Says:

    Why does not (pen+paper+QM textbook) count as a classical simulation of any QM system?

  136. Scott Says:

    Amir #132: Thanks very much! I’ve adopted your suggestion.

  137. John Sidles Says:

    chorasimilarity asks “Why does not (pen+paper+QM textbook) count as a classical simulation of any QM system?”

    The key is the efficiency of that simulation.

    It is striking that, in all of human history, no-one has ever measured an experimental data-set that (under standard complexity-theoretic assumptions) provably cannot be simulated with computational resources that are polynomial in the bit-length of that data-set. Moreover, thanks to ongoing “Moore’s Law” advances in computer hardware and simulation algorithms, the principle “all feasible experiments are efficiently simulable” is nowadays strikingly true even of quantum dynamical systems that formerly were considered to be computationally intractable.

    The Skeptic’s Postulate  The empirical simulability of feasible experiments reflects a law of nature that requires either fundamental modifications to QM, or else fundamental modifications to our appreciation of the experimental implications of QM (or both).

    The Enthusiast’s Postulate  No fundamental extensions of QM are required: it is necessary only that we be adequately ingenious in designing scalable means of fault-correction in quantum computing and/or scalable means of sourcing/sinking n-photon quantum states in Aaronson-Arkhipov experiments (etc.). The ensuing unsimulable data-sets will experimentally demonstrate that the Skeptic’s Postulate is wrong!

    ————

    The Shtetl Optimized comments (so far) show us plainly that the weakest QM-skeptical arguments are comparably unconvincing to the weakest QM-enthusiastic arguments. As for convincingly strong arguments, none of the preceding 135 Shtetl Optimized comments has demonstrated (to me) that either the Skeptics or the Enthusiasts have any!

  138. Joe Shipman Says:

    Why do you compare monarchists and segregationists to QM-deniers? The monarchists and segregationists may have been losing the political battle, but it is not a consequence of their political theory that their success is assured, so failure does not invalidate the righteousness of their opinions. This is in contrast with Marxism, which DOES assert its historical inevitability and is therefore falsified by failure. On the contrary, the monarchists can point to the failures of politics in democracies due to the shortsightedness of politicians who only see as far ahead as the next election, and find their theories confirmed by experience of non-monarchism. The case of the segregationists is similar; you can argue against them on moral grounds, but it’s not obvious that experience has disproved their theories.

  139. Scott Says:

    Joe Shipman #138: You make an interesting point. I suppose my analogy was based on the empirical fact that most (though not all) firm believers in a political ideology, also predict a future where increasing numbers of people will agree with them—or at the least, they don’t predict that their ideology will nearly vanish from the face of the earth. (If they did expect that, then being the herd animals most humans are, they’d probably switch ideologies!) Even if they predict a huge “temporary” setback (e.g., losing a war), they typically also predict that far enough in the future, the world will come to see the martyrdom and heroism of their cause.

    For this reason, I submit that, while it’s not necessary as a matter of principle, in fact most political ideologies are pretty tightly coupled to empirical predictions about the future of humankind: for example, “the world will come to see the rightness of superior races enslaving or exterminating inferior ones.” And many of those predictions have been pretty dramatically falsified. And those falsifications have indeed created huge problems for the modern “ideological descendants” of the people who made the predictions, at least if they care about history at all (many don’t).

    You’re right that all of this is most obvious in the case of Marxism (or, say, apocalyptic religions), which have included predictions—often falsified ones!—as explicit parts of their ideology. (Arguably Nazism also counts, because of its explicit prediction of the “Thousand-Year Reich.”) But my claim is that even the ideologies that don’t include “explicit” predictions—e.g., liberal democracy, segregationism, monarchism—almost always contain “implicit” predictions that are accepted by almost all their adherents, as a major reason for subscribing to the ideology at all.

  140. John Sidles Says:

    Scott’s thesis that quantum skepticism≡monarchy is a Great Analogy … whose opposite therefore assuredly also is a Great Analogy. To appreciate this we ask:

    A Quantum Trivia Question  In the years 1918–1933, seven physicists received Nobel awards for their seminal roles in the conception of quantum mechanics. How many of the seven were born natives of monarchies?

    Answer  Six-of-seven were born citizens of monarchies (Planck, Einstein, Bohr, Heisenberg, Schrödinger, and Dirac). The sole exception is named by the Nobel website as “Prince Louis-Victor Pierre Raymond de Broglie” … a hereditary French title of nobility!

    When we remark that David Hilbert was himself born under the reign of Prussian monarch William I, and that Nobel Prizes have always been presented by the reigning Swedish Monarch (presently Carl XVI Gustaf!), the conclusion is unassailably evident:

    An Inarguable Fact  Belief in the absolute physical reality of Hilbert/Dirac quantum dynamics is at present, and historically always has been, nurtured by monarchy.

    Comments (#44) and (#54) adduce more evidence for the Great Analogy of (quantum enthusiasm)≡(nurture by monarchy), yet surely the historical evidence already cited will suffice to convince any “calm person”!

  141. srp Says:

    Scott #139:

    Your understanding of conservative intellectual sensibilities and ideology is seriously incomplete. Declinism, fatalism, etc., are the default mode for a good chunk of the right. You can observe this for yourself at the famous end by noting the deep-rooted pessimism of Whittaker Chambers, who was sure he had switched to the losing side. You can observe it at the anonymous end by perusing blog comments on any anti-immigrant or social-conservative site. Excessive wallowing in gloom is actually a perennial vice that the more self-aware conservatives try to police. One reason Reagan made such an impact on the movement is that as a converted FDR Democrat he brought a dose of that optimistic happy-warrior spirit from the other side.

  142. Scott Says:

    srp #141: I’m well-aware of conservatives who wallow in doom-and-gloom prophecies; I even know the sort of fatalistic blog comments on social-conservative sites that you’re talking about. But I thought the whole appeal of a doom-and-gloom prophecy is the idea that, when the apocalypse finally arrives, the world will see that you were right!

  143. srp Says:

    Scott #142:

    I’m not sure who the post-apocalyptic audience would be for the I Told You So. The really hard-core types believe in cyclical theories in which The Gods of the Copybook Headings are independently rediscovered after the decline and fall stage. But some do take satisfaction that they will someday be vindicated.

    Of course, modern environmentalism has similar “after you’re all boiling/poisoned/missing the pretty biota/genetically mutated/Soylent Green you’ll see I’m right” tendencies. Civil-libertarian types sometimes entertain post-police-state ITYS fantasies, too. Human nature cuts across ideologies and philosophies.

  144. John Sidles Says:

    Scott proposes  “My claim is that even the ideologies that don’t include ‘explicit’ predictions — e.g., liberal democracy, segregationism, monarchism — almost always contain ‘implicit’ predictions that are accepted by almost all their adherents, as a major reason for subscribing to the ideology at all.”

    Concrete historical support for Scott’s thesis may be found in two very readable surveys of belief systems: Rabbi Abba Hillel Silver’s scholarly A history of Messianic speculation in Israel from the first through the seventeenth centuries (1927) and Martin Gardner’s humorous (yet still scrupulously factual) works that include Fads and Fallacies in the Name of Science (1957) and Urantia: the Great Cult Mystery (1995). E.g. in Silver we read:

    The pathetic eagerness to read the riddle of Redemption and to discover the exact hour of the Messiah’s advent […] proceeded with varying intensity clear down the ages. At times it seems to be the idle speculation of leisure minds, intrigued by the mystery; at other times it is the search of people in great tribulation. […] Great political changes, boding weal or woe, accelerated the tempo of expectancy. […] The rich fancy of the people, stirred by the impact of these great events, sought to find in them intimations of the Great Fulfillment.

    Recent quantum computing works like the first-edition and second-edition QIST Roadmaps (LA-UR-02-6900, 2002 and LA-UR-04-1778, 2004) surely can be read as science, yet Scott’s proposition builds upon the great tradition of Silver and Gardner, in encouraging us to read the QIST Roadmaps also as technological prophecy founded upon belief in the absolute physical reality of Dirac/Hilbert state-spaces.

    Is there an element of ideological/Messianic fervor among the most ardent enthusiasts for quantum computing? A faith sufficiently strong, that the evident shortfall of the QIST timelines cannot shake it? What are the consequences of subjecting faith to the trials of science? If it happens that the messiah of FTQC tarries, for how many generations should physicists retain their devout faith in the inerrant scripture of Hilbert and Dirac? And in particular, why are the strongest defenses of orthodoxy so commonly lacking in humility and humor? These are the trans-disciplinary questions that first Silver’s essays, then Gardner’s essays, and now Scott’s essays, encourage us to ask.

    In Silver’s history we read Rabbi Jonathan’s “Perish all those who calculate the end, for men will say, since the predicted end is here and the Messiah has not come, he will never come”, and to make the same point more positively (and subtly), there is Maimonides’ creed “I believe with a full heart in the coming of the Messiah, and even though he may tarry, I will wait for him on any day that he may come!”

    Does Maimonides’ creed apply to the QIST Roadmaps? Should it? These are terrific questions!

  145. wolfgang Says:

    scott #142, referencing srp #142 😎

    >> the whole appeal of a doom-and-gloom prophecy

    I think the appeal is to ‘know’ that everything other people are doing is futile and foolish.

    The best example imho is zerohedge.com – predicting financial doom-and-gloom since 2009 (when the financial crisis hit bottom).

    Their followers witness other people (fund managers etc.) make lots of money (and they follow this in great detail), but they ‘know’ that in the end all will be lost, which explains why it is such a popular website.

  146. Scott Says:

    “scott #142, referencing srp #142”

    Sorry, fixed 🙂

  147. jonas Says:

    Oh no! Now you’re saying that you won’t be able to have the quantum and anti-quantum fanatics fight in a gladiator arena and cancel out each other because they’re on the same side?

  148. Eliezer Yudkowsky Says:

    Come to the Dark Side, we have cookies!

  149. John Sidles Says:

    Scott (#11) avers  “The arc of science is long, but it bends toward quantum.” 😉

    Excellent! 🙂

    Aside: the fabled Quote Investigator has researched the fascinating origins and evolution of this fine saying.

    Yet on the other hand we have:

    Henry Ford (apocryphal?) “If I had asked my customers what they wanted, they would have told me a  faster horse   proof of P≠NP  quantum computer.”

  150. RNH Says:

    John Sidles, are you a currently practicing physicist?

  151. John Sidles Says:

    Scott Aaronson poses the question: “of whether there’s any pair of quantum computing skeptics whose arguments for why QC can’t work are compatible with one another’s.”

    For the answer to Scott’s question to be “no”, all forty-six of Gil Kalai’s tabulated objections to quantum computing (beginning here and ending here) would have to be mutually exclusive … which on probabilistic grounds alone scarcely seems likely! 🙂

    Gil’s list is wonderfully compatible (as it seems to me) with celebrated passages by Donald Knuth and Derek deSolla Price:

    Donald Knuth  Science is what we understand well enough to explain to a computer. Art is everything else we do. […] Science advances whenever an Art becomes a Science. And the state of the Art advances too, because people always leap into new territory once they have understood more about the old.
    —————
    Derek deSolla Price  It is not just a clever historical aphorism, but a general truth, that “thermodynamics owes much more to the steam engine than ever the steam engine owed to thermodynamics.” […]

    The dominant force of the process we know as the Scientific Revolution was the use of a series of instruments of revelation that expanded the explicandum of science in many and almost fortuitous directions […]

    Historically the arrow of causality is largely from the technology to the science.[…]

    The history of science is only partly a flow of intellectual steps. The other part is the craft of experimental science, which is part of the history of technology. Each radical innovation in this craft tradition gives rise, not to the testing of new hypotheses and theories, but rather to the provision of new information which affects what scientific theories must explain.[…]

    This process, which I describe as ‘artificial revelation,’ is at the root of many paradigm shifts, perhaps not all, but most. In these cases the paradigm shift comes about because of a change in the technology of science which may be rather trivial and is almost always an intruder from some vastly different current in the history of technology.

    These passages motivate us to regard Gil Kalai’s forty-six skeptical avenues as intertwining paths — some paths surely more promising than others … but which? — that lead generally toward the conception (in deSolla’s idiom) of a 21st century “synthetic revelation” that extends “the explicandum of quantum dynamics” with sufficient rigor that (in Knuth’s idiom) “the present Art of quantum dynamical simulation becomes a Science.”

    The Knuth/Price considerations lead us to reflect that perhaps it scarcely matters — for the next few decades anyway — whether the quantum dynamical state-space of Nature is absolutely non-Hilbert/Dirac versus effectively non-Hilbert/Dirac (to answer Scott’s question by citing two distinct-yet-compatible grounds for quantum computing skepticism).

    In either eventuality scalable quantum computing is infeasible (or is it?) … and yet many other capabilities — that are similarly wonderful, and possibly are more strategically important than quantum computing for a crowded overheating planet — are associated to various subsets of Gil Kalai’s forty-six skeptical possibilities.

    Conclusion  It is a truth universally acknowledged that various combinations of Gil Kalai’s forty-six skeptical avenues represent eminently hopeful paths for the future of quantum research.

  152. John Sidles Says:

    RNH asks  “John Sidles, are you a currently practicing physicist?”

    Such questions presuppose that there exists a distinct boundary between science and engineering, and yet this boundary isn’t easy to specify: did von Neumann write to Wiener (in 1946) as a fellow-scientist or as a fellow-engineer?

    To the extent that 21st century scientists and engineers are advancing toward increasingly overlapping objectives, that are described by increasingly overlapping mathematical languages (and are pursued by increasingly overlapping enterprises) perhaps the science-versus-engineering question — for many (most? all?) quantum researchers — is amenable to the same answer as was recounted by Lyndon Johnson, who told of a student teacher who was asked in a Texas job interview: “Do you teach evolution the Bible way or the Darwin way?” … to which the eager job-seeker answered: “I can teach it either way!”  🙂

    Conclusion  Strict-constructionists can reasonably argue that workers who are largely or entirely confident that the state-space of Nature is rigorously Hilbert/Dirac should self-describe as engineers. John Bell was among the first to embrace this practice (in 1983) … there have since been many more.

  153. Sam Hopkins Says:

    Scott: what is the point of critiquing and debating quantum skeptics with whom you disagree so fundamentally? We sometimes think that debate will lead to greater understanding (when, for instance, your opponent is Gil Kalai), but you don’t think anything like that of a lot of these crackpots. So work towards the quantum computer. When you can crack RSA codes, these skeptics will look like the Flat Earth Society.

  154. Raoul Ohio Says:

    Sam H.,

    I think you have missed the point.

    SO is largely reports from Scott as he tries to figure out the landscape on the QC frontier. Debating fine points with John, Greg, Gil, Lubos, and other insightful participants is no doubt a fun part of the exercise.

    Scott also takes on his share and more of refuting doofosity. Many avoid this chore, which is rather like pushing a garbage truck up a hill. Scott appears to be about 50/50 on sparring with QC skeptics and QC true believers.

    That is usually a good place to be in a debate, when both sides attack you.

  155. Rahul Says:

    No blog posts for 2 weeks?! I keep checking but no luck.

  156. John Sidles Says:

    The lamps have been going out, not only here at Shtetl Optimized, but throughout the quantum blogosphere.

    Has the 20th century’s explicandum of quantum information theory become too narrow to sustain viable 21st century STEM enterprises? Are the associated quantum explicanda insufficiently inspiring and/or enabling and/or natural?

    The remedies to these maladies are well-known: lively articles, enabling technologies, novel explicandae, new mathematical frameworks, and provocative posts and comments!

  157. Bram Cohen Says:

    Note that Brady didn’t actually answer whether factoring 17*19 would invalidate his model or why he spent so much time denying the validity of the numerous experiments demonstrating Bell Inequality violations if he accepts their results.

  158. vznuri Says:

    hi scott, you retrograde luddite you. think you’re spectacularly on the wrong side of this issue, but it could take decades to prove it. however, there is some recent physics/cosmology results by Beane that arguably tie into this that you should be aware of. do you have any comment on those elsewhere?

    hot off the press, a rebuttal that cites ‘t hooft, wolfram, cellular automata, solitons

  159. Scott Aaronson shreds a paper claiming QC won’t work. | Gordon's shares Says:

    […] Link. I missed this somehow. He left smoking cinders. Miss his writing, he’s spending too much time with his baby. (that was a joke) […]

  160. Robert Brady Says:

    Bram #157: If our model is correct then it will be an order of magnitude harder to factor numbers like 17*19 in a simple geometry. And another order of magnitude harder with larger numbers, and so on. I hope the reason for this is clear from the papers.

    It is an experimental fact that there are violations of the Bell inequalities. This is usually interpreted to tell us that particles have a delocalised character. As you will know, it is common in fluid dynamics for there to be structures with a well-defined position whose energy is delocalised. A vortex is one well-studied example, and the sonon quasiparticles are another.

    In this blog it appears to be assumed that quasiparticles in fluid dynamical systems cannot violate the Bell inequalities, even though their energy is delocalised. I should very much like to understand why this is believed. Could any of the contributors to this blog provide a reference? Thank you.

  161. John Sidles Says:

    Another voice is heard:

    Steven Weinberg concludes (in his new textbook Lectures on Quantum Mechanics)

    “My own conclusion (not universally shared) is that today there is no interpretation of quantum mechanics that does not have serious flaws, and that we ought to take seriously the possibility of finding some more satisfactory other theory, to which quantum mechanics is merely a good approximation”.

    There is a discussion of Weinberg’s view on the Quantum Frontiers weblog that (as it seems to me) deserves more comments than it is receiving.

  162. Attila Szasz Says:

    yet another reason not to want a world with
    a small limited number of qm-proper particles
    http://www.smbc-comics.com/comics/20130222.gif

  163. John Sidles Says:

    Thomas Vidick’s weblog MyCQState is presently hosting an outstanding essay/survey — written by Thomas himself — that is titled A Quantum PCP Theorem?.

    That the Anderson/Brady preprint has received 150+ comments, while Thomas’ excellent essay has (thus far) received no attention at all, is an imbalance that (as it seems to me) we can all help to remedy.

  164. Raoul Ohio Says:

    The semi-regular announcements of QC breakthroughs recounted in ScienceDaily got off to an early start this week:

    http://www.sciencedaily.com/releases/2013/02/130224142829.htm

    One thing that makes this week’s alleged breakthrough unusual is that there are some actual “QM looking” equations given in a figure. But — they are blurry so you can’t quite see what they are trying to say. If you click the “enlarge” button, the equations get bigger, but they are still blurry. Could this be a deep metaphor?

  165. chrisd Says:

    Robert #160: I don’t pretend to be an expert on sonons, but one of your presentations suggests that they are “incapable of quantum collapse”, so in particular the measurement postulate does not hold and it’s not clear that sonons can appear in a superposition state, let alone exhibit entanglement. As such, they don’t appear “quantum” so would not be expected to violate a Bell’s inequality.

    And I emphatically disagree that such violations are interpreted as showing the particles have a “delocalized character”. All they show is that quantum mechanics is incompatible with local realism. Big difference.

  166. Doriano Brogioli Says:

    Dear Prof. Aaronson, for sure, nobody will ever find that the “whole framework of exponentially-large Hilbert space was completely superfluous”!

    However, I would like to ask you (and to the readers) your idea about the fact that BQP is a subset of PSPACE. It seems that something huge is needed, but not necessarily a “large Hilbert space”: a very long calculation time can do the work. Do you think that this can say something fundamental on what QM is? Not an easy equation in a huge Hilbert space but an extremely difficult problem in a smaller space?

  167. Bram Cohen Says:

    Brady #160 writes: “If our model is correct then it will be an order of magnitude harder to factor numbers like 17*19 in a simple geometry. And another order of magnitude harder with larger numbers, and so on.”

    By ‘order of magnitude’ do you mean factor of 2 or factor of 10? By ‘harder’ do you mean more energy, or more precision, or something else? By ‘larger’ do you mean twice as large, or above some threshold, or what?

    “In this blog it appears to be assumed that quasiparticles in fluid dynamical systems cannot violate the Bell inequalities, even though their energy is delocalised. I should very much like to understand why this is believed.”

    Fluid dynamical systems are based entirely on local phenomena. See every paper on simulating them ever written. Anything which vaguely looks like action at a distance will have to operate by going through the intervening materials, and the speed of propagation of the effects will be limited by the speed of light.

  168. John Sidles Says:

    In token of respect and gratitude for recent quantum-informatic blogosphere posts by Gil Kalai, Aram Harrow, Thomas Vidick, John Preskill, and the Aaronson/Arkhipov collaboration (and other folks too!), I have posted to Gödel’s Lost Letter (what attempts to be) a unitary appreciation of their various perspectives in relation to the 2014 Simons Institute workshop Quantum Hamiltonian Complexity … which looks like it will be a terrific workshop!

  169. Rahul Says:

    Comments?

    http://www.sciencerecorder.com/news/scientists-discover-a-way-around-heisenbergs-uncertainty-principle/

    Scientists discover a way around Heisenberg’s Uncertainty Principle

    According to a pair of scientists from the University of Rochester and the University of Ottawa, there may be a way around Heisenberg’s famous Uncertainty Principle.

    According to a report published this week in Nature Photonics, a recently developed technique that allows scientists to directly measure the polarization states of light could be the key. The direct measurement technique, developed in 2011, allows scientists to measure the wavefunction – a way of determining the state of a quantum system.

    The pair of scientists say the new technique relies on a “trick” that measures the first property of a system, leaving the remaining properties untouched. The careful measurement relies on the “weak measurement” of the first property followed by a “strong measurement” of the second property, the pair writes in the report.

  170. Robert Brady Says:

    Bram #167 I appreciate your questions and comments.

    Yes, Euler’s equation contains only local interactions. Nevertheless, the energy and angular momentum of a vortex are delocalised in the fluid. I believe this means a vortex has at least some properties which are not localised at the core. A sonon has the same delocalised properties.

    Would you be convinced by an explicit proof of the spin correlation in Bell’s original paper (which he shows violates his inequality)?

    I am afraid I can’t quantify how ‘hard’ it would be to break the current experimental glass ceiling. You would have to get a single particle to lose coherence with system A, fall into coherence with B, revert to A and so on, with the net effect that it remains in coherence with both. I think this would be exceedingly difficult, but it is at least mathematically conceivable. If someone can achieve it they might be able to break the glass ceiling. If it’s not clear why, the presentation (on my web site) might be helpful.

  171. Scott Says:

    Rahul #169: Weak measurement is a decades-old idea. And it doesn’t in any way, shape, or form violate the Uncertainty Principle (as nothing can, without violating QM itself—in which case you would’ve heard about it!). In the case of weak measurement, the “catch” (i.e., the one crucial fact popular articles never tell you) is that you need an ensemble of many copies of the system to implement the measurement.

    YAWN… next! 🙂

  172. Scott Says:

    Doriano #166:

      I would like to ask you (and to the readers) your idea about the fact that BQP is a subset of PSPACE. It seems that something huge is needed, but not necessarily a “large Hilbert space”: a very long calculation time can do the work. Do you think that this can say something fundamental on what QM is? Not an easy equation in a huge Hilbert space but an extremely difficult problem in a smaller space?

    Actually yes, I’ve been telling people for a while that BQP⊆PSPACE is a deep and underappreciated fact about the foundations of quantum mechanics! (One of my laugh lines is that Feynman won the Nobel Prize in physics basically for pointing out that BQP ⊆ P^#P ⊆ PSPACE—i.e., that you can organize QFT calculations as a giant sum rather than keeping a whole wavefunction in memory.)

    On the other hand, I don’t see this as a challenge to the Hilbert space formalism, but as a property of the formalism: a property of “modesty,” if you like. We never observe a naked state vector in the wild; we only ever observe the outcomes of measurements. And if you only care about predicting the outcomes of measurements specified in advance, you can ditch the notion of “states” almost entirely, and organize your calculations in a more efficient way (just how much more efficient being an active research topic). But as soon as you ask for the “state” of the system — i.e., for an object sufficient to probabilistically predict the outcome of any possible measurement that could be made in the future — the exponential character of Hilbert space comes roaring back.
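
    To illustrate the “giant sum” organization concretely, here is a minimal sketch of the Feynman path-sum idea behind BQP ⊆ PSPACE (the gate conventions and the tiny Bell-state circuit are illustrative assumptions, not anything from the comment above): a single output amplitude is computed by recursing over the gates and summing over intermediate computational-basis states, so memory stays polynomial in the number of qubits and gates while the running time grows exponentially.

    ```python
    import numpy as np

    # Gates (qubit 0 is the low-order bit of the basis index)
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    CNOT = np.eye(4, dtype=complex)[:, [0, 3, 2, 1]]   # control = qubit 0, target = qubit 1

    def elem(gate, targets, out, inp):
        """Matrix element <out| gate(targets) |inp>; out and inp are basis-state integers."""
        mask = 0
        for t in targets:
            mask |= 1 << t
        if (out & ~mask) != (inp & ~mask):             # untouched qubits must agree
            return 0.0
        def sub(state):                                # pack the target bits into a small index
            return sum(((state >> t) & 1) << i for i, t in enumerate(targets))
        return gate[sub(out), sub(inp)]

    def amplitude(circuit, x, y, n):
        """<y| G_L ... G_1 |x>, summing over the basis state just before the last gate.
        Memory is polynomial in n and the circuit length; time is exponential."""
        if not circuit:
            return 1.0 if x == y else 0.0
        gate, targets = circuit[-1]
        return sum(elem(gate, targets, y, z) * amplitude(circuit[:-1], x, z, n)
                   for z in range(2 ** n) if elem(gate, targets, y, z) != 0)

    # Example: prepare a Bell state from |00> and print the output probabilities
    circuit = [(H, [0]), (CNOT, [0, 1])]
    for y in range(4):
        print(format(y, '02b'), round(abs(amplitude(circuit, 0, y, 2)) ** 2, 3))
    ```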

  173. John Sidles Says:

    Scott posts  “But as soon as you ask for the “state” of the system — i.e., for an object sufficient to probabilistically predict the outcome of any possible measurement that could be made in the future — the exponential character of Hilbert space comes roaring back.”

    Unless the dynamical system couples to a continuum of vacuum states, or (equivalently?) a thermal bath, or (equivalently??) is a product-state pullback. For some reason (yet what might that reason be?) Nature requires that both her external reality and human laboratory experiments respect these coupling-to-continuum constraints. That’s why it’s been heartening in recent years (for us system engineers) to witness the gradual weakening of theoretical faith in the absolute reality of unitary evolution on finite-dimensional Hilbert spaces!

  174. Bram Cohen Says:

    Brady, a vortex is not a particle, it’s a phenomenon across a whole area, like how a sound is a pattern of pressure or a differential in temperature is a potential energy source. The point of intersection of the two blades of a scissors can move forward faster than the speed of light, but that isn’t a violation of the speed of light limit, because that point isn’t a particle, it’s a phenomenon which changes what particles it’s talking about over time.

  175. Robert Brady Says:

    Bram #174. Yes. Well put. A vortex is able to escape Bell’s inequality because it is a phenomenon across a whole area. But it has a duality. An ideal vortex is completely characterised by its central position and circulation, and so it can be (and, in fluid mechanics, is) treated like a 2-D particle. Sonons have the same duality, in 3D.

    To settle this, would you accept an explicit demonstration that sonon quasiparticles have spin-half symmetry and behave precisely like the quantum mechanical particles analysed in Bell’s original paper, including violating Bell’s inequality?

  176. Mitch Says:

    “Bell’s Theorem? Just a flesh wound!”
    Scott. You seem like a smart guy who knows his way around QM, so I have a few questions about the “Bell’s theorem” proof.

    Bell’s main criticism of von Neumann’s no-go theorem is as follows:

    “The essential assumption can be criticized as follows. At first sight the required additivity of expectation values seems very reasonable, and it is rather the nonadditivity of allowed values (eigenvalues) which requires explanation. Of course the explanation is well known: A measurement of a sum of noncommuting observables cannot be made by combining trivially the results of separate observations on the two terms — it requires a quite distinct experiment.”

    Yet in every “proof” of Bell’s theorem I’ve come across, expectation values from QM are simply combined linearly in an inequality expression (which is valid BTW) to claim violation. So when Bell wrote in his argument against von Neumann that:

    “It was not the objective measurable predictions of quantum mechanics which ruled out hidden variables. It was the arbitrary assumption of a particular (and impossible) relation between the results of incompatible measurements either of which might be made on a given occasion but only one of which can in fact be made.”

    Why is this not also a criticism of Bell’s own theorem? How can Bell’s theorem be valid if the proof relies on a linear combination of expectation values of incompatible measurements, contrary to the principles of QM?
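
    For reference, a small numerical sketch (assuming NumPy; the measurement settings are the standard CHSH textbook choices, not the exact setup of Bell’s original paper) of what the linear combination of expectation values looks like in the CHSH form of the theorem. Each of the four expectation values corresponds to a separately performable two-setting experiment; quantum mechanics predicts the combination reaches 2√2, above the bound of 2 that any local-hidden-variable assignment must satisfy.

    ```python
    import numpy as np

    # Pauli observables and the two-qubit singlet state (|01> - |10>)/sqrt(2)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

    def correlator(A, B, psi):
        """Expectation value <psi| A (x) B |psi> for single-qubit observables A and B."""
        return float(np.real(psi.conj() @ np.kron(A, B) @ psi))

    # Measurement settings: Alice uses Z and X; Bob uses (Z + X)/sqrt(2) and (Z - X)/sqrt(2)
    A0, A1 = Z, X
    B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

    # The CHSH combination of the four separately measurable expectation values
    S = (correlator(A0, B0, singlet) + correlator(A0, B1, singlet)
         + correlator(A1, B0, singlet) - correlator(A1, B1, singlet))
    print(abs(S))   # ~2.828 = 2*sqrt(2), exceeding the local-hidden-variable bound of 2
    ```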

  177. Brian Rom Says:

    So, regarding the pissing contest that has been so entertaining to read, what has transpired over the past couple of years? To paraphrase Lear, ‘Who’s in? Who’s out?’ Enquiring minds want to know!

  178. Yatima Says:

    Brian #177

    Well, looks like they have dropped out of warp at The Register:

    http://www.theregister.co.uk/2015/03/09/quantum_computers_fail/

    (Btw. The above comment thread is instructive, especially Comment #172)

  179. Darian S Says:

    People seem to be demanding that more quantum phenomena be described mathematically in classical terms before they buy.

    But if the great Feynman did indeed say, paraphrasing a bit, that the very heart of all of quantum mechanics lies in the double-slit experiment and he was certain it was impossible to replicate classically… then seeing it replicated classically basically is the beginning of the toppling of a house of cards.

    That is unless Feynman was either a.) wildly confused about quantum mechanics or b.) didn’t actually say that.