## Collaborative Refutation

At least eight people—journalists, colleagues, blog readers—have now asked my opinion of a recent paper by Ross Anderson and Robert Brady, entitled “Why quantum computing is hard and quantum cryptography is not provably secure.” Where to begin?

- Based on a “soliton” model—which seems to be *almost* a local-hidden-variable model, though not quite—the paper advances the prediction that quantum computation will never be possible with more than 3 or 4 qubits. (Where “3 or 4” are not just convenient small numbers, but actually arise from the geometry of spacetime.) I wonder: before uploading their paper, did the authors check whether their prediction was, y’know, *already falsified*? How do they reconcile their proposal with (for example) the 8-qubit entanglement observed by Haffner et al. with trapped ions—not to mention the famous experiments with superconducting Josephson junctions, buckyballs, and so forth that have demonstrated the reality of entanglement among many thousands of particles (albeit not yet in a “controllable” form)?
- The paper also predicts that, even with 3 qubits, general entanglement will only be possible if the qubits are not collinear; with 4 qubits, general entanglement will only be possible if the qubits are not coplanar. Are the authors aware that, in ion-trap experiments (like those of David Wineland that recently won the Nobel Prize), the qubits generally *are* arranged in a line? See for example this paper, whose abstract reads in part: “Here we experimentally demonstrate quantum error correction using three beryllium atomic-ion qubits confined to a linear, multi-zone trap.”
- Finally, the paper argues that, because entanglement might not be a real phenomenon, the security of quantum key distribution remains an open question. Again: are the authors aware that the most practical QKD schemes, like BB84, never use entanglement at all? And that therefore, even if the paper’s quasi-local-hidden-variable model were viable (which it’s not), it *still* wouldn’t justify the claim in the title that “…quantum cryptography is not provably secure”?
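To make the last point concrete, here is a minimal Python sketch of BB84 key sifting (an idealized, noiseless toy with no eavesdropper; the function and variable names are my own illustration, not from any QKD library). Notice that only single, unentangled qubits ever appear in the protocol:

```python
import random

def bb84_sift(n):
    """Toy BB84 key sifting, noiseless and with no eavesdropper.

    Alice sends single qubits, each prepared in one of four states:
    |0>, |1> (rectilinear basis) or |+>, |-> (diagonal basis).
    No entanglement is involved anywhere in the protocol.
    """
    alice_bits = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("ZX") for _ in range(n)]  # Z = rectilinear, X = diagonal
    bob_bases = [random.choice("ZX") for _ in range(n)]

    bob_results = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:
            bob_results.append(bit)                   # matching basis: deterministic outcome
        else:
            bob_results.append(random.randint(0, 1))  # wrong basis: uniformly random outcome

    # Publicly compare bases; keep only the rounds where they matched.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_results, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

ka, kb = bb84_sift(1000)
assert ka == kb  # with no Eve and no noise, the sifted keys agree exactly
print(len(ka))   # roughly half of the 1000 rounds survive sifting
```

An eavesdropper who measured each qubit in a random basis would inject detectable errors into the sifted key; but the point here is simply that nothing in the protocol requires entangled pairs.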

Yeah, this paper is pretty uninformed even by the usual standards of attempted quantum-mechanics-overthrowings. Let me now offer three more general thoughts.

First thought: it’s ironic that I’m increasingly seeing eye-to-eye with Lubos Motl—who once called me “the most corrupt piece of moral trash”—in his rantings against the world’s “anti-quantum-mechanical crackpots.” Let me put it this way: David Deutsch, Chris Fuchs, Sheldon Goldstein, and Roger Penrose hold views about quantum mechanics that are diametrically opposed to one another’s. Yet each of these very different physicists has earned my admiration, because each, in his own way, is *trying* to listen to whatever quantum mechanics is saying about how the world works. However, there are also people all of whose “thoughts” about quantum mechanics are motivated by the urge to plug their ears and *shut out* whatever quantum mechanics is saying—to show how whatever naïve ideas they had before learning QM might still be right, and how all the experiments of the last century that seem to indicate otherwise might still be wiggled around. Like monarchists or segregationists, these people have been *consistently* on the losing side of history for generations—so it’s surprising, to someone like me, that they continue to show up totally unfazed and itching for battle, like the knight from *Monty Python and the Holy Grail* with his arms and legs hacked off. (“Bell’s Theorem? Just a flesh wound!”)

Like any physical theory, *of course* quantum mechanics might someday be superseded by an even deeper theory. If and when that happens, it will rank alongside Newton’s apple, Einstein’s elevator, and the discovery of QM itself among the great turning points in the history of physics. But it’s crucial to understand that that’s not what we’re discussing here. Here we’re discussing the possibility that quantum mechanics is wrong, not for some deep reason, but for a *trivial* reason that was somehow overlooked since the 1920s—that there’s some simple classical model that would make everyone exclaim, “oh! well, I guess that whole framework of exponentially-large Hilbert space was completely superfluous, then. why did anyone ever imagine it was needed?” And the probability of *that* is comparable to the probability that the Moon is made of Gruyère. If you’re a Bayesian with a sane prior, stuff like this shouldn’t even register.

Second thought: this paper illustrates, better than any other I’ve seen, how despite appearances, the “quantum computing will clearly be practical in a few years!” camp and the “quantum computing is clearly impossible!” camp aren’t *actually* opposed to each other. Instead, they’re simply two sides of the same coin. Anderson and Brady start from the “puzzling” fact that, despite what they call “the investment of tremendous funding resources worldwide” over the last decade, quantum computing *still* hasn’t progressed beyond a few qubits, and propose to overthrow quantum mechanics as a way to resolve the puzzle. To me, this is like arguing in 1835 that, since Charles Babbage *still* hasn’t succeeded in building a scalable classical computer, we need to rewrite the laws of physics in order to explain why classical computing is impossible. I.e., it’s a form of argument that only makes sense if you’ve adopted what one might call the “Hype Axiom”: the axiom that any technology that’s possible *sometime* in the future, must in fact be possible within the next few years.

Third thought: it’s worth noting that, if (for example) you found Michel Dyakonov’s arguments against QC (discussed on this blog a month ago) persuasive, then you *shouldn’t* find Anderson’s and Brady’s persuasive, and vice versa. Dyakonov agrees that scalable QC will never work, but he *ridicules* the idea that we’d need to modify quantum mechanics itself to explain why. Anderson and Brady, by contrast, are so eager to modify QM that they don’t mind contradicting a mountain of existing experiments. Indeed, the question occurs to me of whether there’s *any* pair of quantum computing skeptics whose arguments for why QC can’t work are compatible with one another’s. (Maybe Alicki and Dyakonov?)

But enough of this. The truth is that, at this point in my life, I find it infinitely more interesting to watch my two-week-old daughter Lily, as she discovers the wonderful world of shapes, colors, sounds, and smells, than to watch Anderson and Brady, as they fail to discover the wonderful world of many-particle quantum mechanics. So I’m issuing an appeal to the quantum computing and information community. *Please*, in the comments section of this post, explain what you thought of the Anderson-Brady paper. Don’t leave me alone to respond to this stuff; I don’t have the time or the energy. If you get quantum probability, then **stand up and be measured!**

Comment #1 February 4th, 2013 at 9:48 am

I think the issue is that most of us don’t have eight people nagging us to comment on obviously wrong papers, so we just do what you probably want to do: ignore them.

Comment #2 February 4th, 2013 at 10:58 am

I’m not sure if I have anything further than the arguments you’ve made already to say about the main message of the paper. But here’s a comment on a more trivial point: at the end of Section 1, the authors claim that “Researchers are now starting to wonder whether geometry affects entanglement and coherence; the first workshop on this topic was held last year”.

I attended the workshop they cite, which was emphatically not about this issue. As can be seen from the schedule, the workshop was concerned with mathematical questions to do with the geometry of quantum states in the standard Hilbert space picture of quantum mechanics, rather than any physical issues concerning whether the underlying qubits were arranged in a linear trap, etc.

Comment #3 February 4th, 2013 at 11:20 am

Mateus and Ashley: Thanks!!

Comment #4 February 4th, 2013 at 11:24 am

“it’s ironic that I’m increasingly seeing eye-to-eye with Lubos Motl—who once called me ‘the most corrupt piece of moral trash’”

From Lubos Motl, that’s high praise. Are there even 100 living people he has described in such positive terms? So it’s not at all ironic; if anything, that would bias you to agree with him.

Comment #5 February 4th, 2013 at 11:52 am

Just ignore them. I have to admit I have great fun watching you debunk the Joy Christians of the world, but in the end, it is a futile and time-consuming task. It’s best just to ignore them and enjoy our little treasures. They’re so much more interesting!

Comment #6 February 4th, 2013 at 12:50 pm

Unfortunately, I think it’s an example of this phenomenon:

http://www.smbc-comics.com/index.php?db=comics&id=2556

Ross Anderson is quite well known in computer security, mainly for his work on banking security. In fact, if you hadn’t written this piece I probably would have been the ninth person to ask you about it — it was clearly wrong, but from someone without a prior history of crankiness.

It’s always disappointing when someone you respect does something silly like this.

Comment #7 February 4th, 2013 at 1:07 pm

Anon J. Mouse #6: Thanks! I intentionally didn’t look up who Anderson and Brady were before writing the post, since I didn’t want that to bias me. But, yes, the amount of respectful attention this obviously-wrong paper seemed to be getting did surprise me.

Incidentally, I just found Anderson’s blog, which includes a comment by Jonathan Oppenheim making substantially the same points as in my post.

Comment #8 February 4th, 2013 at 1:51 pm

There is another point to make about scientific revolutions such as Newton’s laws or quantum mechanics. Yes, they can be supplanted or revised, but it’s not as if there is any turning back. The new revolution is almost always even less palatable to the old guard than the old one.

(OTOH I do not understand the statement that BB84 does not use entanglement. I guess that there are non-entanglement versions, but it looks like the original BB84 does use entangled Bell pairs.)

Comment #9 February 4th, 2013 at 1:55 pm

Greg #8: No, BB84 just requires sending individual, unentangled qubits in one of the four states |0⟩, |1⟩, |+⟩, or |-⟩. (Indeed, the lack of any need for entanglement is one of the main reasons why the protocol is already practical today.) Interestingly, I understand that many of the *security proofs* for BB84 introduce entanglement as a formal convenience, but the entanglement never appears in the actual protocol itself.

Comment #10 February 4th, 2013 at 2:00 pm

How about that! I skimmed the link and totally misread it. Of course you’re right.

Comment #11 February 4th, 2013 at 2:55 pm

“Like monarchists or segregationists, these people have been consistently on the losing side of history for generations…”

Unfortunately, I am not sure that monarchists or segregationists have been consistently on the losing side… At least, those who are against democracy and human rights often win — just look at Ancient Greece, Ancient Rome, the independent Italian city-states, many democratic European governments in the first half of the 20th century (which lost to dictatorships), or modern Russia (which is now much less democratic than it was 15 years ago). Also I’m afraid that science is now losing to pseudoscience. Most people in the world believe that “evolution is just a [wrong] theory,” “the Big Bang never happened,” homeopathy works, and biofields exist.

It is surprising, however, that this paper attracted so much attention. I try to ignore articles based on clearly wrong assumptions — that quantum mechanics or general relativity is wrong, that evolution never happened, etc. Why do people pay attention to such papers?

Thanks for an interesting post!

Comment #12 February 4th, 2013 at 3:54 pm

Yury #11: Good point! I should have clarified that monarchists, segregationists, and anti-QM folks have been consistently on the *intellectually* losing side. But they can win plenty of shallow “victories,” just as con-men or street thugs can.

Comment #13 February 4th, 2013 at 4:18 pm

LOL, Scott. I am sure your thinking engines are good enough to see eye-to-eye with me.

By remembering my unflattering quote, you’re also showing some sense of history and long-term memory.

http://motls.blogspot.com/2006/12/academia-and-scientific-integrity.html?m=1

More than 6 years ago, the unflattering words were revealed because of your decision to write anything (including any lie) about quantum gravity and string theory that someone pays you for.

I don’t know whether this decision to be bought still holds. If it does, my assessment of course still holds as well whether or not we see eye-to-eye with one another.

Comment #14 February 4th, 2013 at 5:05 pm

This might be the single greatest compliment I’ve ever received. Lubos says he’s sure my “thinking engines” are good enough to see eye-to-eye with him! Callooh! Callay!

And Lubos, in return for your generous compliment, I have some good news. As a result of major life changes—getting married, having a baby, etc.—I have abandoned my previous materialistic, money-grubbing ways. I’m now strictly a man of principle. And as such, no amount of money could ever induce me to abandon my total, principled commitment to Loop Quantum Gravity.

OK, OK, I’m kidding about the last part. In fact, I have a much better appreciation now for the achievements of string theory than I did back in 2006, partly due to a meeting in Florence where Brian Greene spent 4 hours explaining them to me and others. I came away genuinely impressed, convinced that string theory and especially AdS/CFT are unequivocally a step forward in our understanding of the universe, even though we have a great deal more to learn. I’m *not* ready to say that alternative ideas like LQG are garbage and have nothing worthwhile to contribute, let alone that global warming is a sham, but maybe Lubosification is a process that will happen to me one step at a time.

Comment #15 February 4th, 2013 at 6:26 pm

I suspect that LQG isn’t absolute garbage either, and I also cannot compare it to string theory on any direct authority either. But speaking as an external observer, it really does look like LQG is (or maybe was) a Nader-like quest to compete with string theory. I.e., a quest in which simply being invited into the debate is the first major goal, even if it has no good consequence or even negative consequences for the actual outcome of the debate.

Of course, also speaking as an external observer, global warming looks like anything but a sham, in fact global warming denial looks like a sham.

Comment #16 February 4th, 2013 at 7:37 pm

Greg #15: My understanding of string theory is that it’s “what you’d inevitably come up with” if you took quantum field theory and perturbation theory as your fundamental starting points, then tried to tame divergences by replacing the point particles by extended objects. When you do that, you get some wonderful things that weren’t explicitly put in (e.g., the graviton), but also various aspects that *seem* to require ugly kludges to make them consistent with observed reality. Meanwhile, my understanding of LQG is that it’s “what you’d inevitably come up with” if you took GR and its demand for background-independence as your fundamental starting points, and tried to create a quantum theory satisfying that demand while leaving aside the details of particle physics. When you do that, you get something wonderful that wasn’t explicitly put in (spacetime discreteness), but also various aspects that *seem* to require ugly kludges to make them consistent with observed reality.

If I’m right, then despite their incompatibility both with each other and (probably) with the ultimate truth, neither string theory nor LQG is nearly as “arbitrary” as they might seem to an outsider. If we had to pick one of the two that’s had more technical successes, that would be string theory, but that doesn’t mean LQG has had *no* technical successes.

Comment #17 February 4th, 2013 at 8:32 pm

Scott – Except with one colossal difference: One is conjectured to be mathematically viable, and the other one is conjectured not to be.

Comment #18 February 4th, 2013 at 8:42 pm

To be more precise, superstring theory satisfies a formidable array of mathematically rigorous consistency checks. It is either entirely or very nearly rigorously defined as a perturbative model of quantum gravity (in 9+1 dimensions). There are a ton of technical mathematical successes.

Whereas with LQG, I’m not sure that there really are any technical successes. Some consistency checks have been claimed/published, but there are arguments that they are all superficial. I have also heard that renormalization theory speaks against the viability of a macroscopic limit of LQG, although I don’t know a whole lot about it.

Comment #19 February 4th, 2013 at 11:32 pm

Yesterday, when I first saw this post, not a single comment had yet made an appearance. The only thing I could think of, by way of a reply, was the following old joke, which I happened to remember and wanted to write down as my answer, but somehow didn’t (I decided to wait for other comments to appear first). Anyway, the joke goes:

Masochist to sadist: Oh, please, please, *hit* me!

Sadist to masochist: No, I *won’t*!

…

Ok, I will try to read that paper. … Yeah, in a way, I have known of that dancing droplet thingie since the time it came out, and wasn’t impressed much (if at all) by it. … Anyway, Scott’s point #2 and #3 seem to be right on.

Ajit

[E&OE]

Comment #20 February 5th, 2013 at 1:38 am

Scott: …

Lev Tolstoy: Every happy family is similar, whereas every unhappy family is unhappy in its own way.

Charles Babbage: A different kind of physics was discovered (and was actually necessary) before classical computers became practical.

Comment #21 February 5th, 2013 at 2:04 am

I have to admit, I was pretty shocked to see this paper, and it makes you wonder about this guy. I get crackpots occasionally emailing me with their theory of how Bell’s theorem is wrong, or relativity is wrong, or quantum computation/crypto is wrong. But rarely do you find a claim of all three! Plus the mention of Bohmian mechanics and black holes in a quantum paper pretty much ticks five of the seven crackpot boxes. The only thing that’s missing is that it be written in several different fonts, and start with an introduction about how the author is a misunderstood genius and that mainstream scientists are too conservative to understand the truth of his theory, but that just like Einstein, he’ll be proven right in the end.

People like Ron Rivest were hyping up this thing. Amazing!

Comment #22 February 5th, 2013 at 2:13 am

Scott #16 a very nice summation of the String and LQG efforts. I will probably have to quote this at some point.

As to the paper, I believe there is something intrinsically sinister going on when people try to approach QM from a topological angle. Apparently it makes people get all wobbly in the head. How else to explain Joy Christian and now these apparently respectable fellows, pushing these rather strange ideas?

Comment #23 February 5th, 2013 at 5:57 am

Oops. In my reply above, please read the point numbers as #1 and #2 (of Scott’s points). Those two seem stronger to me than #3, with #2 being the strongest. (Above, I just got the point numbers wrong.) … Sorry about that.

… BTW, whenever any new view or theory to resolve the quantum riddles is put forth, I invariably end up wondering how a simple computational model simulating its essential physics might look. Ditto here.

The simulation being presented here, if it can be called that, is physical. That, in part, is a problem: with this kind of a model, the authors don’t have to delve directly into the details of how the boundary/initial conditions are to be handled. Writing a C++ program would force them to be explicit about all such details.

If such a program is presented, then our (my!) task will become that much easier: the only remaining task will be to examine the points on which the conventional (say the Copenhagen) view and the new view differ.

By its nature, a C++ simulation will have to capture the new concepts (objects/classes), and make them work in a new way (methods/algorithms). This nature of the computational simulation forces one to be explicit about the theoretical content.

In contrast, what is presented as a physical simulation need not be so directly concerned with presenting the new theoretical ideas in their completeness; it could get away with being rather suggestive.

Ajit

[E&OE]

Comment #24 February 5th, 2013 at 6:22 am

To be sure that you don’t miss my comments on your Lubošification, see

http://motls.blogspot.com/2013/02/lubosification-of-scott-aaronson-is.html?m=1

Comment #25 February 5th, 2013 at 8:02 am

Although I still agree with the issues raised here on the Ross Anderson and Robert Brady paper (i.e., I think quantum computing and quantum crypto will be viable in the future), I don’t think their argument requires that BB84 use entanglement.

If I understand correctly (I only did a quick read, so that’s a big “if”), they are against even the Bell inequalities. This would of course allow for a local hidden variable approach which would destroy quantum information, as it would reduce it to classical information. If this were the case, then it may be possible to use some crazy measuring device to measure the signals in the BB84 protocol in any basis, without disturbing the original signal.

Now, I am pretty sure that the Bell inequalities do hold true, and so quantum crypto is still secure. I don’t really understand how they are claiming there is a flaw in the logic of the Bell inequalities. I just thought in the interest of fairness that point should be brought up.
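For readers who want to see concretely what the Bell test measures, here is a short numpy sketch (my own illustration, not part of the paper or this comment) computing the CHSH quantity for the two-qubit singlet state; it reaches 2√2, above the bound of 2 that any local-hidden-variable model must satisfy:

```python
import numpy as np

# Pauli matrices and the two-qubit singlet state |psi-> = (|01> - |10>)/sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def spin(theta):
    """Spin measurement operator along angle theta in the X-Z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

def E(a, b):
    """Correlation <psi| A(a) tensor B(b) |psi> for the singlet state."""
    op = np.kron(spin(a), spin(b))
    return np.real(psi.conj() @ op @ psi)

# Standard CHSH measurement angles
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # approximately 2.828 = 2*sqrt(2); any local hidden-variable model gives at most 2
```

A soliton-type local model would have to reproduce exactly this quantity to match the reported Bell-test experiments, which is what the loophole arguments are trying to evade.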

Comment #26 February 5th, 2013 at 9:10 am

Tyler #25: Fine, but that would require a completely different argument than the one they actually made!

If they said: we think QM is completely wrong, ergo Eve can violate the No-Cloning Theorem, ergo she can break BB84, then their starting premise might still be bollocks, but at least steps 2 and 3 would follow logically.

Instead they said: “As the experiments done to test the Bell inequalities have failed to rule out a classical hidden-variable theory of quantum mechanics such as the soliton model, the security case for quantum cryptography based on EPR pairs has not been made.” Nowhere in the article do they indicate any awareness that the main QKD schemes are *not* based on EPR pairs, and indeed the paper title says “…quantum cryptography is not provably secure,” with no mention of EPR pairs. So this seemed like a case of straightforward confusion.

Comment #27 February 5th, 2013 at 9:37 am

What’s striking to me, as a non-physicist, is that they never discuss why the Bell experiments obtained the correlations predicted by orthodox quantum mechanics. They explain a loophole by which a local theory could be consistent with the experimental results. But (on a quick skim) I see no recognition of how amazing it is that this local theory manages to recreate the precise predictions of the nonlocal theory.

Comment #28 February 5th, 2013 at 9:55 am

I am afraid Haffner doesn’t report 8 qubits that could be used in quantum computation. Most of the ions are in the unexcited (ground) state, but the usual procedure in many-body theory is to subtract out the ground state. This can also be understood in terms of the modes of oscillation of multiple coupled oscillators, referred to in the paper by Ross and myself, where most or all of the oscillators act coherently as a single entity corresponding to the ground state.

A similar effect occurs in superconductors, where I did theoretical and experimental work as a fellow of Trinity College alongside Brian Josephson. The phase difference across a Josephson junction between two bulk superconductors can represent at most one qubit. This is true even though the superconductor contains very many entangled electrons. Brian’s original work describing this phase transition is available here. You can of course analyse the same system as if it contained a large number of qubits, one for each pair of electrons, as long as you remember they are not independent. But nobody considers counting them individually, as if there were millions of qubits for quantum computation, because they are all correlated with a single phase and act as a single entity.

Comment #29 February 5th, 2013 at 10:50 am

Robert Brady #28: Your response reminds me of the creationists who state categorically that “there are no missing links.” Then when people say, “What about *Australopithecus*? What about *Homo erectus*? etc.,” they reply, “well, that one’s *really* a human. That one’s *really* an ape. Neither one is transitional between the two.” Since they get to make up the rules as they go along, there’s no way they can ever be proven wrong.

In a similar way, you claim that there’s no evidence for entangled states of more than 3 or 4 qubits. Then people immediately respond, what about *this* experiment? What about *that* one? In each case, you have some *a posteriori* reason why that experiment doesn’t count: yes, there are hundreds of particles in a cat state, but they’re not behaving independently so they don’t count as “qubits.” And what about, say, the “cluster-like” quantum states discussed in this paper, which involve many-particle entanglement *not* in a collective cat-like degree of freedom? I assume you have some other reason why those don’t count.

What you need, and don’t have (as far as I’ve seen), is a theory that would explain *a priori* what sorts of many-particle entanglement *would* count.

Comment #30 February 5th, 2013 at 11:15 am

Scott, when we exchanged email privately before you made this blog post, I asked you to point me to any experimental paper that challenged the soliton theory of the electron. Rather than continuing that discussion in a civilised fashion, you chose to post this rant instead, inviting your followers to be abusive. But as Robert points out, the paper you cite does not report eight qubits at all. I’ve had similar conversations by email with other scientists who’ve privately pointed out other papers claiming multiple qubits; but most have already been attacked by other workers in the field, as here.

I’m saddened that your response to Robert’s post was simply abusive. Anyone who’s interested in a substantive discussion of this issue may find it on Light Blue Touchpaper, and more generally at the Emergent Quantum Mechanics workshop.

Comment #31 February 5th, 2013 at 12:03 pm

I’m a little confused about the focus on that particular eight qubit experiment. There are a number of eight qubit experiments out there, particularly in quantum optics (see for example Jian-Wei Pan’s experiments).

Comment #32 February 5th, 2013 at 12:35 pm

Joe #31: Thanks! I mentioned the Haffner et al. experiment only because it was the first >4-qubit experiment not in liquid NMR that popped into my head. You’re right that I also could’ve mentioned optical experiments, but I’m pretty sure it wouldn’t have mattered. Anderson and Brady are playing the game where first someone else proposes a many-qubit experiment—*any* such experiment—then they think up a creative reason why it doesn’t contradict their model. They’ve left the realm of Popperian falsifiability.

Comment #33 February 5th, 2013 at 2:31 pm

Dear Scott,

I don’t believe it’s possible for a well-defined model to avoid falsification in this way. At most, they may have a vague template of a model superimposed on tons of wishful thinking that looks compatible with their not-too-comprehensive consistency checks.

However, it’s totally obvious that if one proposes any particular model that is “qualitatively inequivalent” to proper quantum mechanics, it has to contradict some experiments that are well-known and easy to do. In fact, one may eliminate whole, almost complete classes of these QM-inequivalent models.

For example, if they admit that all possible states of 2 electrons are described correctly by quantum mechanics (including their entanglement), and one imposes any kind of mild locality or Lorentz symmetry which is tested in many ways as well, it’s clear that the laws for 2 electrons may only be extended in a unique way to an arbitrary number of electrons simply because any subset of 2 electrons in the larger set has to agree with quantum mechanics.

It’s also silly to say that we can’t study any states with more than 4 entangled qubits in Nature. Take any molecule such as benzene – I could pick pretty much any other molecule but I want to be specific. It has a hexagon of carbon atoms. Each carbon atom has 4 valence electrons to share; for each carbon atom, 1 of these 4 is attached to a hydrogen atom, with the hydrogens organized into a hexagon radially attached to the carbon hexagon.

This leaves 3 free valence electrons for each carbon atom. Each carbon atom is connected to its neighboring carbon atoms – the bond is “double” to one of the neighbors and “single” to the other. In total, we have a configuration of at least 6 qubits here. The low-lying states of benzene distinguish two configurations, worth 1 qubit – because the double and single bonds have to alternate and there are just two options. Indeed, the sums and differences of these two alternating arrangements give two energy levels of the molecule that we may test.

There’s one qubit visible at low energies. But any model that denies that it is built from a larger number of electrons/qubits, which may a priori be organized/entangled in many other ways, will contradict locality to the extent of being incompatible with the atomic theory of matter itself! It is nothing else than the atomic theory that implies that the properties of molecules are constructed of, and are subsets of, the properties of the collections of atoms from which the molecules are built.

There just can’t possibly be any model that would get the right low-energy levels of the benzene molecule while it would deny the a priori existence of many electrons storing an arbitrary number of qubits of quantum information a priori. If someone thinks that their model can describe such things, he should write a paper about the description of the molecule in a “completely different way” than QM.
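As a toy numpy illustration of the exponentially large state space being invoked here (this sketch is my own, not part of the comment, and the GHZ-type state is a generic example, not a model of benzene), one can write down a genuinely 6-partite entangled state and verify that tracing out half the qubits leaves a mixed state:

```python
import numpy as np

n = 6                 # e.g. one qubit per carbon atom in the picture above
dim = 2 ** n          # Hilbert-space dimension grows exponentially: 64 here

# A GHZ-type state (|000000> + |111111>)/sqrt(2): genuinely 6-partite entangled
psi = np.zeros(dim)
psi[0] = psi[-1] = 1 / np.sqrt(2)

# Reduced state of the first 3 qubits: trace out the other 3.
# Reshape the 64x64 density matrix into indices (A, B, A', B') with 8-dim blocks.
rho = np.outer(psi, psi).reshape(8, 8, 8, 8)
rho_A = np.einsum('abcb->ac', rho)  # partial trace over the B subsystem

purity = np.trace(rho_A @ rho_A).real
print(dim, purity)  # 64 0.5 -- a mixed reduced state (purity < 1) signals entanglement
```

The purity of 0.5, rather than 1, is exactly the signature that no subset of the qubits has a state of its own: the information lives in the correlations, which is what a local model must somehow reproduce.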

It’s not just the benzene molecule. It’s any molecule. It’s any system with many particles. Condensed matter physics gives a whole new perspective on this issue. The people who don’t use the regular multiparticle quantum mechanics and/or quantum field theory (or its upgrade, string theory, and all these three frameworks are really equivalent when it comes to the analysis of these low-energy states) are really abandoning modern physics. To claim that they have an alternative, they have to start from scratch and they have to offer their – totally different – explanation for every single observation in modern science.

It won’t be enough to describe one particle or one force, because they are apparently messing with the very way composite systems and interactions are constructed out of smaller ones, too. So they have to separately check whether larger, composite systems according to their theory behave in agreement with experiments, too. And of course the answer is a resounding No.

What Dr Anderson and Dr Brady do is typical pseudoscience: one decides that the right theory must be destroyed and declared wrong, constructs an alternative, and then sets much lower standards for that alternative, without even trying to check whether it is capable of describing at least 1% of the things that the right theory describes, even the elementary things.

Cheers

LM

Comment #34 February 5th, 2013 at 2:40 pm

Since Anderson requests a serious discussion instead of ridicule, here is one: Reference [36] in Anderson-Brady reports a violation of the Bell-CHSH inequality under quite strict conditions, and the rebuttal given to that finding is incomprehensible.
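[Editorial aside: for readers unfamiliar with it, the Bell-CHSH quantity referenced here is easy to compute directly. A minimal Python sketch, using the standard singlet state and measurement settings, showing the quantum value 2√2 exceeding the local-hidden-variable bound of 2:]

```python
import numpy as np

# Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Singlet state |psi> = (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(A, B):
    """Correlation <psi| A (x) B |psi> for joint measurements A, B."""
    return float(np.real(psi.conj() @ np.kron(A, B) @ psi))

# Standard CHSH settings: Alice measures Z or X; Bob measures rotated axes.
b1 = -(Z + X) / np.sqrt(2)
b2 = (X - Z) / np.sqrt(2)

S = E(Z, b1) + E(Z, b2) + E(X, b1) - E(X, b2)
print(S)  # 2*sqrt(2) ~ 2.828, above the local-hidden-variable bound of 2
```

Any local-hidden-variable account (loopholes aside) must keep this quantity at or below 2, which is why reference [36] matters.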

Comment #35 February 5th, 2013 at 4:01 pm

Brady seems to be simply misguided about the basics of quantum physics. At the behest of my quantum computation colleague I’ve posted a somewhat more detailed critique:

http://seeking-mind.blogspot.com/2013/02/no-threat-to-quantum-cryptography-at.html

Comment #36 February 5th, 2013 at 4:07 pm

I asked a physicist colleague (Vyacheslavs Kaschcheyevs – the best quantum theorist in our country) for his opinion on this paper. Here are his comments:

http://seeking-mind.blogspot.com/2013/02/no-threat-to-quantum-cryptography-at.html

The conclusion is the same as Scott’s, but Vyacheslavs also points out that Anderson & Brady’s ideas are refuted not only by quantum computing experiments with more than 3 qubits, but also by a large number of quantum-physical phenomena that were tested long before quantum computing was invented.

Comment #37 February 5th, 2013 at 4:11 pm

It would be much cooler to promote good papers on your blog, and let the bad ones molder. Nobody cares about this paper. (And please, leave poor Lubos alone!)

Comment #38 February 5th, 2013 at 4:21 pm

John #37: When I have promoted good papers on this blog, people have accused me of unseemly “hype.” That is, when they commented at all — I didn’t get nearly as many entertaining reactions as when I’ve ripped into bad papers! But, OK, point taken.

Comment #39 February 5th, 2013 at 4:33 pm

Lubos #33: I agree that I was being overly generous to Anderson and Brady when I used the word “model” to describe their ideas. That they don’t have a well-defined model is precisely what lets them wiggle free of whatever experiments people bring to their attention.

On the other hand, I respectfully disagree with the following statement of yours:

it’s totally obvious that if one proposes any particular model that is “qualitatively inequivalent” to proper quantum mechanics, it has to contradict some experiments that are well-known and easy to do.

No, it’s logically possible that someday, someone will invent a theory that agrees with QM on all experiments that have already been done or can be easily done with current technology, but that disagrees with QM on the states that arise (for example) in large-scale implementations of Shor’s factoring algorithm. I have no idea what such a theory would look like, and even if someone constructed one, I’d give long odds against its being true, unless it was somehow even more elegant and mathematically compelling than QM. And certainly, one would need to work thousands of times harder than Anderson and Brady are working to construct such a theory that gave sensible results on current experiments. But I do regard the problem of ruling out such theories — what I once called Sure/Shor separators — as a wonderful scientific project, and indeed, as one of the main intellectual reasons to try to build scalable quantum computers. It’s sort of like proving P≠NP or the Riemann Hypothesis: other examples where “reasonable people already agree on the answer,” and yet we stand to learn a great deal from the journey.

Comment #40 February 5th, 2013 at 6:55 pm

Does this paper deny that Bell inequality violations happen in the real world? Claiming that there isn’t any experimental evidence for that is… a bit of a stretch.

Comment #41 February 5th, 2013 at 8:27 pm

Scott #29. The test is very specific. Can you do a quantum computation representing numbers greater than 16?

For example, the factors of 21 are 3 and 7, both of which are less than this limit, and so this computation can be performed, unlike a factorisation of, say, the product of two primes greater than 100.

The experiments discussed here do not attempt to do any computation. Instead they report individual correlated or entangled items, which you call ‘qubits’ as if they could be used in a computation. In my field of many-body theory there are millions upon millions of similar entangled entities, for example the entangled spins in a ferromagnet or the electron pairs in a superconductor. They are not called qubits because you cannot do a quantum computation with them. To the contrary, they are called a ground state and are usually subtracted out.

I do not think it is creationism to point out, with references, why you cannot do a computation with these entities, nor is our paper unfalsifiable for the same reason.
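[Editorial aside: the classical scaffolding of Shor’s algorithm makes the numbers in this exchange concrete. The sketch below replaces the quantum period-finding step with brute force (so it proves nothing about quantum hardware), and factors both 21 and 323 = 17 × 19, the kind of case where both factors exceed 16:]

```python
from math import gcd

def multiplicative_order(a, n):
    """Smallest r with a^r = 1 (mod n), found by brute force here;
    this is the one step Shor's algorithm delegates to quantum
    period-finding."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical pre/post-processing of Shor's algorithm for base a.
    Returns a nontrivial factor pair of n, or None for an unlucky base."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky: a already shares a factor with n
    r = multiplicative_order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None               # bad base; rerun with another a
    f = gcd(pow(a, r // 2, n) - 1, n)
    return f, n // f

print(shor_factor(21, 2))   # the factors 3 and 7
print(shor_factor(323, 2))  # the factors 17 and 19
```

The quantum speedup lies entirely in finding the order r for numbers far too large to brute-force; the surrounding arithmetic is the same.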

Comment #42 February 5th, 2013 at 8:41 pm

Bram – The tone of the paper is, yes, there have been Bell violation experiments, but the experiments have loopholes. The authors don’t really believe quantum probability.

(As I said, I can’t make sense of their objection to reference [36].)

Comment #43 February 5th, 2013 at 8:59 pm

Robert Brady #41: I see. 21 doesn’t count since, although it’s greater than 16, its prime factors are not. Can I have you on record that, if someone uses Shor’s algorithm to factor 51 into 3×17, that will change your mind? (Or will it still not count since, when written in binary, 51 consists only of 1’s and 0’s?)

Regarding many-body experiments: “usability for computation” is not some magical pixie-dust that infuses certain physical systems while leaving all the others untouched. When you have a large entangled quantum state of hundreds or thousands of particles, the burden is on the QM skeptics — i.e., the scientific radicals — to explain exactly how to account for that state and its evolution within their classical model. It’s not on the people who accept standard QM — i.e., the “conservatives” — to demonstrate that, in principle (if not yet in practice), you could do all the things with the state that standard QM says you could do.

Comment #44 February 5th, 2013 at 9:11 pm

The wrangling with regard to the feasibility (or not) of fault-tolerant quantum computing has much to learn (as it seems to me) from the wrangling with regard to the reality (or not) of anthropogenic climate change:

• Little is gained when the strongest enthusiasts confront the weakest skepticism, and

• Little is gained when the strongest skeptics confront the weakest enthusiasm.

Conclusion: The skepticism of the Anderson/Brady preprint is insufficiently strong (in its mathematical physics) to justify the opprobrium that quantum computing enthusiasts are heaping upon it.

What would be deeply thrilling (to me) would be an arxiv preprint that expressed skepticism of quantum computing comparably forceful to Edsger Dijkstra’s skeptical computer science essay “Go To Statement Considered Harmful” … perhaps along the lines of “Hilbert Space Considered Harmful.”

The essay “Hilbert Space Considered Harmful” would replace Dijkstra’s dictum “GOTO non fingo” with an analogous “C^n non fingo,” that is, a demonstration that for very many (all?) real-world dynamical systems, the effective state-space is very much smaller than a Hilbert space.

This would set the stage for faith in the absolute physical reality of Hilbert space, not to be overthrown in a radical revolution, but rather to fade gracefully into irrelevance, as more powerful mathematical methods of dynamical simulation replace it … rather like the moderating effect that democracy exerts upon Europe’s twelve monarchies!

Comment #45 February 5th, 2013 at 11:48 pm

Lubos #33 and Scott #39,

I agree with Scott that Lubos’ statement

it’s totally obvious that if one proposes any particular model that is “qualitatively inequivalent” to proper quantum mechanics, it has to contradict some experiments that are well-known and easy to do.

is not at all clear. This is *exactly* what happened with general relativity. It’s qualitatively different from Newtonian physics; it reproduced all the well-known results, but predicted very different results in the strong-field, high-speed regimes. (The one existing test at the time, the precession of the perihelion of Mercury, is not at all an easy test. There is a classical explanation as well, involving the J2 moment of the Sun, which is hard to rule out since the Sun does not rotate as a solid body.)
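[Editorial aside: the perihelion test mentioned here is quantified by the standard general-relativistic formula for the extra advance per orbit, which for Mercury accumulates to the famous 43 arcseconds per century:]

```latex
\Delta\phi_{\text{per orbit}} \;=\; \frac{6\pi G M_\odot}{c^{2}\, a\,(1-e^{2})}
\;\approx\; 5\times 10^{-7}\ \text{rad for Mercury}
\;\;\Rightarrow\;\; \approx 43''\ \text{per century.}
```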

Comment #46 February 6th, 2013 at 12:48 am

Slightly related: Does all probability follow from QT?

http://www.sciencedaily.com/releases/2013/02/130205151450.htm

Comment #47 February 6th, 2013 at 1:16 am

Dear Scott #39, you write:

“No, it’s logically possible that someday, someone will invent a theory that agrees with QM on all experiments that have already been done or can be easily done with current technology, but that disagrees with QM on the states that arise (for example) in large-scale implementations of Shor’s factoring algorithm…”

Sure, it’s also logically possible that someday, a creationist theory explaining all the data as well as evolution does (and more) will be proposed, too. It’s logically possible that Elvis Presley will send us greetings from the Moon, saying that he has enjoyed it over there.

Almost everything in science is logically possible. A different question is whether it’s at least likely enough that you would expect such a thing to occur once per age of the Universe, even with one attempt every second. And in this sense, all three things I mentioned are de facto impossible.

I understand that you consider a quantum computer running Shor’s or a different algorithm to be something between the Earth and Heaven; after all, we don’t have it yet. My suspicion is that you also want to say that its possible existence is disputable because this makes your field look like unsettled, ongoing research near the frontiers of physics knowledge, which it actually is not, because in reality quantum computation is just an advanced engineering application of rudimentary physics insights of the 1920s and 1930s.

However, if you have an objective reason why you think that a quantum computer running Shor’s algorithm is “extreme,” so that QM could fail in it, well, I am certain that you won’t be able to define any physically natural – non-spiritual – quantity according to which it would be extreme. It may have 100 or 1,000 qubits, and this may look large relative to the devices that have already been constructed.

However, this number of qubits/particles is tiny relative to the numbers in other physical systems where quantum mechanics has been tested and confirmed. Take a piece of metal: it has 10^{26} atoms and a ground state. The ground state is a particular (and rather generic, from an a priori viewpoint) linear superposition of tensor products of states of many electrons (and other particles), and it has the properties dictated by quantum mechanics.

Because the ground state of this system – and of many other, very different systems – is just a “rather random” linear combination of the basis vectors you may obtain from 10^{26} qubits, or any other large number, it is pretty much inevitable that 1) all the other basis vectors in the multi-qubit Hilbert space are allowed states, and 2) one can form arbitrary superpositions: if one seemingly random combination (one that differs only by a low energy) is allowed and always works, then by statistics that pretty much implies all the others work as well.

So by checking several (well, millions of) different systems, we have verified that the required Hilbert space indeed grows exponentially with the number of degrees of freedom and that all superpositions are allowed (the superposition postulate holds), because some “random” ones are always allowed. It’s implausible that you will find a loophole, although we would have to define a “loophole” rigorously to be able to decide rigorously whether such a class of theories has already been fully falsified. (The tail that remains “unfalsified” will be written as the equations of QM plus some tiny corrections, and it will be possible to show that any viable, non-falsified theory is just built as an unnatural mutation of quantum mechanics.)

Incidentally, if we talk about “a priori” restrictions on the Hilbert space – the kinematic Hilbert space, so to say – that’s something folks in quantum gravity know very well. Quantum gravity *does* invalidate locality in the strict sense (this is needed for the Hawking radiation to restore the information about the initial state, even though it apparently comes from a causally disconnected region). The qubits in region A and the qubits in region B, when too dense, aren’t quite independent of each other. One may only achieve configurations in which there isn’t too much mass concentrated within the Schwarzschild radius. If there’s too much mass/energy in a region, it collapses into a black hole, and the quantum computer (and its usual description) breaks down (and is crushed a second later).

But just like we know that this nonlocality and refusal to acknowledge complete independence of various regions exists, we also know that it is very remote, and that its effect on mundane, low-energy, low-density, quantum-computing-like experiments is negligible. The restriction only occurs when the density of information approaches 1 bit per Planck area (measured on the surface of the region, which is what is relevant because quantum gravity is holographic), about 10^{-70} square meters. If you tried to impose restrictions on the Hilbert space of a local quantum field theory that started to operate at much larger areas, it would be equivalent to claiming that Newton’s constant G is much larger, i.e. that gravity is much stronger. Because black holes clearly saturate the entropy bound and would be restricted, you would have to reduce your idea of the black hole entropies, and it would become impossible for some stars to collapse into black holes – because such a process would disagree with the second law of thermodynamics – and so on.
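[Editorial aside: the bound invoked here is the Bekenstein-Hawking entropy, with the Planck area setting the 10^{-70} m² scale quoted above:]

```latex
S_{\text{BH}} \;=\; \frac{k_B\, c^{3} A}{4\, G\, \hbar},
\qquad
\ell_P^{2} \;=\; \frac{G\hbar}{c^{3}} \;\approx\; 2.6\times 10^{-70}\ \text{m}^{2}.
```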

We actually have lots of other physics reasons and arguments – not just playing with individual qubits at the “fundamental” level – to be sure that our description is right even for heavily multi-body systems. Thermodynamics really allows us to work with entropy, and we do measure entropy experimentally. A certain entropy simply means that there exist exponentially many mutually orthogonal (classically mutually exclusive) states in the Hilbert/phase space. This may be verified by thermal experiments. To ban most of the microstates would mean seriously reducing the heat capacity of the object. But we just know that the heat capacity of a proposed design for a quantum computer can’t be much lower than predicted by quantum mechanics. It’s just a piece of matter, and similar matter has been subjected to tons of thermal experiments since the 19th century. When cooled near absolute zero, only some degrees of freedom survive, but the entropy will still be extensive – this has been tested with great accuracy – and in the quantum framework, it means that the dimension of the Hilbert space grows exponentially with the size.
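[Editorial aside: the counting argument above can be made concrete with Boltzmann’s S = k ln W; the 1000-qubit figure below is just the example size from this thread:]

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def hilbert_dim(n_qubits):
    """Dimension of the joint state space of n two-level systems: 2^n."""
    return 2 ** n_qubits

def microstates(entropy):
    """Invert Boltzmann's S = k ln W: a measured entropy S certifies
    W = exp(S / k_B) mutually orthogonal microstates."""
    return math.exp(entropy / k_B)

# An extensive measured entropy worth 1000 bits (S = 1000 k_B ln 2)
# certifies ~2^1000 orthogonal states: the state space of 1000 qubits.
S_measured = 1000 * k_B * math.log(2)
print(math.log2(microstates(S_measured)))  # ~ 1000.0
print(hilbert_dim(10))                     # 1024
```

This is why an ordinary thermal measurement of extensive entropy is already indirect evidence for an exponentially large state space.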

Whether people will be able to overcome all the technical problems with QC – and whether there is a mistake in the proofs that QC is immune against various types of real-world inaccuracies and noise – may be an open question. But the fact that somewhat larger systems with 100 or 1,000 qubits obey the postulates of quantum mechanics as safely as systems with 1 or 2 or 3 or 4 qubits is something I am ready to bet my head upon (as in guillotine) simply because I know that we have verified QM on systems with a small number of particles as well as much larger numbers of particles. QM isn’t just a theory of small systems; it’s a theory of all systems and it’s essential whenever the classical limit is unjustifiable.

I think that if you work 1,000 times harder than Brady and Anderson, you will not get a working, inequivalent, non-quantum description of the phenomena for which quantum mechanics has worked. Instead, if your work is impartial, you will end up seeing what I already see now, namely that such an alternative theory just can’t work. Well, maybe you have to work 10,000 times harder than Brady and Anderson, but the result – a clear understanding of these basic conceptual issues – is a prize that deserves and justifies this 10,000-times-harder work.

All the best

Lubos

Comment #48 February 6th, 2013 at 2:48 am

Luboš #47: As you might recall, I’ve offered $100,000 for a convincing argument that scalable QC is impossible in the physical world, and I don’t have that kind of money to throw around casually. If you like, my offer corresponds, not to my assigning the speculation in my comment an 0.0000000000001% probability (like Elvis on the Moon), but certainly to my assigning it an 0.1% probability or less. I won’t go much lower than that, simply because the fact that we can’t think of a “non-spiritual” way to separate the states arising in Shor’s algorithm from the many-particle states arising in current experiments, and even have what look like good arguments against such a separation (e.g., your entropy argument, which of course assumes QM), doesn’t tell me there isn’t a separator that would be obvious to physicists of future generations. Sure, it would require a deformation of QM unlike anything we’ve seen, but I don’t see that it takes us into “Elvis lives” territory. And also, I’m pretty sure an examination of the history of physics would show that discoveries to which people would previously have assigned an 0.0000000000001% probability have happened at least 0.1% of the time.

But even if — as you and I strongly predict — the effort to build a scalable QC “merely” results in actually building a scalable QC, I still say we would have learned something important. And not merely because of the intellectual inadequacies of the people who had smidgens of doubt. By analogy, if someone proved the Riemann Hypothesis, I wouldn’t say that person had “merely” achieved the same level of certitude as a physicist who’d long ago told all his friends that he was “morally certain” of the hypothesis’ truth, with no proof! Having a proof is qualitatively different.

Comment #49 February 6th, 2013 at 3:33 am

Very interesting paper. This soliton model is something I would like to see thoroughly explored, as I am always interested in classical models of QM behavior. I certainly agree that the inability to derive QM from classical physics might simply be a failure of imagination.

As for the Bell inequalities, the paper does not deny them, but claims another loophole due to the fact that solitons in this model propagate in a common density-wave background, so to speak (if I am getting it right). But I agree they should explain in much more detail how this loophole works and why it lets them circumvent the Weihs and Salart experiments.

Comment #50 February 6th, 2013 at 4:07 am

John Sidles #44: The reason I commented on this paper was simply that lots of people asked me to!

In general, though, I disagree with the idea that in any intellectual dispute, we have a moral obligation to respond exclusively to our opponents’ best arguments, and to let their dumbest, most egregiously-wrong arguments pass completely without comment. Yes, if we care about truth we’d better respond to the former. But puncturing the latter can also be a useful service to the public — besides being easy and fun!

Comment #51 February 6th, 2013 at 4:10 am

Scott #43:

Robert #41 makes two points. Let me deal with each, separately.

1. I don’t know anything that would settle the point about the “16” limit.

My hunch would be that the authors are wrong in prescribing such a limit. However, since I don’t understand either their theory or TCS in general to a sufficient extent (and that’s why I was insisting on his supplying a C++ program), I am unable to see whether the authors have an intuitively unbelievable but rigorously provable result.

For an example of such a theorem (intuitively unbelievable but actually provable), I can cite the case of Huygens’ principle supposedly not working out in 2D. Completely unbelievable, but “true.” (http://www.physicsforums.com/showthread.php?t=148787)

BTW, if you ask me, IMO, that theorem is based on a wrong way of understanding Huygens’ principle. But, yes, if you grant the mathematicians’ definition of what constitutes that principle, then, sure, the proof, by itself, is valid. It’s just that those definitions themselves are different from what people understand on common-sense physical grounds. Those definitions themselves are only tenuously understandable. Further, they (and the 2D limit) can be made completely irrelevant in a physically sound and simpler view of Huygens’ principle.
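[Editorial aside: the 2D theorem mentioned here can be read off the retarded Green’s functions of the wave equation, in units with c = 1. The 3D response is a sharp shell on the light cone, while the 2D response has a tail inside it, so sharp Huygens propagation fails in 2D:]

```latex
G_{3D}(r,t) \;=\; \frac{\delta(t-r)}{4\pi r},
\qquad
G_{2D}(r,t) \;=\; \frac{\Theta(t-r)}{2\pi\sqrt{t^{2}-r^{2}}}.
```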

The point is: since I don’t understand the authors (A+B), I would allow them (at least for the time being) the possibility that they could be on to something similar: something that is only tenuously understandable, and is a minor quirk of theory that doesn’t at all matter in practice (just the way I can always apply Huygens’ principle also in 2D — following another, local definition of the principle).

2. However, when Robert (#41) comes to this: “They are not called qubits because you cannot do a quantum computation with them,” he does have an absolutely valid point. He also points out the reason why.

This point of his remains valid, even if the objection that Slava (#35) points out, concerning the absence of an explicit model for entanglement, also remains relevant! The authors (A+B) need to respond to that.

Enough, for the time being.

Ajit

[E&OE]

Comment #52 February 6th, 2013 at 5:51 am

Ajit #51:

The deal breaker is that classical models, however beautiful, have no capacity to approach any problem that “orthodox” quantum mechanics solves using the notion of entangled states (aka many-particle superpositions), regardless of interpretation.

The point of my blog post is that this set of problems inaccessible to A+B covers an overwhelming majority of the situations to which physicists have ever applied the quantum formalism. Quantum computation is just a modern, tiny subset. The fluid-mechanics approach to de Broglie waves has been dead since the 1920s and has no bearing on the limitations of QIP.

Scott #50:

If the 3 points in your original post are what you have identified as the “opponents’ best arguments,” I’d counter that these are not even arguments, just unsubstantiated statements. In my physicist’s view, their best argument is Brady’s explicit model, so I attacked it (for the same social reason as yours – people have asked me about it).

Comment #53 February 6th, 2013 at 6:17 am

Slava #52: Thanks for sharing!

If the 3 points in your original post are what you have identified as “opponents’ best arguments”…

No, I wasn’t suggesting anything of the kind! I’m not sure that I know how to identify the “best” arguments for why QC must be impossible, any more than I know how to identify the “best” arguments of the creationists or 9/11 truthers. On the other hand, in many previous posts on this blog (most recently here), I’ve addressed anti-QC arguments that at least weren’t in direct contradiction with existing experiments, and that I found noticeably more informed and interesting than this one.

Comment #54 February 6th, 2013 at 8:19 am

LOL … Scott, isn’t it the case that post #50 knocks down a strawman that no one has ever advocated?

To articulate the “Monarchy” critique (of #44) more completely, it commonly happens that FTQC enthusiasts defend the “House of Hilbert monarchy” by arguing that the sole alternative to the House of Hilbert is a nonviable anarchy of theories that are physically incomplete and/or mathematically immature and/or just plain wrong.

However, the text associated to Wikipedia’s amusing gallery of European monarchs suggests a third alternative for the future of quantum dynamical STEM studies:

When we reflect upon the evolution of the quantum dynamical literature, we perceive that during the early 20th century, the House of Hilbert Space reigned over science with all the uniting vigor of Wilhelm I.

Nowadays however, the STEM-wise influence of the House of Hilbert is greatly diminished, as practical dynamical computations increasingly employ mathematical frameworks that do not extend naturally to unitary evolution upon state-spaces of any dimensionality.

In consequence, the role in science of today’s House of Hilbert is evolving to be less evocative of the intimidating authoritarianism of Kaiser Wilhelm I, and more evocative of the comforting and popular — but nowadays largely ceremonial — role of Beatrix, Queen of the Netherlands.

Summary: The House of Hilbert formally reigns over the STEM enterprise, but in practice doesn’t. FTQC enthusiasts envision the restoration of the House of Hilbert as an absolute STEM monarchy … yet that restoration is comparably as likely to happen as Queen Beatrix of the Netherlands seizing the reins of power.

Conclusion: The Aaronson $100,000 wager is fiscally safe (both now and in the foreseeable future). But operationally, the Aaronson wager is lost already (largely at present and increasingly in the foreseeable future).

Comment #55 February 6th, 2013 at 9:03 am

Lou #45, I disagree that general relativity is a “qualitatively different” explanation of the gravitational force than Newton’s theory. The full exact theory starts with more advanced, more geometric principles but when one focuses on the actual gravitational force in the contexts previously described by Newton’s theory, it’s strikingly obvious that general relativity is just a deformation of Newton’s theory. It may be reorganized as Newton’s theory plus corrections that go to zero in the “c to infinity” and “G to zero” double limit. That’s what I don’t call a qualitatively different explanation of the force.

On the other hand, refuting the very existence of the exponentially many states and their arbitrary superpositions *is* a qualitative denial of the basics of quantum mechanics; it would be a qualitatively different theory. Of course, *within* quantum theory, one may deform existing theories – their Hamiltonians – by adding new fields or other degrees of freedom and new interaction terms to the Hamiltonian, etc. But that’s not a deformation of the postulates of quantum mechanics, which have to stay completely constant, because any nonlinear or other deformation of the postulates of quantum mechanics would lead to a logically inconsistent theory, e.g. one in which P(A or B) isn’t equal to P(A)+P(B)-P(A and B).

Cheers

LM

Comment #56 February 6th, 2013 at 9:43 am

John Sidles #54:

The Aaronson $100,000 wager is fiscally safe (both now and in the foreseeable future). But operationally, the Aaronson wager is lost already (largely at present and increasingly in the foreseeable future).

I don’t know what the hell that means. I’ll tell you what: you can have all of my “operational” money, if I can have just half of your “fiscal” money!

Comment #57 February 6th, 2013 at 9:48 am

Nex #49: thanks; this is exactly our intention – to see how far a classical model of QM can be taken. It was really surprising to get a decent model of the electron; what more can we do?

Ajit #51 and Slava #35: in the sonon model, two particles are entangled if their \chi waves are phase coherent.

Slava #52 and Lubos #47: You are right to say that a lot more work is needed before mainstream physicists will accept the sonon model as an explanation for QM. Lubos, you say we need to do 1000 times more work. For reference, Robert spent 50% of his time last year working on this and I spent perhaps 5%, so call it half a year. Spending 500 person-years on classical models of QM would cost $50m and would presumably need a DARPA BAA spread over a dozen universities for five years. I can’t see us making that sale just yet. Slava, I fully agree that Robert’s model is our best argument; you want us to extend it to cover the standard model, the exchange interaction, the gyromagnetic ratio, superconductivity and much else. Again, this is a lot to ask at this stage. But would you be prepared to take sonon theory more seriously if we came up with further non-trivial results, such as on superconductivity or the weak interaction?

Comment #58 February 6th, 2013 at 10:10 am

Slava #52:

>> “The deal breaker is that classical models, however beautiful, have no capacity…”

Yeah… I did appreciate that point though I didn’t jot it down explicitly. … But…

There are times when one doesn’t want to be hair-splitting.

It’s obvious that the moment you say “classical,” you immediately forgo “quantum-mechanical.” By definition. Sticking to the definitions, this part is very obvious. The point isn’t that.

The point is this: Even if someone puts forth a new view of QM that is (ultimately mistakenly) advertised as being “classical,” and even if it’s not a “complete” theory addressing all the postulated aspects of QM, but this new view actually has sufficient departures from what the term “classical” strictly means and demands to make it interesting, then: in passing judgments, should we be making appeals to definitions? I think not. … And, in fact, I think, in your post at your own blog, you, too, actually did not.

So, the deal-breaker isn’t that they describe it as a “classical” model; the deal-breaker seems to me to be that their description is not (even near-sufficiently, let alone “completely”) quantum mechanical.

I use the C++ program as a heuristic device. I mean it in the sense that the basic argument has to come from theory — a simulation cannot be a substitute for a theory. Yet its enormous utility in ensuring the “specific-ness” and “completeness” of a description should be obvious.

I mean, suppose that you already had this program for A+B’s theory (made available by them). Wouldn’t it then be so very easy to ask them to identify the line of code where they began dealing with, say, entanglement (within a self-advertised “classical” framework)? Or with superpositions? Or with the QM “collapse” (per the Copenhagen interpretation)? That’s what I meant by “specific-ness.” Programs compel you to be specific.

As to “completeness,” a (good) simulation would help prevent all the futile discussion based on afterthoughts: it would force the programmer to fully incorporate the QM aspects, simply because their absence would be so directly noticeable. I would expect a QM simulation to be capable of addressing at least the single-particle double-slit interference situation, if not also the Bell inequalities, the delayed-choice eraser, etc. The double-slit interference would make for enough “completeness,” in practice, including a fairly comprehensive indication of the handling of the auxiliary data.

Of course, to be fully satisfactory, the program documentation would also have to show how its design and implementation fully address a decent postulatory description of QM, say as found in any standard UG text on mainstream QM (e.g. Eisberg & Resnick, Griffiths, Gasiorowicz, etc.). The modeled situation itself may be elementary and only an example (as in the double-slit interference). But the simulation has to show how the program, at least implicitly, implements the entire set of the QM postulates for one specific application case — and in what sense.

BTW, if it’s a new view of QM, it would have to differ in some sense from the mainstream QM, just the way SR deviates in some ridiculously small but still in-principle quantifiable sense from Newtonian mechanics, even at “everyday” low speeds. The simulation would have to identify how — in what kind of limit — its description converges to the mainstream postulatory QM (which is an unsatisfactory and broken description, IMO).

Alright. Maybe this reply has become too long and boring. I just wanted to jot down what I meant, what it is that I am usually looking for, and why.

* * *

Enough for now. Will check back tomorrow.

Ajit

[E&OE]

Comment #59 February 6th, 2013 at 11:13 am

Scott #43: I hope it is clear why we claim it is an order of magnitude harder to produce numbers greater than 16 using Shor’s algorithm. You suggest a quantum computation that would be required to calculate the number 3. This would not be a contradiction, because 3 is less than 16.

On the second part of your response (and thank you for your input Ajit #51)

1. As a graduate student I learnt many-body theory, and I am sure we share the experience. But your response seems to suggest you think there is something wrong with this theory. If so, what precisely is wrong with it?

2. It is the usual procedure in many-body theory to treat unexcited entangled items as a ground state. Do you disagree with this procedure?

3. The ground state acts as a single entity. Brian Josephson shows this explicitly for superconductors in the reference cited. Do you think there is something wrong with that?

4. A single entity can encode at most one qubit for the purpose of computation. What is wrong with that?
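An aside on the arithmetic behind the “greater than 16” threshold discussed in this comment and in #61 below: 323 = 17 × 19 is the smallest semiprime with two distinct prime factors both exceeding 16 (allowing a repeated factor, 17 × 17 = 289 is smaller). A quick classical check, purely illustrative (the helper below is hypothetical, not from either paper):

```python
def prime_factors(n):
    """Trial-division factorization; adequate for small n."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# Scott's proposed target below: 323 = 17 * 19, both prime factors > 16.
assert prime_factors(323) == [17, 19]

# Smallest semiprimes whose prime factors are all > 16:
semis = [n for n in range(2, 400)
         if len(prime_factors(n)) == 2 and min(prime_factors(n)) > 16]
print(semis[:2])  # [289, 323]
```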

Comment #60 February 6th, 2013 at 11:40 am

“this is exactly our intention – to see how far a classical model of QM can be taken.”

It can’t be taken past the Bell inequalities, I can tell you that.

“It was really surprising to get a decent model of the electron; what more can we do?”

What more can you do? A decent model of two electrons. That’s the hard part.

Comment #61 February 6th, 2013 at 11:50 am

Robert Brady #59: I see, so both prime factors need to be greater than 16 in order to satisfy you. We should wait for the use of Shor’s algorithm to factor 323, then?

Regarding your four-part syllogism, what’s wrong with it is that the only thing that justifies calculations that treat the ground state as a “single entity” is the existence of a more fundamental theory, according to which there are actually far more degrees of freedom there than just one qubit (but they’re behaving collectively). Treating a many-body approximation as if it gave you the fundamental degrees of freedom, while ignoring the degrees of freedom of the very theory (QM) that the approximation rests on top of, is like building a house on air. You don’t get to do that without first laying more solid foundations.

Comment #62 February 6th, 2013 at 11:59 am

Greg #60, Scott (introduction) and others: regarding Bell’s inequality, does section 5 of this paper provide the information you require? I am afraid it does assume familiarity with Cramer’s transactional model and with Mead’s model, and of why they are consistent with Bell’s inequality experiments.

Regarding the analogue of two entangled electrons, does Figure 5 of the same link, and the surrounding text, give you the information you require? If not I would be pleased to provide more.

Comment #63 February 6th, 2013 at 12:32 pm

Brady #62: Are you claiming that there’s a classical mechanism which can give results which violate the Bell inequality?

Comment #64 February 6th, 2013 at 12:49 pm

Scott #61 Yes, both prime factors need to be greater than 16 since it is necessary to exclude calculations that might be done with fewer computational qubits.

Have I understood you correctly? In the introduction to this blog, you comment approvingly on the Josephson effect. I don’t want to put words into your mouth, but you now seem to be saying that Brian’s original thesis ignores “the existence of a more fundamental theory, according to which there are actually far more degrees of freedom.”

Can you elaborate?

Comment #65 February 6th, 2013 at 12:56 pm

Robert – Of course it’s not satisfactory. Drawing a figure of two electrons is not the same as modeling two electrons with an equation. Listing a few citations to other people’s older models of quantum mechanics is also not the same as you giving a model of two electrons.

You’ve made a complete muddle of the issue of Bell inequality violations. This is really key, because the entire topic of quantum computing is a gargantuan extension of Bell violations. In one paper there is an indecipherable claim that a quite strict Bell violation experiment (reference [36]) has loopholes. In another paper there is a figure and there are citations, but there is no equation for two electrons that either allows or prohibits Bell violations. One of the citations is to Carver Mead, whose model is non-local and explicitly allows superluminal Bell violations, thus contradicting the need to claim loopholes in the other paper.

Comment #66 February 6th, 2013 at 12:57 pm

Lubos #47: The difference with the other scientific theories you consider is that in the case of quantum computation there’s another very large piece of evidence – specifically, the extended Church-Turing thesis – arguing for the other side. (The thesis says that any reasonable model of physics can simulate any other reasonable model of physics with only polynomial slowdown.) Granted, the extended Church-Turing thesis is more a high-level concept than a low-level one, but it’s so extraordinarily robust that anything which violates it must be viewed with extreme skepticism.

Of course, when two fundamental scientific theories contradict each other that’s fertile ground for coming up with and performing experiments which force the issue. The entire field of quantum gravity is dedicated to exactly this sort of program, and there the two theories don’t even contradict each other, just don’t merge nicely. It can be hard to even design the experiment in some cases, and Aaronson has done the best job so far of proposing such an experiment, which has in fact actually been done, and thus far quantum computation has held up admirably. I wouldn’t say that it’s scaled up enough that I’m convinced that it won’t run out of juice eventually – there are, for example, analog classical mechanisms which in principle are able to do arbitrary calculations and work okay at a small scale but break down when scaled up – but I view this line of research as extremely important.

Comment #67 February 6th, 2013 at 1:10 pm

Bram – Except that the polynomial Church-Turing thesis isn’t evidence, it’s theory. It’s a theory with accumulating evidence against it.

Comment #68 February 6th, 2013 at 1:10 pm

Bram Cohen #63 Yes. Bell proposed experiments that test for either non-local or non-causal processes. Cramer’s transactional model exploits the non-causal element, which is also present in the solutions to Euler’s equation because it is time reversal symmetric. See section 5 of this paper for references and a discussion which is unfortunately rather brief and addressed to those familiar with these papers — perhaps it might be expanded upon.

From an aesthetic point of view it would be preferable not to have to use the time reversal symmetry. It might be possible to do this, as explored in the joint paper with Ross.

Comment #69 February 6th, 2013 at 1:29 pm

Robert #64: Fine, I’ll elaborate. If you accept QM, then the states created in these superconducting experiments are basically cat states, something like (|0⟩^n + |1⟩^n)/√2, where n is the number of electrons (which could be in the billions). Now, you point out correctly that, while the underlying Hilbert space might have 2^billion dimensions, the Hilbert space relevant to these particular experiments is merely a 2-dimensional subspace, the one spanned by |0⟩^n and |1⟩^n. And you therefore declare yourself satisfied that “only one qubit” is there.

Unfortunately, that’s not even the start of the beginning of an acceptable answer. As Greg #65 stressed, you don’t get to replace the precise predictions of QM by slippery verbal reasons-why-you’re-not-yet-proven-wrong that change from one experiment to the next. Instead, you need to replace QM by an alternate mathematical theory that

(1) also describes anything that could possibly happen to a many-particle quantum system (not just one particular thing),

(2) agrees with all experiments that have already been done, but

(3) unlike QM, does not require an exponentially-large Hilbert space.

The reason many people here are getting exasperated with you is that you seem to have no inkling of what would actually be involved in constructing such an alternate theory.

Comment #70 February 6th, 2013 at 2:26 pm

In multiple comments, Shtetl Optimized readers have expressed skepticism that the mathematical framework of the Anderson/Brady preprint is adequate to build a viable theory of quantum dynamics upon (and personally I share that skepticism). And yet, reasonable grounds exist to extend that same mathematically-grounded skepticism to orthodox FTQC. As was previously noted:

Hmmm … to address Scott’s concerns, let’s make explicit the metaphorical argument (of #54), by focusing not upon “mere” money, but rather upon rational investments of every researcher’s most precious resources (students especially): time, attention, and imagination!

Commonly students learn undergraduate-level quantum mechanics from texts that include (everyone has their own favorite list) Feynman’s Lectures, Dirac’s Principles of Quantum Mechanics, Gottfried’s Quantum Mechanics, Landau’s Quantum Mechanics: Non-Relativistic Theory, and Nielsen and Chuang’s Quantum Computation and Quantum Information (there are hundreds more).

To pursue serious research, starting at the graduate level, an entirely new set of textbooks enters the picture, including (again, everyone has their own favorites, and the following texts are all highly-ranked Amazon.com best-sellers) Spivak’s Calculus on Manifolds: A Modern Approach to Classical Theorems of Advanced Calculus, Frankel’s The Geometry of Physics, Nakahara’s Geometry, Topology and Physics, Lee’s Introduction to Smooth Manifolds, Nash’s Topology and Geometry for Physicists, and Zee’s Quantum Field Theory in a Nutshell.

What is striking about this second list is how sparse the references to Hilbert space are (in the narrow sense of Feynman/Dirac etc.). Slowly the realization dawns at the graduate level: “I’ve studied these texts in the wrong order! It’s better to study Spivak first, not last!” And indeed this modern mathematical sentiment is vigorously espoused by Amazon reviewers.

When quantum dynamics is appreciated through the lens of the second reading-list, it becomes apparent that rather little (if any?) of modern quantum research depends essentially upon the absolute existence of an exponential-dimension Hilbert/Dirac state-space. It is necessary only that the dynamical state-space be effectively Hilbert/Dirac … and the set of dynamical manifolds having this property is vast (as the second reading-list is at pains to inform students).

In view of this burgeoning literature, and the increasing desirability (for students) of mastering the requisite mathematics before tackling the quantum physics, it is not unreasonable to foresee that (to paraphrase Einstein): indeed, the literature of the most recent decade amply documents that the 21st century’s inexorable supplanting from center-stage of the 20th century’s cherished Hilbert space already is well underway … and thus in the decades to come, a return of (our still-cherished) Hilbert space to center-stage of quantum dynamical research is about as plausible — and about as desirable too! — as (the still-cherished) Queen Beatrix reigning as absolute monarch of the Netherlands.

Comment #71 February 6th, 2013 at 2:33 pm

Ross #57:

No. Your use of \chi wave corresponds to treating the 2 sonons as bosons in the same quantum state – a coherent two-particle condensate which is a product state with no entanglement.

Forget about all the complex stuff. Just show the community how two electrons form a singlet and a triplet. No big deal, just a little element towards a minimally realistic approximation of helium / positronium.

Unfortunately, there is no way you or anybody else can make it with a single-fluid hydrodynamic model. One \chi for all is not what many-body quantum physics is about.

And if Robert’s model is your best argument, then there is no argument.

Comment #72 February 6th, 2013 at 2:45 pm

BTW, as Joe has rightly pointed out on my blog, there is no way to tell whether sonons are bosons or fermions. End of story.

Comment #73 February 6th, 2013 at 3:16 pm

Many – so many comments, particularly on spin symmetry, entanglement, and Bell’s inequality, to which I will respond shortly.

Scott #69 Thank you! I now understand. I was indeed referring to what you call the “underlying” theory beneath the measured entangled states. I had wrongly assumed you were too.

The “underlying” theory is described in Brian Josephson’s thesis. On page 18 he introduces the operator exp(i N \theta) which connects states with N and N-2 electrons in a bulk superconductor. Its value is S = exp(2 i \theta) where \theta turns out to be the phase observed in a Josephson junction. In this way, Brian reduces the large number of electrons down to a single parameter – the phase – which is measured in a Josephson junction. This is the parameter which underlies the correlated phenomena you refer to.

At this underlying (Josephson) level, the phase can code for at most one computational bit, even though it encompasses a very large number of electrons in the ground state.

Are we moving towards agreement, at least on this issue? If so, that would be some progress on the primary subject of this blog!

Comment #74 February 6th, 2013 at 3:40 pm

Robert #73: No, we are not moving toward agreement. You keep talking about the effective theory of one specific collective phenomenon, and I keep trying to get you to focus on the only general theory we know (QM) from which that effective theory can be derived — a theory that implies the existence of vastly more degrees of freedom in the system, which could be probed by some other experiment if not by this specific one. At this point, I basically throw up my hands: I’ve explained it to you as clearly as I know how. Maybe someone else can take a crack at explaining it.
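The dimension-counting point at issue in #69 and #74 can be made concrete in a few lines of NumPy (an editorial sketch, not from any of the papers under discussion): a cat state on n qubits spans only a 2-dimensional subspace, yet the ambient 2^n-dimensional space is physically accessible, since even a single-qubit rotation immediately leaves that subspace.

```python
import numpy as np

n = 10  # number of qubits (the superconducting case would have n in the billions)
dim = 2 ** n  # full Hilbert-space dimension grows exponentially

# The "cat" state (|0...0> + |1...1>)/sqrt(2) lives in the 2-dimensional
# subspace spanned by the basis vectors |0...0> and |1...1> ...
zero = np.zeros(dim); zero[0] = 1.0        # |00...0>
one = np.zeros(dim); one[dim - 1] = 1.0    # |11...1>
cat = (zero + one) / np.sqrt(2)

# ...but the other 2**n - 2 dimensions are probed as soon as anything else
# happens, e.g. a Hadamard rotation on the first qubit alone:
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
U = np.kron(H, np.eye(dim // 2))  # H on qubit 0, identity on the rest
rotated = U @ cat

# Probability weight remaining in the 2-dim "one qubit" subspace is only 1/2:
proj = abs(zero @ rotated) ** 2 + abs(one @ rotated) ** 2
print(round(proj, 3))  # 0.5
```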

Comment #75 February 6th, 2013 at 4:19 pm

Robert Brady #73, in the fifty years since the 1962 Josephson thesis that you cite, a tremendous amount has been learned regarding antisymmetrized quantum dynamical state-spaces (per arXiv:math/0005202, arXiv:math/0208166, and arXiv:1110.6367 for example).

Without in the least laying claim to an expert level of personal expertise, it seems (to me) reasonable to anticipate that substantial advances in our physical understanding of quantum dynamics will at least refer to these substantial (and ongoing) advances in our mathematical understanding of quantum dynamical state-spaces (in all of their variously symmetric/antisymmetric/asymmetric varieties).

Perhaps Shtetl Optimized readers can identify fundamental advances in quantum-physics understanding that were not accompanied by and/or authorized by and/or catalytic agents of fundamental advances in mathematical understanding?

Comment #76 February 6th, 2013 at 5:45 pm

Slava Kashcheyevs #71 #72

Greg Kuperberg #65

and others interested in spin symmetry and the standard model.

Much more could indeed be done on particle symmetries. This is what we know for the R11 sonon.

(a) It has spin-half (Fermi) symmetry, as can be seen from the Pauli matrices after equation 8 here.

(b) The lowest energy state of two such sonons has spins opposed, as can be seen from the interaction energy (equation 9) and the symmetry of the spherical Bessel function discussed thereafter.

The R10 sonon has a lower order symmetry (since n=0), and the higher families of sonons presumably have more complex symmetries. The details might be an area for further investigation.

Comment #77 February 6th, 2013 at 6:36 pm

Discussion of Blogs, Comments, etc, from Sci Am:

http://blogs.scientificamerican.com/a-blog-around-the-clock/2013/01/28/commenting-threads-good-bad-or-not-at-all/?WT_mc_id=SA_WR_20130206

SO fits into this picture.

Comment #78 February 6th, 2013 at 7:27 pm

Robert – I am not all that super interested in spin symmetry or the standard model. All I said was that you don’t have a credible discussion of Bell inequality violations. Some discussion, yes, but nothing that hangs together at all.

Comment #79 February 6th, 2013 at 7:32 pm

Brady #68: What section 5 makes clear is that you’re proposing a classical system. The problem then is that Bell’s theorem isn’t a guideline or principle, it’s a theorem, so no amount of pointing to complex mathematical machinery in references is going to get people to read them, because you might as well be claiming to have found a way to trisect an angle with a compass and straightedge.

Comment #80 February 6th, 2013 at 7:34 pm

Hey, everyone, it’s hopeless. You are not going to convince Brady of anything. He obviously has no understanding of the vast range of physical phenomena that cannot be explained without quantum mechanics, and of how well tested and tightly constrained our current physical models actually are.

Comment #81 February 6th, 2013 at 7:41 pm

Greg #67: You could say that the second law of thermodynamics is ‘just a theory’ as well, that doesn’t stop people who claim to have violations of it from rightfully being dismissed as cranks out of hand. Granted the second law seems *more* fundamental than ECT, but that’s like saying helium is less common in the universe than hydrogen. Note that I’m not dismissing the evidence against ECT, just expressing far more skepticism that QM will continue to hold up as the experiments progress than some others have.

Comment #82 February 6th, 2013 at 9:08 pm

I just read this paper and I found it quite amazing:

First there is the reference to a video clip with Morgan Freeman (quite interesting how the drops bounce around on the vibrating plate), then they mention de Broglie-Bohm, an interpretation which is completely equivalent to Copenhagen (and others), at least in the non-relativistic case, and finally they present a ‘soliton model’ of the electron, which reminds me of Lord Kelvin’s vortices-in-the-ether proposal of the 19th century (but I think L.K. made more sense).

This is then used to make an argument about quantum computers.

If this is physics in the 21st century then I want the 20th century back…

Comment #83 February 6th, 2013 at 10:25 pm

wolfgang #82:

If this is physics in the 21st century then I want the 20th century back…

No, this isn’t “physics in the 21st century”—it’s just two guys trying to overturn modern physics, far from the first or the last. Of course there’s a selection effect; stuff that’s actually representative of “physics in the 21st century” is less likely to lead to emails asking me for comment or to an annoyed blog post like this one.

Comment #84 February 6th, 2013 at 11:19 pm

Bram – At this point the second law of thermodynamics is almost a mathematical theorem rather than a separate physical theory. That’s different, that’s something that you would already believe even if it weren’t tested. In any case it has been confirmed many times, except in regimes (such as cosmology) where it doesn’t apply without modification.

There is no theorem supporting the polynomial Church-Turing thesis within the current laws of physics. On the contrary, quantum probability is true and within quantum probability, the polynomial Church-Turing thesis is close to disproven rather than proven. So you shouldn’t believe the polynomial Church-Turing thesis, for the same reasons that you should believe the second law of thermodynamics.

Comment #85 February 7th, 2013 at 12:33 am

Robert Brady #76

Sure, I have understood from your paper the claim that a sonon has spin 1/2 and that two sonons couple anti-ferromagnetically. This does not bring you any closer to constructing a singlet or demonstrating the exchange symmetry (boson/fermion/anyon).

Comment #86 February 7th, 2013 at 3:29 am

John #75 Good comment. See the 50 year update conference.

Scott #74 Oh yes we do agree! Fortunately, you do not need to rewrite Brian’s thesis in order to analyse quantum computing using Josephson junctions. If the Josephson phase changes by 2 \pi around a loop, this is called a flux quantum. Tunnelling of flux quanta was first observed by John Clarke (see conference link above). In fact, these quanta appear to behave like individual quantum mechanical entities. The analysis you describe can be applied to them and the usual results of quantum computing follow.

Each individual flux quantum is a collective phenomenon. I hope you will agree it does not contain millions upon millions of computational qubits — unless you think Josephson’s thesis doesn’t apply to these experiments, in which case please specify!

Comment #87 February 7th, 2013 at 6:18 am

Bram Cohen #79 I understand your question. Let me describe why the motion of sonons is consistent with Bell’s analysis.

The coherent motion of sonons at low velocity obeys equation 11, and its trajectory obeys equations 12 and 13. Equation 11 is the same as the Schrodinger equation, and the probability of a trajectory reaching (x, t) is | \psi(x,t) |^2 (which follows from (12) and (13) — the paper reproduces Bohm’s reasoning). These are the same equations on which Bell’s analysis is based, and therefore it would be surprising if the motion of sonons were inconsistent with it. I do not think this is controversial, but please tell me if it is not clear.

I think the debate is about how to interpret the consequences, which seem to be counter-intuitive. Cramer’s transactional interpretation of quantum mechanics is only one of the possible ways in the literature to interpret it; it was not intended to be a proof.

Comment #88 February 7th, 2013 at 6:27 am

Slava Kashcheyevs #85 There is obviously a lot of detail required in these areas in order to satisfy you.

I think your question, or at least similar questions that are relevant to particle physicists, is answered in the extensive literature on the subject. As you will know, the same compressible inviscid fluid is studied in the field of analogue gravity — see Barcelo’s review article. See in particular Volovik’s book regarding particle symmetries and the standard model.

Happy reading!

Comment #89 February 7th, 2013 at 6:56 am

Wolfgang #82 Thank you. There are some similarities, as you suggest. However, sonons are irrotational, unlike vortex atoms. You may want to look at the online talks at the recent conference on tightly knotted and linked systems, which in some respects are the successors to Lord Kelvin’s vortex atoms.

Comment #90 February 7th, 2013 at 8:32 am

Lubos #55,

You say: ‘I disagree that general relativity is a “qualitatively different” explanation of the gravitational force than Newton’s theory.’

Of course you are entitled to your own opinion of ‘how different’ two theories are, but I suspect you are the only person on the planet with this view. Newton’s is an action-at-a-distance theory in a flat space-time with no mechanism. GR provides the equivalent of Newtonian forces via shortest paths in curved space-time. It was so weird at the time that only a few people even understood how it might work. For example, Planck said, of combining gravity with SR:

“As an older friend, I must advise you against it, for, in the first place you will not succeed, and even if you succeed, no one will believe you.”

Hermann Weyl, another rather sharp guy, said of GR:

“It is as if a wall which separated us from the truth has collapsed. Wider expanses and greater depths are now exposed to the searching eye of knowledge, regions of which we had not even a pre-sentiment.”

A typical (I believe) modern view is that of Ashtekar:

“Space-time is not an inert entity. It acts on matter and can be acted upon. [...] There are no longer any spectators in the cosmic dance, nor a backdrop on which things happen. The stage itself joins the troupe of actors. This is a profound paradigm shift [that]… shook the very foundations of natural philosophy. It has taken decades for physicists to come to grips with the numerous ramifications of this shift and philosophers to come to terms with the new vision of reality that grew out of it.”

I’d be very surprised, but impressed and interested, if you can find similar support for your proposition that GR and Newtonian gravity are qualitatively similar.

Comment #91 February 7th, 2013 at 9:24 am

The word “almost” is smile-inducing because it calls to mind so much STEM history:

• The Earth is almost flat.

• Malaria is almost invariably associated to the bad night air of swampy regions.

• The Parallel Postulate is almost self-evident (and/or almost a mathematical theorem).

• Planck’s radiation law almost follows from Boltzmannian statistical mechanics.

Recent work such as — to cite one article of many — Derezinski, De Roeck, and Maes, “Fluctuations of quantum currents and unravelings of master equations” (2007, arXiv:cond-mat/0703594), is exemplary of contemporary efforts to close the “almost” gap to which Greg Kuperberg’s comment #84 refers. The Anderson/Brady preprint is (as it seems to me) relatively less sophisticated, less successful, and therefore (arguably) less promising in regard to further progress.

Comment #92 February 7th, 2013 at 10:42 am

Brady #87: That is not at all clear. Could you answer simply whether your model allows non-local phenomena?

Comment #93 February 7th, 2013 at 10:54 am

I think this thread has by now got numerous high-quality comments.

At this point, I have to raise a few questions:

To A+B’s critics/detractors:

Is the real point of contention the very idea that what A+B propose is claimed to be a classical model?

To A+B:

1. You make reference to Cramer’s interpretation. As the nine-formulations paper states, Cramer’s interpretation quantitatively “makes no predictions that differ from those of conventional quantum mechanics.”

In the mainstream QM, there is instantaneous action-at-a-distance (IAD)—i.e., IAD, as distinguished from mere entanglement. For instance, in the Copenhagen interpretation, the wave-function collapse requires IAD. Inasmuch as your theory produces results that are quantitatively identical to the mainstream QM theory, your theory also involves/entails IAD. Am I correct?

2. Why must the fluid be compressible? Does it have a deep but not very obvious relevance in imparting the specifically quantum-mechanical character to your theory?

Finally, to wrap up, here’s a suggestion to A+B:

I have touched upon this point above, but wish to highlight it again, separately. I think it would help your cause if you explicitly establish how your theoretical constructs correspond to or lead to the postulates of the mainstream QM, esp. the nonrelativistic QM. In deference to the 80/20 rule, personally, I would suggest writing an article that’s accessible to someone who hasn’t read anything beyond the first half of Quantum Chemistry by McQuarrie. More sophisticated accounts could then address the remaining 20% (or even just 2%!) of the objections/queries.

Ajit

[E&OE]

Comment #94 February 7th, 2013 at 11:21 am

“Equation 11 is the same as the Schrodinger equation… These are the same equations on which Bell’s analysis is based… I do not think this is controversial, but please tell me if it is not clear.”

Actually, it’s beyond controversial. Equation 11 is the *single-particle* Schrödinger equation. In order to violate Bell’s inequalities, you need the *multi-particle* Schrödinger equation. If you don’t have that, then the entire discussion is nonsense.

Besides, you clearly are arguing in the alternative. In your comments here you accept Bell violations, but in your other paper you dismiss them as the result of loopholes.

Comment #95 February 7th, 2013 at 11:28 am

Robert #88

It may look very complicated and like it requires “a lot of detail” to you, but there is hardly a more elementary exercise in two-particle quantum mechanics than constructing and classifying symmetric and anti-symmetric states. This is where sonons fail hopelessly.

My point was not to request that you re-do all of physics, but to point out an overwhelming amount of evidence contradicting your model.

But I’ve said enough. I’m not challenging anything (except the relevance of your sonon model to real particles), so the burden of proof is not on me. Have fun.

Comment #96 February 7th, 2013 at 11:48 am

Lou #90, I am not silly so be sure that I know all the differences between GR and Newton’s theory, too. But my point is that GR doesn’t restrict the data describing Newton’s gravitational forces. On the contrary, it adds some new degrees of freedom – the metric tensor which may sustain gravitational waves even in the absence of sources – which is made necessary by the fact that the force in GR has to obey the cosmic speed limit, the speed of light.

But what is discussed here is a qualitatively different theory that would *steal* something from quantum mechanics. Clearly, the tensor-product-like exponentially growing Hilbert space (with all the complex linear superpositions allowed) seems too large and complicated to the authors discussed in this thread. So they want something “simpler” really in the sense that it subtracts the number of possible states.

My comments about GR’s being a deformation of Newton’s theory were just an example of my broader claim that there doesn’t exist a single precedent in physics in which a working theory would be superseded by a qualitatively different one that reduces the “space of states” of the older theory. This just won’t happen. Using the words relevant here, it won’t happen that quantum computers will be made impossible because a hypothetical better, future theory will prohibit the entanglement or arbitrary superpositions with many qubits.

Such a hypothetical evolution is indefensible, contradicts all known laws of quantum mechanics, has no historical precedent, and is only motivated by certain people’s limited intellectual abilities because these people just find QM too complicated and its Hilbert space too large. But it will never get smaller.

Comment #97 February 7th, 2013 at 12:02 pm

Greg #94

“If you don’t have that, then the entire discussion is nonsense.”

Yes, they don’t have that (but do not seem to realize it) and the entire discussion is nonsense.

Comment #98 February 7th, 2013 at 1:11 pm

Bram Cohen #92 Yes. As you would expect for a model consistent with Bell, the sonon energy is delocalised and the processes are not necessarily causal. See for example the delocalisation of the energy in the spin-correlated |ud> + |du> state in #14 here.

Comment #99 February 7th, 2013 at 1:36 pm

Greg #94: It’s even worse than you say. NO form of the Schrodinger equation (single-particle, multi-particle, whatever) is needed to show that QM violates the Bell inequalities. (The only thing one needs to say about time evolution is that the spin state does not change as the particles fly to the detectors.) Rather, it’s the general structure of the spin state, together with the Born rule, that leads to the violation of the Bell inequalities. What Bell showed is that this general structure of the spin state CANNOT be reproduced by ANY theory of classical probabilities that does not have instantaneous action-at-a-distance.

In 1985, David Mermin wrote a fantastic article for Physics Today, “Is the moon there when nobody looks?”, which gives a beautifully clear explanation of Bell. It’s behind the paywall at Physics Today, but a google search turns up several places where it is freely available.
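For readers who want the arithmetic behind the Bell/CHSH violation spelled out, here is an illustrative sketch (editorial, not from the thread): for the spin singlet, QM predicts the correlation E(a, b) = −cos(a − b) between measurements at angles a and b, and suitable angle choices push the CHSH combination S to magnitude 2√2, beyond the bound |S| ≤ 2 that any local classical probability model must obey.

```python
import numpy as np

def E(a, b):
    # Singlet-state correlation predicted by QM (via the Born rule)
    return -np.cos(a - b)

a, ap = 0.0, np.pi / 2            # Alice's two measurement angles
b, bp = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement angles

# CHSH combination; local hidden-variable theories require |S| <= 2
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(round(abs(S), 3))  # 2.828, i.e. 2*sqrt(2) > 2
```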

Comment #100 February 7th, 2013 at 1:39 pm

And, in many experimental tests of Bell, it’s PHOTONS, not ELECTRONS, that are used.

Comment #101 February 7th, 2013 at 1:50 pm

The assertion is incorrect: the four-letter “GCAT” hereditary state-space of DNA — as broadly foreseen by von Neumann in a 1946 letter to Norbert Wiener — is a tight restriction of the (foggily envisioned) larger-dimension protein-template hereditary state-space that was espoused in the 1930s and 40s by luminaries like Linus Pauling.

Comment #102 February 7th, 2013 at 1:51 pm

Luboš Motl #96

“Using the words relevant here, it won’t happen that quantum computers will be made impossible because a hypothetical better, future theory will prohibit the entanglement or arbitrary superpositions with many qubits.”

I like the idea to do a trick like Eric Verlinde did with gravity. Making the “old” theory indeed a statistical average (coarse graining) of the “new” theory. This not only incorporates entanglement. The holographic principle demands massive amounts of entanglement.

But I like that approach better mostly because my (lack of) math skills make it impossible for me to understand (super) string theory. Some thermodynamics I do understand.

Comment #103 February 7th, 2013 at 2:15 pm

Ajit #93 Thank you.

Yes, the emergence of quantum motion from completely classical motion might well seem unintuitive, even after you have seen the videos of Couder’s experiments.

No IAD – Instantaneous action at a distance is impossible in Couder’s experiments, even though they faithfully reproduce tunnelling, double-slit diffraction etc. Likewise, on our model, the probability a trajectory passes through (x, t) is just |\psi(x,t)|^2 and so there is no need for any wavefunction collapse, instantaneous or otherwise.

The fluid must be compressible for the very ordinary reason that the speed of sound is theoretically infinite in an incompressible fluid.
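
For reference, the textbook relation behind that remark (a standard formula, not specific to the sonon model): the sound speed is set by the pressure response to compression, so it diverges in the incompressible limit.

```latex
c_s^{2} \;=\; \left(\frac{\partial p}{\partial \rho}\right)_{\!s}
\qquad\Longrightarrow\qquad
c_s \,\to\, \infty
\quad \text{as} \quad \frac{\partial \rho}{\partial p} \,\to\, 0
\ \ \text{(incompressible limit)}
```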

Thanks for the suggestion of a simple paper.

Comment #104 February 7th, 2013 at 2:40 pm

Greg #94 and Slave #95 Many thanks. I accept I did not provide an explicit spin superposition and show how it is measured.

This is now here for the |ud> + |du> spin states of two R11 sonons. I hope it is clear how to do the others from this example.

I am afraid R11 sonons are spin-half and we are not ready to publish with spin-1.

Comment #105 February 7th, 2013 at 2:48 pm

In #103 we see that Dr. Brady (as predicted) still just doesn’t get it. (Trying html tags, let’s see if they work.) Reproducing *some* aspects of quantum phenomena with a local classical model is certainly possible, but reproducing *all* aspects is certainly not. That’s what Bell proved. But it was strongly suspected long before Bell, since all attempts to construct such a model had failed.

Comment #106 February 7th, 2013 at 3:43 pm

Anonymous #99 – Certainly the two-particle Schrödinger equation, with or without spin, does violate Bell’s inequality and other Bell-type inequalities. And certainly one single particle cannot violate any such inequalities in any straightforward way. It also doesn’t matter whether you use photons or electrons. Bell violations are a pervasive phenomenon of quantum probability as it applies to almost any type of joint quantum state.

Comment #107 February 7th, 2013 at 4:27 pm

Greg #106: Actually, you can perfectly well violate a Bell inequality with just a single particle, in the “entangled” occupation-number state |0⟩|1⟩+|1⟩|0⟩! It requires a more subtle measurement, but apparently it’s even been demonstrated experimentally. This is a point that I was long confused about myself, but see for example this delightful paper by van Enk.

Comment #108 February 7th, 2013 at 4:38 pm

On further reflection, Luboš Motl’s comment (#96) provides us with that valuable entity, a Great Truth (namely, a Truth whose opposite also is a Truth):

In addition to the genetic example (of #101), we have also:

• *A mathematical example*: The restriction of elliptic curves to finite fields yields (along with much elegant mathematics) elliptic curve cryptography.

• *A condensed matter/field theory example*: Ken Wilson’s *renormalization group* method systematically replaces (many) microscopic degrees of freedom with (fewer) macroscopic degrees of freedom, so as to usefully make physical sense of (i) phase transitions and (ii) the divergences of field theory.

Thus we appreciate the dual aspects of …

Comment #109 February 7th, 2013 at 6:03 pm

Greg #99: We appear to have a semantics problem. By “the Schrodinger equation”, I meant just the time-evolution equation of QM. Violations of Bell in QM are not dependent on how the system evolves in time. As best I can tell, what you mean by “the Schrodinger equation” includes the entire superstructure of QM (states, observables, Born rule, etc.). I was trying to distinguish this superstructure from the Schrodinger equation itself.

Comment #110 February 7th, 2013 at 6:05 pm

Oops, I meant Greg #106.

Comment #111 February 7th, 2013 at 6:56 pm

Scott – Almost, but I don’t think that you actually can. You can certainly create that state which is entangled in an occupation number basis. However, you will only see non-locality if you apply measurements that have a chance of creating a second particle. So, no dice I think.

In any case, certainly the actual Bell violations are done with two particles in an entangled state, even if you could in principle measure two boxes with entangled occupation.

Comment #112 February 7th, 2013 at 7:51 pm

Greg #111: This paper by Babichev, Appel, and Lvovsky claims to have actually achieved an experimental Bell violation (subject to the usual detection loophole), using a single delocalized photon as the sole entangled resource. (Yes, I’m sure the measurements involve additional particles both on Alice’s end and on Bob’s end, but so what?)

Comment #113 February 7th, 2013 at 11:26 pm

Scott – This is interesting enough that I must keep quiet until I understand it better.

Comment #114 February 8th, 2013 at 1:01 am

Robert #103:

Oh, you are welcome! But…

Let me wrap up, somewhat at a length. (I will sure check back for comments and all, but as far as I am concerned, the wrap up for this thread seems to be fast approaching.)

1. I do think that a part of the problem lies with the way you (+Ross) have written the paper—it covers too much territory, too fast.

For an astonishing prediction setting concrete limits on the number of coherent qubits, prior discussion is so sparse as to be almost absent. I was interested in the 3D case, and so did a word search on “four” in your A+B paper. The only places it appears are the abstract and conclusions! That’s rather like the Copenhagen quantum—it’s there only when measured, at emission (abstract) and absorption (conclusion). … I also dare suggest that you once again check your logic. Chances are very extremely bright^{bright} that the result holds only under a restricted set of auxiliary conditions.

2. BTW, you said (#103) no IAD, but you still didn’t quite directly clarify whether your theory makes predictions that are quantitatively identical to those of the mainstream QM theory, or not. (By mainstream QM, I mean any of the nine+ interpretations or treatments in (students+Dan Styer)’s paper.)

The reason I insist on this part is that I myself have had a preliminary (conference) paper on a new approach to QM (of only photons, so far); my approach in principle leads to a quantitatively different prediction (though I don’t know except in broad outlines how to work out its detailed maths).

3. Coming back to your research: A simpler paper is certainly needed, but also a paper that at least addresses *all* the stages of the quantum evolution in a simple example case, if not also presenting a working C++ simulation for it.

4. Also, I would suggest: In that paper, please make a clean break from Couder et al’s work. It simply confuses people.

It’s obvious that Couder’s work does not reproduce *all* aspects of QM. Even if we assume a simplest model of the universe consisting of just electrons + photons, with the dancing droplets taken to be electrons, it’s obvious that, since the waves induced by the droplets are the force-carriers in this model, they should represent photons. However, in the Couder model, such “photons” are not quantized—they are not localized in space, as the real photons in a single-photon-at-a-time diffraction experiment would show. Naturally, the Couder model is insufficient in terms of how much of a quantum character it can fake. (And that is apart from the very simple question that had struck me as soon as I read about it the very first time, around 2010, the same time it got covered in the MIT News: Who/what vibrates the universe, especially in 3D? My other question was: In 3D, how precisely does a droplet induce waves?)

Now, yours is a different model. It is a “purely” mathematical model, not fully realizable in a *classical* experiment. The classical fluid isn’t inviscid. Qua a mathematical model, it would be possible to overcome the limitations of the Couder model in it. If so, why make a reference to the Couder model at all?

5. Finally, I sense that I might have other issues with your sonon model. I mean some deeply physical issues (not mathematical); e.g., things like: the existence of the singularity at the sonon surface, i.e. the very existence of a sharp boundary surface. The Aristotelian law of the excluded middle entails that a *physical* theory cannot carry singularities; they can only be projected (i.e. imagined) mathematical entities/features, without any physical existence. Or, as Roger Schlafly’s blog highlights: “natura non facit saltus.”

And, I would seek a detailed picture of the *interaction* of an “electron-type” (i.e. the Fermi/matter particle) sonon with a “photon-type” (i.e. the Bose/force-carrier) sonon—including whether, and if yes, precisely how, an electron-sonon absorbs a photon-sonon; what the pair physically looks like after the absorption; what makes the electron-sonon emit the photon-sonon; etc.

6. To (finally!) wind up:

If in theory you take a clean departure from Couder’s model (it can continue to be a part of a motivation section, but little more), supply the correspondence with the postulates, and then if you could also supply a comprehensive account (ideally, with a C++ program) of an elementary but complete case (e.g. double-slit diffraction), apart from addressing issues like the above, I would be very, very happy to read it. And, I am sure, many others would, too. So, kindly keep us posted.

Best,

Ajit

[E&OE]

Comment #115 February 8th, 2013 at 1:09 am

Luboš #96:

My comments about GR’s being a deformation of Newton’s theory were just an example of my broader claim that there doesn’t exist a single precedent in physics in which a working theory would be superseded by a qualitatively different one that reduces the “space of states” of the older theory. This just won’t happen.

Why isn’t the holographic principle, which reduces the naïvely infinite-dimensional Hilbert space of QFT to the finite-dimensional Hilbert space of quantum gravity, a counterexample to the above claim?

Comment #116 February 8th, 2013 at 1:48 am

Dear Scott #115,

the holographic principle isn’t an example because, as you correctly said, the Hilbert space of quantum gravity is infinite-dimensional only “naively”, not according to a working theory.

There hasn’t ever been a working, internally consistent theory of quantum gravity that would have an infinite-dimensional Hilbert space even for finite regions. This is an inconsistent assumption and this situation is different from quantum mechanics of 6 qubits which is an internally consistent – and experimentally tested – theory in physics.

Of course, from some broader viewpoint, namely if you allow some somewhat inconsistent theories into the mix, the holographic principle *is* an example of exactly what I say has no examples. But as I already discussed above, it’s an example that happened and could happen only in some very extreme regime, and it had physical consequences.

The claims of the type “quantum computers aren’t allowed by the right laws of physics” are, on the contrary, claims about a completely non-extreme, low-energy physics that has been tested indefinitely so one can’t find any meaningful inequality that would separate the regime in which QM works as tested and in which it would be replaced by a “smaller” theory.

Cheers

LM

Comment #117 February 8th, 2013 at 6:02 am

One can observe violations of the mathematical inequality we call the Bell inequality with single particles – and this can be used to furnish a single-particle QKD scheme (the correlations existing between state preparation and measurement). The eavesdropping test then amounts to determining whether or not the inequality is violated.

I think it was Boole who showed that if we have 3 random variables A, B and C, then the joint probability P(A,B,C) that correctly reproduces the marginals P(A,B) etc can only be constructed if the marginals satisfy what we call the Bell inequality today.

Any proposed classical model of entanglement must therefore be able to reproduce this ‘non-existence property’ for P(A,B,C) for certain choices of A, B, and C and that’s before we add non-locality into the mix.
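
Boole’s “conditions of possible experience” can be checked mechanically. The sketch below is my own illustration (pure Python; the function name and the specific numbers are invented for the example): for three unbiased ±1-valued random variables, a joint distribution P(A,B,C) reproducing given pairwise correlations exists iff four Bell-type inequalities hold; the modern statement of this equivalence is Fine’s theorem.

```python
import math
from itertools import product

def joint_exists(e_ab, e_ac, e_bc):
    # For three unbiased +/-1 random variables, a joint P(A,B,C)
    # reproducing the pairwise correlations E_AB, E_AC, E_BC exists
    # iff 1 + s1*E_AB + s2*E_AC + s3*E_BC >= 0 for every sign choice
    # with s1*s2*s3 = +1: the facets of the classical correlation
    # polytope, i.e. Boole's "conditions of possible experience".
    return all(1 + s1 * e_ab + s2 * e_ac + s3 * e_bc >= -1e-12
               for s1, s2, s3 in product([1, -1], repeat=3)
               if s1 * s2 * s3 == 1)

# Classically realizable correlations: a joint distribution exists.
print(joint_exists(0.5, 0.5, 0.5))   # True

# Singlet correlations for axes at 0, 60 and 120 degrees, after Bell's
# perfect-anticorrelation step (E_ij = cos of the angle between axes):
# no joint distribution exists, which is the Bell violation.
print(joint_exists(math.cos(math.radians(60)),
                   math.cos(math.radians(120)),
                   math.cos(math.radians(60))))   # False
```

When the function returns True, an explicit joint distribution can in fact be written down; when it returns False, no single classical probability space reproduces the marginals, which is the ‘non-existence property’ described above.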

I was surprised by the A + B paper – Ross is a well-known, and respected, figure in the security community and he’s done some really cool stuff. Readers of this blog will understand what I mean when I implore Ross not to become a Christian!

Comment #118 February 8th, 2013 at 7:39 am

Lubos #96 says:

“There doesn’t exist a single precedent in physics in which a working theory would be superseded by a qualitatively different one that reduces the “space of states” of the older theory.”

Even special relativity and QM could be counterexamples. In Newtonian mechanics, the velocity could be anything; in SR it’s limited to the subset less than the speed of light. In classical mechanics, a harmonic oscillator can have any energy, but in QM only a subset is available, reducing the state space from an uncountable infinity to a countable one.
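
The two examples above, written out explicitly (standard textbook formulas):

```latex
% velocities: Newtonian mechanics -> special relativity
v \in \mathbb{R}^{3}
\quad\longrightarrow\quad
\{\, v \in \mathbb{R}^{3} \;:\; |v| < c \,\}

% harmonic-oscillator energies: classical -> quantum
E \in [0, \infty)
\quad\longrightarrow\quad
E_{n} = \hbar\omega\left(n + \tfrac{1}{2}\right),
\qquad n = 0, 1, 2, \ldots
```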

Comment #119 February 8th, 2013 at 8:18 am

Scott #115

I don’t want to comment on the merits of the Anderson-Brady paper in a site dedicated to its perfunctory dismissal. Where the philosophy of science is concerned, however, Lubos has quite valid arguments.

Even though the holographic principle is not a true theory, it would not be a counterexample to his claim. Just as relativity subsumes Newtonian physics in the limit, holography constrains the physics of quantum mechanics to the limit of physical manifolds. In this respect, at least, it extends relativity in the same context that relativity extended Newtonian mechanics.

Point is, as Lubos implies, there is a backward-forward relation between all physical theories and principles; none exist in a thought-vacuum. This may be a different way of saying that there is something rather than nothing, but it isn’t trivial.

(Hope you can stay out of the weather today in Boston.)

Tom

Comment #120 February 8th, 2013 at 10:17 am

Dear Lou #118,

nope, the same problem with the range of validity affects your other examples, too. One may restrict the validity of non-relativistic mechanics to regimes in which the speeds are smaller than a limit, c, and the validity of classical physics to regimes in which the angular momentum or action or (delta x)*(delta p) are much greater than Planck’s constant.

The point is that these old theories weren’t established as theories for all regimes, however extreme they are. So non-relativistic mechanics fails at high velocities, classical physics fails at tiny angular momentum, actions, or attempted tiny uncertainties of momentum and position, the local theories of gravity fail when one tries to compress too much information (like black hole entropy) into a small region.

But the new theories always confirm the state space in the relevant approximation. That’s different from the claim here, because 5 qubits isn’t extreme in any sense, yet those folks want to claim that basic QM becomes invalid. There isn’t any quantity such as speed, angular momentum, action, products of uncertainties, entropy density per area or anything else that would be extreme in mundane low-energy systems with 5 qubits, so if one claims that QM is wrong, he’s claiming that it’s giving totally wrong predictions everywhere, which it clearly doesn’t.

Don’t tell me you don’t understand what I am saying. In all your conventional examples, the newer theory almost perfectly confirms the older theory’s description of all mundane experiments one may do in the labs. This case is claimed to be different because even the mundane things are claimed to be wrong in the old theory – QM.

Cheers

LM

Comment #121 February 8th, 2013 at 10:57 am

Luboš Motl proposes another Great Truth:

LOL! We appreciate the duality of Luboš’ Great Truth when we reflect (as one example) upon the persistent confusion and controversy — both experimental and theoretical — regarding the quantum Third Law; in particular, quant-ph/0703152 illustrates how subtle these issues can be.

When relativistic gauge field theory enters (as it always does in designing practical experiments), the “non-extreme, low-energy physics” becomes even more subtle. For example: what obstructions have (so far) prevented the theoretical literature from reliably assessing the feasibility of scalable Aaronson/Arkhipov n-photon source/detector systems?

Conclusion: Introductory quantum texts — like Feynman’s *Lectures* and Nielsen and Chuang’s *Quantum Computation and Quantum Information*, for example — commonly skirt certain “completely non-extreme, low-energy physics” theoretical issues … but 21st century experimentalists and engineers are not permitted this luxury!

That is why numerous creative & insightful articles are continuing to extend our still-immature understanding of these “completely non-extreme” quantum physics topics. A long journey toward understanding awaits us … which is good!

Comment #122 February 8th, 2013 at 11:52 am

Lubos #120,

There are two different statements here. The first is that a new theory is bogus if it cannot reproduce the well-known results from the previous theory. On this we agree completely.

The second is that you can tell that a theory is bogus if it reduces the state space of the previous theory. This I do not believe since it is entirely possible that the problem with the old theory is that the state space was too big (bigger than reality). This is exactly what happened with QM – the old theory, with the old state space, gave results that contradicted experiment (there was no ultraviolet catastrophe, and atoms did not radiate until they collapsed). By reducing the state space to quantized values these problems were fixed. Importantly, this restriction did not screw up previous well-verified results using macroscopic objects, which were shown to be a limit of the new theory.

I am in no way defending the new QM theory discussed here – we both agree it’s bogus. However, it’s not bogus just because the new state space is smaller – it’s bogus because it contradicts existing experiments.

Comment #123 February 8th, 2013 at 12:39 pm

Dear Lou #122,

I believe that I have already clarified the statement that one cannot reduce the state space of the previous theory. I am talking about the state space for a particular situation – such as an 8-qubit experiment in a low-energy lab considered here.

In all the historical examples, the space of states was preserved and/or “infinitesimally” deformed or extended by things that are invisible in the everyday situation, and so on. In this 8-qubit case, it’s claimed that the space of states has to be something qualitatively different, which *does* violate the known observations, because the known observations imply laws of physics that inevitably hold for the 8-qubit situation as well – simply because there’s no conceivable variable that would become more extreme in the 8-qubit case and that would invalidate QM in this context while preserving its experimentally tested success in the well-known contexts.

LM

Comment #124 February 8th, 2013 at 2:49 pm

Great Truth (version III)

Even by a generous interpretation, it’s hard (for me) to extract useful lessons from Luboš’ most recent assertion.

The evolution of the concept of *entropy* provides an instructive case history. In classical physics *entropy* is a well-posed geometric entity: the logarithm of the symplectic volume of a level-set. And in quantum physics *entropy* is given as a well-posed algebraic entity: von Neumann’s logarithmic trace. Yet it’s far from self-evident (to me) that the latter entropy is an “infinitesimally deformed” (in Luboš’ phrase) version of the former entropy.

Q1: Are there *any* thermodynamical textbooks that even *attempt* a formal mathematical demonstration that these two definitions of entropy are (for practical purposes) equivalent?

Q2: Are there any texts that provide even a qualitative explanation of why this question hasn’t been easy to answer rigorously?

———–

Conclusion: One lesson of history (as it seems to me) is that Lou Scheffer’s post #122 provides solid common-sense guidance. Thank you, Lou, for that excellent post!

Comment #125 February 9th, 2013 at 1:35 am

Dear John #124,

the reason why you don’t understand that von Neumann entropy is just the quantum deformed version of the log of the volume in the phase space is that you don’t understand basic physics.

The logarithm of the volume in the phase space is just the Shannon entropy from a statistical distribution that is uniform over the volume (and normalized) and the von Neumann entropy is nothing else than the Shannon entropy in which the probability distribution has been uplifted to an operator, just like everything in quantum mechanics. At any rate, the generalization is totally straightforward because the eigenvalues of the density matrix play exactly the same role as the individual values of the classical probability distribution on the phase space.

In both cases, the logarithmic formulae are multiplied by Boltzmann’s constant k, in order to get the entropy that was first extracted in the thermodynamic limit and that has values of order one in situations with macroscopic numbers of degrees of freedom.
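
The claim that the density-matrix eigenvalues play exactly the role of the classical probabilities can be illustrated in a few lines of pure Python (the particular matrix is my own example, chosen only to have off-diagonal coherences):

```python
import math

def shannon(p):
    # Shannon entropy H = -sum_i p_i log p_i (natural log; multiply by
    # Boltzmann's constant k to get thermodynamic units)
    return -sum(x * math.log(x) for x in p if x > 0)

def eigvals_2x2(rho):
    # Eigenvalues of a real symmetric 2x2 matrix via trace/determinant
    tr = rho[0][0] + rho[1][1]
    det = rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]
    d = math.sqrt(tr * tr - 4 * det)
    return [(tr + d) / 2, (tr - d) / 2]

# A qubit density matrix with off-diagonal coherences (illustrative values)
rho = [[0.6, 0.2],
       [0.2, 0.4]]

# von Neumann entropy = Shannon entropy of the eigenvalues ...
S_vn = shannon(eigvals_2x2(rho))
# ... which differs from the Shannon entropy of the diagonal alone:
H_diag = shannon([rho[0][0], rho[1][1]])
# Coherences lower the entropy (S_vn < H_diag); the two coincide exactly
# when rho is diagonal, i.e. in the classical case.
```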

If you don’t understand that all these formulae for entropy are really the same, you should repeat your basic undergraduate courses of statistical mechanics and thermodynamics instead of pretending that you are participating in discussions about cutting-edge physics.

Cheers

LM

Comment #126 February 9th, 2013 at 5:48 am

Wrap-up — Thank you all for your comments. It has been stimulating to interact on a site dedicated to a collective refutation of our papers on quantum computing and the irrotational motion of a compressible inviscid fluid.

My key takeaway: in order to convince the quantum computing community, we need to analyse the symmetries and interactions of two-body and many-body sonon spins, and show explicitly that they obey the statistics in Bell’s original paper. It is not enough simply to observe that the relevant equations are the same as those of Cramer’s and Mead’s models and that therefore they have this property.

And yes, it is reasonable for this community to ask for further progress in that direction. Time for us to roll up our sleeves — if others do not get there before us.

Thank you again

Comment #127 February 9th, 2013 at 9:21 am

“This is exactly what happened with QM – the old theory, with the old state space, gave results that contradicted experiment (there was no ultraviolet catastrophe, and atoms did not radiate until they collapsed). By reducing the state space to quantized values these problems were fixed”

Hold your horses compadre. This is pretty far off the mark. I think it’s pretty clear to anyone that the state space of observables hasn’t really reduced at all. I can arbitrarily boost any reference frame and make any portion of the state space accessible. What QM did was show that there are certain stable time-independent solutions which have a somewhat privileged status. I would argue that there is a limit on our sampling of state space, but state space has shrunk in any sense.

Comment #128 February 9th, 2013 at 10:01 am

Hal Swyers #127: Yes, it’s an ironic feature of QM that it shrunk the “effective” state space of the orbiting electrons, only by dramatically expanding the “true” state space!

Comment #129 February 9th, 2013 at 10:12 am

Has anyone else noticed that Brady is using this as an opportunity to link to his papers as many times as possible to boost his SEO?

Comment #130 February 9th, 2013 at 10:41 am

Thank you for your generous advice, Luboš Motl, which exhibits a commendable verbal vigor! Now let us consider its foundations in mathematics and physics.

As was mentioned earlier (#70), it is regrettably true that a great many natural dynamical systems are excluded from “undergraduate courses of statistical mechanics and thermodynamics.” E.g., when considering the classical-to-quantum *pushforward*, it is natural to ask questions like: What is the Boltzmann/von Neumann/Shannon entropy of a (classical) ideal gas of Chaplygin sleighs interacting by weak potentials? How can such systems be quantized most naturally?

The dual set of quantum-to-classical *pullback* questions is similarly rich, and here — as one example among hundreds — it is a fun exercise to read the titles of Guifre Vidal’s recent articles as provisional answers to wonderful questions: *A real space decoupling transformation for quantum many-body systems* (2012, arXiv:1205.0639); *Entanglement renormalization and gauge symmetry* (2010, arXiv:1007.4145); and *Infinite time-evolving block decimation algorithm beyond unitary evolution* (2010, PRB v78, p155117). Obviously very many more such recent articles could be cited!

The thrust of comment (#70) is that the 20th century undergraduate mathematical curriculum leaves 21st century physics and engineering students ill-prepared to appreciate the burgeoning literature on classical-to-quantum pushforwards, and the dual literature of quantum-to-classical pullbacks … not to mention the literature whose dynamical state-spaces are more-than-Newton/less-than-Hilbert (e.g. the tensor-product state-spaces cited above).

What are the wonderful questions, to which the above-mentioned articles (and hundreds more) are providing us with enticing (but provisional and incomplete) answers? Aye, lasses and laddies, now *that’s* a central question for the 21st century STEM enterprise!

Proposition: It may eventuate that the state-space of string theory — whatever that state-space may be! — is *not* the state-space of Nature. As for whether Nature’s state-space permits fault-tolerant quantum computing (or does not permit it), this issue too is wholly undecided at the present time. Yet in all eventualities, the transformative advances in the capabilities of practical systems engineering — classical, quantum, and the emerging hybridized methods — that are associated with the mathematical naturality of the *pushforward/pullback* methods of string theory, and the mathematical naturality of the *informatic* methods of quantum information theory, are sufficient already to more-than-justify society’s investment in stringy quantum informatic arcanæ.

Comment #131 February 9th, 2013 at 12:59 pm

Scott #128 No doubt! Certainly QM is better endowed than CM, it just like to flaunt the fact! I think future generations will look back and laugh at those advocating for smaller spaces and ask, “Well, how then would we make the cancellations?”

Comment #132 February 9th, 2013 at 1:22 pm

I think “Bell’s Theorem? ‘Tis but a flesh wound!” should be a new category on your blog. (though it would overlap pretty strongly with “Rage Against Doofosity”..)

Comment #133 February 9th, 2013 at 1:37 pm

couple of minor corrections…

in #127

“but state space has shrunk in any sense.”

should read

“but state space has not shrunk in any sense.”

in #131

“it just like to flaunt the fact!”

should read

“it just doesn’t like to flaunt the fact!”

Comment #134 February 9th, 2013 at 2:58 pm

“In order to convince the quantum computing community, we need to analyse the symmetries and interactions of two-body and many-body sonon spins, and show explicitly they obey the statistics in Bell’s original paper.”

Yes, especially if you also retract claims that experimental Bell inequality violations are illusory. Then you would convince the community that you’re not trying to contradict quantum mechanics. You wouldn’t convince anyone that quantum computers can’t work.

Comment #135 February 9th, 2013 at 4:59 pm

Why doesn’t (pen+paper+QM textbook) count as a classical simulation of any QM system?

Comment #136 February 9th, 2013 at 5:34 pm

Amir #132: Thanks very much! I’ve adopted your suggestion.

Comment #137 February 9th, 2013 at 6:04 pm

The key is the efficiency of that simulation.

It is striking that, in all of human history, no-one has *ever* measured an experimental data-set that (under standard complexity-theoretic assumptions) provably cannot be simulated with computational resources that are polynomial in the bit-length of that data-set. Moreover, thanks to ongoing “Moore’s Law” advances in computer hardware and simulation algorithms, the principle “all feasible experiments are efficiently simulable” is nowadays strikingly true even of quantum dynamical systems that formerly were considered to be computationally intractable.

The Skeptic’s Postulate: The empirical simulability of feasible experiments reflects a law of nature that requires either fundamental modifications to QM, or else fundamental modifications to our appreciation of the experimental implications of QM (or both).

The Enthusiast’s Postulate: No fundamental extensions of QM are required: it is necessary only that we be adequately ingenious in designing scalable means of fault-correction in quantum computing and/or scalable means of sourcing/sinking n-photon quantum states in Aaronson-Arkhipov experiments (etc.). The ensuing unsimulable data-sets will experimentally demonstrate that the Skeptic’s Postulate is wrong!

————

The *Shtetl Optimized* comments (so far) show us plainly that the weakest QM-skeptical arguments are comparably unconvincing to the weakest QM-enthusiastic arguments. As for convincingly strong arguments, none of the preceding 135 *Shtetl Optimized* comments has demonstrated (to me) that either the Skeptics or the Enthusiasts have any!

Comment #138 February 9th, 2013 at 6:30 pm

Why do you compare monarchists and segregationists to QM-deniers? The monarchists and segregationists may have been losing the political battle, but it is not a consequence of their political theory that their success is assured, so failure does not invalidate the righteousness of their opinions. This is in contrast with Marxism, which DOES assert its historical inevitability and is therefore falsified by failure. On the contrary, the monarchists can point to the failures of politics in democracies due to the shortsightedness of politicians who only see as far ahead as the next election, and find their theories confirmed by experience of non-monarchism. The case of the segregationists is similar; you can argue against them on moral grounds, but it’s not obvious that experience has disproved their theories.

Comment #139 February 9th, 2013 at 7:24 pm

Joe Shipman #138: You make an interesting point. I suppose my analogy was based on the empirical fact that *most* (though not all) firm believers in a political ideology *also* predict a future where increasing numbers of people will agree with them—or at the least, they don’t predict that their ideology will nearly vanish from the face of the earth. (If they *did* expect that, then being the herd animals most humans are, they’d probably switch ideologies!) Even if they predict a huge “temporary” setback (e.g., losing a war), they typically also predict that far enough in the future, the world will come to see the martyrdom and heroism of their cause.

For this reason, I submit that, while it’s not necessary as a matter of principle, *in fact* most political ideologies are pretty tightly coupled to empirical predictions about the future of humankind: for example, “the world will come to see the rightness of superior races enslaving or exterminating inferior ones.” And many of those predictions have been pretty dramatically falsified. And those falsifications have indeed created huge problems for the modern “ideological descendants” of the people who made the predictions, at least if they care about history at all (many don’t).

You’re right that all of this is *most* obvious in the case of Marxism (or, say, apocalyptic religions), which have included predictions—often falsified ones!—as *explicit* parts of their ideology. (Arguably Nazism also counts, because of its explicit prediction of the “Thousand-Year Reich.”) But my claim is that even the ideologies that don’t include “explicit” predictions—e.g., liberal democracy, segregationism, monarchism—almost always contain “implicit” predictions that are accepted by almost all their adherents, as a major reason for subscribing to the ideology at all.

Comment #140 February 9th, 2013 at 8:49 pm

Scott’s thesis that (quantum skepticism)≡(monarchy) is a Great Analogy … whose opposite therefore assuredly *also* is a Great Analogy. To appreciate this we ask: how many of the founders of quantum mechanics were born under monarchies?

Answer: Six of seven were born citizens of monarchies (Planck, Einstein, Bohr, Heisenberg, Schrödinger, and Dirac). The sole exception is named by the Nobel website as “Prince Louis-Victor Pierre Raymond de Broglie” … a hereditary French title of nobility!

When we remark that David Hilbert was himself born under the reign of Prussian monarch William I, and that Nobel Prizes have always been presented by the reigning Swedish monarch (presently Carl XVI Gustaf!), the conclusion is unassailably evident.

Comments (#44) and (#54) adduce more evidence for the Great Analogy of (quantum enthusiasm)≡(nurture by monarchy), yet surely the historical evidence already cited will suffice to convince any “calm person”!

Comment #141 February 9th, 2013 at 11:34 pm

Scott #139:

Your understanding of conservative intellectual sensibilities and ideology is seriously incomplete. Declinism, fatalism, etc. is the default mode for a good chunk of the right. You can observe this for yourself at the famous end by noting the deep-rooted pessimism of Whittaker Chambers, who was sure he had switched to the losing side. You can observe it at the anonymous end by perusing blog comments on any anti-immigrant or social-conservative site. Excessive wallowing in gloom is actually a perennial vice that the more self-aware conservatives try to police. One reason Reagan made such an impact on the movement is that as a converted FDR Democrat he brought a dose of that optimistic happy-warrior spirit from the other side.

Comment #142 February 10th, 2013 at 3:05 am

srp #141: I’m well-aware of conservatives who wallow in doom-and-gloom prophecies; I even know the sort of fatalistic blog comments on social-conservative sites that you’re talking about. But I thought the whole appeal of a doom-and-gloom prophecy is the idea that, when the apocalypse finally arrives, *the world will see that you were right!*

Comment #143 February 10th, 2013 at 5:54 am

Scott #142:

I’m not sure who the post-apocalyptic audience would be for the I Told You So. The really hard-core types believe in cyclical theories in which The Gods of the Copybook Headings are independently rediscovered after the decline and fall stage. But some do take satisfaction that they will someday be vindicated.

Of course, modern environmentalism has similar “after you’re all boiling/poisoned/missing the pretty biota/genetically mutated/Soylent Green you’ll see I’m right” tendencies. Civil libertarian types sometimes entertain post-police state ITYS fantasies, too. Human nature cuts across ideologies and philosophies.

Comment #144 February 10th, 2013 at 7:11 am

Concrete historical support for Scott’s thesis may be found in two very readable surveys of belief systems: Rabbi Abba Hillel Silver’s scholarly *A History of Messianic Speculation in Israel from the First through the Seventeenth Centuries* (1927) and Martin Gardner’s humorous (yet still scrupulously factual) works that include *Fads and Fallacies in the Name of Science* (1957) and *Urantia: The Great Cult Mystery* (1995).

Recent quantum computing works like the first-edition and second-edition QIST Roadmaps (LA-UR-02-6900, 2002 and LA-UR-04-1778, 2004) surely can be read as science, yet Scott’s proposition builds upon the great tradition of Silver and Gardner, in encouraging us to read the QIST Roadmaps *also* as technological prophecy founded upon belief in the absolute physical reality of Dirac/Hilbert state-spaces.

Is there an element of ideological/Messianic fervor among the most ardent enthusiasts for quantum computing? A faith sufficiently strong that the evident shortfall of the QIST timelines cannot shake it? What are the consequences of subjecting faith to the trials of science? If it happens that the messiah of FTQC tarries, for how many generations should physicists retain their devout faith in the inerrant scripture of Hilbert and Dirac? And in particular, why are the strongest defenses of orthodoxy so commonly lacking in humility and humor? These are the trans-disciplinary questions that first Silver’s essays, then Gardner’s essays, and now Scott’s essays, encourage us to ask.

In Silver’s history we read Rabbi Jonathan’s “Perish all those who calculate the end, for men will say, since the predicted end is here and the Messiah has not come, he will never come”, and to make the same point more positively (and subtly), there is Maimonides’ creed “I believe with a full heart in the coming of the Messiah, and even though he may tarry, I will wait for him on any day that he may come!”

Does Maimonides’ creed apply to the QIST Roadmaps? Should it? These are terrific questions!

Comment #145 February 10th, 2013 at 9:04 am

scott #142, referencing srp #142

>> the whole appeal of a doom-and-gloom prophecy

I think the appeal is to ‘know’ that everything other people are doing is futile and foolish.

The best example imho is zerohedge.com – predicting financial doom-and-gloom since 2009 (when the financial crisis hit bottom).

Their followers witness other people (fund managers etc.) make lots of money (and they follow this in great detail), but they ‘know’ that in the end all will be lost, which explains why they are such a popular website.

Comment #146 February 10th, 2013 at 10:00 am

“scott #142, referencing srp #142”

Sorry, fixed

Comment #147 February 10th, 2013 at 1:29 pm

Oh no! Now you’re saying that you won’t be able to have the quantum and anti-quantum fanatics fight in a gladiator arena and cancel out each other because they’re on the same side?

Comment #148 February 10th, 2013 at 3:08 pm

Come to the Dark Side, we have cookies!

Comment #149 February 11th, 2013 at 12:13 pm

Excellent!

Aside: the fabled *Quote Investigator* has researched the fascinating origins and evolution of this fine saying. Yet on the other hand we have:

Comment #150 February 12th, 2013 at 5:28 pm

John Sidles, are you a currently practicing physicist?

Comment #151 February 13th, 2013 at 12:11 am

For the answer to Scott’s question to be “no”, then all forty-six of Gil Kalai’s tabulated objections to quantum computing (beginning *here* and ending *here*) would have to be *mutually exclusive* … which on probabilistic grounds alone, scarcely seems likely!

Gil’s list is wonderfully compatible (as it seems to me) with celebrated passages by Donald Knuth and Derek de Solla Price:

These passages motivate us to regard Gil Kalai’s forty-six skeptical avenues as intertwining paths — some paths surely more promising than others … but which? — that lead generally toward the conception (in de Solla Price’s idiom) of a 21st century “synthetic revelation” that extends “the explicandum of quantum dynamics” with sufficient rigor that (in Knuth’s idiom) “the present Art of quantum dynamical simulation becomes a Science.”

The Knuth/Price considerations lead us to reflect that perhaps it scarcely matters — for the next few decades anyway — whether the quantum dynamical state-space of Nature is *absolutely* non-Hilbert/Dirac versus *effectively* non-Hilbert/Dirac (to answer Scott’s question by citing two distinct-yet-compatible grounds for quantum computing skepticism).

In either eventuality scalable quantum computing is infeasible (or is it?) … and yet many other capabilities — that are similarly wonderful, and possibly are more strategically important than quantum computing for a crowded overheating planet — are associated to various subsets of Gil Kalai’s forty-six skeptical possibilities.

Conclusion: It is a truth universally acknowledged that various combinations of Gil Kalai’s forty-six skeptical avenues represent eminently *hopeful* paths for the future of quantum research.

Comment #152 February 13th, 2013 at 4:52 pm

Such questions presuppose that there exists a distinct boundary between science and engineering, and yet this boundary isn’t easy to specify: did von Neumann write to Wiener (in 1946) as a fellow-scientist or as a fellow-engineer?

To the extent that 21st century scientists and engineers are advancing toward increasingly overlapping objectives, that are described by increasingly overlapping mathematical languages (and are pursued by increasingly overlapping enterprises), perhaps the science-versus-engineering question — for many (most? all?) quantum researchers — is amenable to the same answer as was recounted by Lyndon Johnson, who told of a student teacher who was asked in a Texas job interview: “Do you teach evolution the Bible way or the Darwin way?” … to which the eager job-seeker answered: “I can teach it *either* way!”

Conclusion: Strict-constructionists can reasonably argue that workers who are largely or entirely confident that the state-space of Nature is rigorously Hilbert/Dirac should self-describe as engineers. John Bell was among the first to embrace this practice (in 1983) … there have since been many more.

Comment #153 February 16th, 2013 at 4:03 am

Scott: what is the point of critiquing and debating quantum skeptics with whom you disagree so fundamentally? We sometimes think that debate will lead to greater understanding (when, for instance, your opponent is Gil Kalai), but you don’t think anything like that of a lot of these crackpots. So work towards the quantum computer. When you can crack RSA codes, these skeptics will look like the Flat Earth Society.

Comment #154 February 16th, 2013 at 1:16 pm

Sam H.,

I think you have missed the point.

SO is largely reports from Scott as he tries to figure out the landscape on the QC frontier. Debating fine points with John, Greg, Gil, Lubos, and other insightful participants is no doubt a fun part of the exercise.

Scott also takes on his share and more of refuting doofosity. Many avoid this chore, which is rather like pushing a garbage truck up a hill. Scott appears to be about 50/50 on sparring with QC skeptics and QC true believers.

That is usually a good place to be in a debate, when both sides attack you.

Comment #155 February 17th, 2013 at 11:11 pm

No blog posts for 2 weeks?! I keep checking but no luck.

Comment #156 February 18th, 2013 at 10:54 am

“*The lamps have been going out*”, not only here at *Shtetl-Optimized*, but throughout the quantum blogosphere.

Has the 20th century’s *explicandum* of quantum information theory become too narrow to sustain viable 21st century STEM enterprises? Are the associated quantum *explicanda* insufficiently inspiring and/or enabling and/or natural?

The remedies to these maladies are well-known: lively articles, enabling technologies, novel *explicanda*, new mathematical frameworks, and provocative posts and comments!

Comment #157 February 18th, 2013 at 7:59 pm

Note that Brady didn’t actually answer whether factoring 17*19 would invalidate his model or why he spent so much time denying the validity of the numerous experiments demonstrating Bell Inequality violations if he accepts their results.

Comment #158 February 20th, 2013 at 6:34 pm

hi scott, you retrograde luddite you. think you’re spectacularly on the wrong side of this issue, but it could take decades to prove it. however, there are some recent physics/cosmology results by Beane that arguably tie into this that you should be aware of. do you have any comment on those elsewhere?

hot off the press, a rebuttal that cites ‘t hooft, wolfram, cellular automata, solitons

Comment #159 February 21st, 2013 at 10:26 am

[...] Link. I missed this somehow. He left smoking cinders. Miss his writing, he’s spending too much time with his baby. (that was a joke) [...]

Comment #160 February 21st, 2013 at 11:56 am

Bram #157: If our model is correct then it will be an order of magnitude harder to factor numbers like 17*19 in a simple geometry. And another order of magnitude harder with larger numbers, and so on. I hope the reason for this is clear from the papers.

It is an experimental fact that there are violations of the Bell inequalities. This is usually interpreted to tell us that particles have a delocalised character. As you will know, it is common in fluid dynamics for there to be structures with a well-defined position whose energy is delocalised. A vortex is one well-studied example, and the sonon quasiparticles are another.

In this blog it appears to be assumed that quasiparticles in fluid dynamical systems cannot violate the Bell inequalities, even though their energy is delocalised. I should very much like to understand why this is believed. Could any of the contributors to this blog provide a reference? Thank you.

Comment #161 February 21st, 2013 at 1:14 pm

Another voice is heard:

There is a discussion of Weinberg’s view on the *Quantum Frontiers* weblog that (as it seems to me) deserves more comments than it is receiving.

Comment #162 February 22nd, 2013 at 3:07 pm

yet another reason not to want a world with

a small limited number of qm-proper particles

http://www.smbc-comics.com/comics/20130222.gif

Comment #163 February 24th, 2013 at 3:45 pm

Thomas Vidick’s weblog *MyCQstate* is presently hosting an outstanding essay/survey — written by Thomas himself — that is titled *A Quantum PCP Theorem?*

That the Anderson/Brady preprint has received 150+ comments, while Thomas’ excellent essay has (thus far) received no attention at all, is an imbalance that (as it seems to me) we can all help to remedy.

Comment #164 February 24th, 2013 at 6:22 pm

The semi-regular announcements of QC breakthroughs recounted in ScienceDaily got off to an early start this week:

http://www.sciencedaily.com/releases/2013/02/130224142829.htm

One thing that makes this week’s alleged breakthrough unusual is that there are some actual “QM looking” equations given in a figure. But — they are blurry so you can’t quite see what they are trying to say. If you click the “enlarge” button, the equations get bigger, but they are still blurry. Could this be a deep metaphor?

Comment #165 February 26th, 2013 at 5:23 am

Robert #160: I don’t pretend to be an expert on sonons, but one of your presentations suggests that they are “incapable of quantum collapse”, so in particular the measurement postulate does not hold and it’s not clear that sonons can appear in a superposition state, let alone exhibit entanglement. As such, they don’t appear “quantum” so would not be expected to violate a Bell’s inequality.

And I emphatically disagree that such violations are interpreted as showing the particles have a “delocalized character”. All they show is that quantum mechanics is incompatible with local realism. Big difference.

Comment #166 February 28th, 2013 at 8:48 am

Dear Prof. Aaronson, for sure, nobody will ever find that the “whole framework of exponentially-large Hilbert space was completely superfluous”!

However, I would like to ask you (and to the readers) your idea about the fact that BQP is a subset of PSPACE. It seems that something huge is needed, but not necessarily a “large Hilbert space”: a very long calculation time can do the work. Do you think that this can say something fundamental on what QM is? Not an easy equation in a huge Hilbert space but an extremely difficult problem in a smaller space?
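[Editorial aside: Doriano’s intuition — that a very long calculation in a small space can substitute for storing an exponentially large Hilbert-space vector — is exactly how the textbook proof of BQP⊆PSPACE works: a circuit amplitude is evaluated as a Feynman sum over computational-basis paths, recursively, never holding a full state vector. A minimal sketch of the idea (my own illustration, not from this thread; the gate encoding and function names are invented for the example):]

```python
from math import sqrt

# Hadamard gate as a plain 2x2 matrix (rows indexed by output bit)
H = [[1 / sqrt(2), 1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]

def amplitude(circuit, x, y):
    """Evaluate <y| G_d ... G_1 |x> by a depth-first Feynman path sum.

    Memory is O(depth): no 2^n-entry state vector is ever stored,
    while time is exponential -- the BQP <= PSPACE trade-off.
    Gates: ('1q', U, q) applies 2x2 matrix U to qubit q;
           ('cnot', c, t) is a CNOT with control c, target t.
    Basis states x, y are tuples of bits.
    """
    def amp(layer, out):
        if layer == 0:
            return 1.0 if out == x else 0.0
        gate = circuit[layer - 1]
        if gate[0] == '1q':
            _, U, q = gate
            # Sum over the two possible values of qubit q before the gate
            return sum(U[out[q]][b] * amp(layer - 1, out[:q] + (b,) + out[q+1:])
                       for b in (0, 1))
        else:
            # CNOT permutes basis states, so there is a unique predecessor
            _, c, t = gate
            pre = list(out)
            if pre[c] == 1:
                pre[t] ^= 1
            return amp(layer - 1, tuple(pre))
    return amp(len(circuit), y)

# Bell-pair preparation: H on qubit 0, then CNOT(0 -> 1)
circ = [('1q', H, 0), ('cnot', 0, 1)]
print(amplitude(circ, (0, 0), (0, 0)))  # 0.7071... = 1/sqrt(2)
print(amplitude(circ, (0, 0), (0, 1)))  # 0.0
```

Each extra gate multiplies the running time rather than the memory — which is why this trick proves containment in PSPACE but gives no efficient simulation.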

Comment #167 February 28th, 2013 at 11:34 am

Brady #160 writes: “If our model is correct then it will be an order of magnitude harder to factor numbers like 17*19 in a simple geometry. And another order of magnitude harder with larger numbers, and so on.”

By ‘order of magnitude’ do you mean factor of 2 or factor of 10? By ‘harder’ do you mean more energy, or more precision, or something else? By ‘larger’ do you mean twice as large, or above some threshold, or what?

“In this blog it appears to be assumed that quasiparticles in fluid dynamical systems cannot violate the Bell inequalities, even though their energy is delocalised. I should very much like to understand why this is believed.”

Fluid dynamical systems are based entirely on local phenomena. See every paper on simulating them ever written. Anything which vaguely looks like action at a distance will have to operate by going through the intervening materials, and the speed of propagation of the effects will be limited by the speed of light.

Comment #168 March 1st, 2013 at 5:29 pm

In token of respect and gratitude for recent quantum-informatic blogosphere posts by Gil Kalai, Aram Harrow, Thomas Vidick, John Preskill, and the Aaronson/Arkhipov collaboration (and other folks too!), I have posted to *Gödel’s Lost Letter* (what attempts to be) a *unitary appreciation of their various perspectives* in relation to the 2014 Simons Institute *Quantum Hamiltonian Complexity* workshop … which looks like it will be a *terrific* workshop!

Comment #169 March 4th, 2013 at 3:32 am

Comments?

http://www.sciencerecorder.com/news/scientists-discover-a-way-around-heisenbergs-uncertainty-principle/

Scientists discover a way around Heisenberg’s Uncertainty Principle

According to a pair of scientists from the University of Rochester and the University of Ottawa, there may be a way around Heisenberg’s famous Uncertainty Principle.

According to a report published this week in Nature Photonics, a recently developed technique that allows scientists to directly measure the polarization states of light could be the key. The direct measurement technique, developed in 2011, allows scientists to measure the wavefunction – a way of determining the state of a quantum system.

The pair of scientists say the new technique relies on a “trick” that measures the first property of a system, leaving the remaining properties untouched. The careful measurement relies on the “weak measurement” of the first property followed by a “strong measurement” of the second property, the pair writes in the report.

Comment #170 March 12th, 2013 at 7:48 pm

Bram #167 I appreciate your questions and comments.

Yes, Euler’s equation contains only local interactions. Nevertheless, the energy and angular momentum of a vortex are delocalised in the fluid. I believe this means a vortex has at least some properties which are not localised at the core. A sonon has the same delocalised properties.

Would you be convinced by an explicit proof of the spin correlation in Bell’s original paper (which he shows violates his inequality)?

I am afraid I can’t quantify how ‘hard’ it would be to break the current experimental glass ceiling. You would have to get a single particle to lose coherence with system A, fall into coherence with B, revert to A and so on, with the net effect that it remains in coherence with both. I think this would be exceedingly difficult, but it is at least mathematically conceivable. If someone can achieve it they might be able to break the glass ceiling. If it’s not clear why, the presentation (on my web site) might be helpful.

Comment #171 March 13th, 2013 at 6:40 am

Rahul #169: Weak measurement is a decades-old idea. And it doesn’t in any way, shape, or form violate the Uncertainty Principle (as nothing can, without violating QM itself—in which case you would’ve heard about it!). In the case of weak measurement, the “catch” (i.e., the one crucial fact popular articles never tell you) is that you need an ensemble of many copies of the system to implement the measurement.

YAWN … next!

Comment #172 March 13th, 2013 at 6:51 am

Doriano #166:

I would like to ask you (and to the readers) your idea about the fact that BQP is a subset of PSPACE. It seems that something huge is needed, but not necessarily a “large Hilbert space”: a very long calculation time can do the work. Do you think that this can say something fundamental on what QM is? Not an easy equation in a huge Hilbert space but an extremely difficult problem in a smaller space?

Actually yes, I’ve been telling people for a while that BQP⊆PSPACE is a deep and underappreciated fact about the foundations of quantum mechanics! (One of my laugh lines is that Feynman won the Nobel Prize in physics basically for pointing out that BQP⊆P^{#P}⊆PSPACE—i.e., that you can organize QFT calculations as a giant sum rather than keeping a whole wavefunction in memory.)

On the other hand, I don’t see this as a challenge to the Hilbert space formalism, but as a *property* of the formalism: a property of “modesty,” if you like. We never observe a naked state vector in the wild; we only ever observe the outcomes of measurements. And if you only care about predicting the outcomes of measurements specified in advance, you can ditch the notion of “states” almost entirely, and organize your calculations in a more efficient way (just how *much* more efficient being an active research topic). But as soon as you ask for the “state” of the system — i.e., for an object sufficient to probabilistically predict the outcome of any *possible* measurement that could be made in the future — the exponential character of Hilbert space comes roaring back.

Comment #173 March 14th, 2013 at 7:08 am

Unless the dynamical system couples to a continuum of vacuum states, or (equivalently?) a thermal bath, or (equivalently??) is a product-state pullback. For some reason (*yet what might that reason be?*) Nature *requires* that both her external reality and human laboratory experiments respect these coupling-to-continuum constraints. That’s why it’s been heartening in recent years (for us system engineers) to witness the gradual weakening of theoretical faith in the absolute reality of unitary evolution on finite-dimensional Hilbert spaces!

Comment #174 March 15th, 2013 at 4:28 pm

Brady, a vortex is not a particle, it’s a phenomenon across a whole area, like how a sound is a pattern of pressure or a differential in temperature is a potential energy source. The point of intersection of the two blades of a scissors can move forward faster than the speed of light, but that isn’t a violation of the speed of light limit, because that point isn’t a particle, it’s a phenomenon which changes what particles it’s talking about over time.

Comment #175 April 3rd, 2013 at 4:34 pm

Bram #174. Yes. Well put. A vortex is able to escape Bell’s inequality because it is a phenomenon across a whole area. But it has a duality. An ideal vortex is completely characterised by its central position and circulation, and so it can be (and, in fluid mechanics, is) treated like a 2-D particle. Sonons have the same duality, in 3D.

To settle this, would you accept an explicit demonstration that sonon quasiparticles have spin-half symmetry and behave precisely like the quantum mechanical particles analysed in Bell’s original paper, including violating Bell’s inequality?

Comment #176 November 17th, 2013 at 7:48 pm

“Bell’s Theorem? Just a flesh wound!”

Scott. You seem like a smart guy who knows his way around QM, so I have a few questions about the “Bell’s theorem” proof.

Bell’s main criticism of von Neumann’s no-go theorem is as follows:

“The essential assumption can be criticized as follows. At first sight the required additivity of expectation values seems very reasonable, and it is rather the nonadditivity of allowed values (eigenvalues) which requires explanation. Of course the explanation is well known: A measurement of a sum of noncommuting observables cannot be made by combining trivially the results of separate observations on the two terms — it requires a quite distinct experiment.”

Yet in every “proof” of Bell’s theorem I’ve come across, expectation values from QM are simply combined linearly in an inequality expression (which is valid, BTW) to claim violation. So when Bell wrote, in his argument against von Neumann, that:

“It was not the objective measurable predictions of quantum mechanics which ruled out hidden variables. It was the arbitrary assumption of a particular (and impossible) relation between the results of incompatible measurements either of which might be made on a given occasion but only one of which can in fact be made.”

Why is this not also a criticism of Bell’s own theorem? How can Bell’s theorem be valid if the proof relies on a linear combination of expectation values of incompatible measurements, contrary to the principles of QM?
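[Editorial aside: the standard CHSH form of Bell’s argument — the one the commenter is questioning — can be checked numerically in a few lines. This sketch (mine, not from the thread) uses the textbook singlet correlation E(a,b) = −cos(a−b) and compares the quantum CHSH value against an exhaustive enumeration of deterministic local hidden-variable assignments:]

```python
from math import cos, pi, sqrt
from itertools import product

# Quantum correlation for spin measurements at angles a, b on a singlet pair
def E(a, b):
    return -cos(a - b)

# Angles that maximize the CHSH violation
a1, a2 = 0.0, pi / 2
b1, b2 = pi / 4, 3 * pi / 4

# CHSH combination: quantum mechanics predicts |S| = 2*sqrt(2)
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2.828... = 2*sqrt(2)

# Any deterministic local assignment of +/-1 outcomes to the four
# settings satisfies |A1*B1 - A1*B2 + A2*B1 + A2*B2| <= 2
lhv_max = max(abs(A1 * B1 - A1 * B2 + A2 * B1 + A2 * B2)
              for A1, A2, B1, B2 in product((-1, 1), repeat=4))
print(lhv_max)  # 2
```

Note the contrast with von Neumann’s assumption: the CHSH bound is derived from hidden variables that assign all four outcomes simultaneously in each run, and only the separately measured averages are combined afterwards — no single run ever requires incompatible measurements.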