Scott is correct, both as a general observation and as a blog-moderation policy. I do tend to slip into what others see as name-dropping. It does not matter that I see it as attempting to pass on the previous unwritten oral history of an unlikely but actual number of great minds that I have had the chutzpah to have gotten to know personally. I apologize.

Re: #45

John Sidles asks a fascinating question. I have little to contribute, because I am still working through Terry Tao’s blog explanations of why Navier-Stokes is so gosh-darned hard.

Except maybe to comment that all turbulent fluids ARE performing computations, by the unproven inclusive definition of Stephen Wolfram in A New Kind of Science.

The Pentagon way would be to have a “fly-off” between a billion-dollar fluidic computer and a billion-dollar quantum computer, and award an astronomically large contract to the contractor team that built the winning demo.

Also, to note that quantum liquids don’t seem to have a Navier-Stokes problem. See the overview by A. J. Leggett, Science 319, 1203-1204 (29 Feb 2008).

If the same question were asked about *fluid* mechanics then it would be reasonably clear how to answer. Everyone knows that fluids are fundamentally made out of atoms … but it is not practical to track every atom … and so some form of model order reduction is required … a plethora of techniques have been developed to achieve this … these techniques include PDEs (like Navier-Stokes), finite element methods, cellular automata … the list is immensely long … the ingenuity of fluid dynamical researchers is unbounded … and the resulting creative opportunities in math, science, and engineering are ever-expanding.
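The model-order-reduction idea is easy to see in miniature. Here is a minimal sketch of my own (not anything from the literature above; grid size, diffusion coefficient, and step count are arbitrary illustrative choices) in which a handful of grid values stand in for atom-level detail, via an explicit finite-difference diffusion update:

```python
import numpy as np

# Minimal sketch: a reduced-order "fluid" as values on a 1D grid.
# The grid values stand in for ~10^23 atoms; d and steps are illustrative.
def diffuse(u, d=0.2, steps=100):
    u = u.astype(float).copy()
    for _ in range(steps):
        # explicit Euler update of du/dt = d * d2u/dx2, periodic boundary
        u += d * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))
    return u

spike = np.zeros(64)
spike[32] = 1.0          # initial concentration spike
smeared = diffuse(spike)
# the update conserves total "mass" and smears the spike out
```

The point is only that the reduced description (64 numbers) evolves by a local rule, exactly the structural move that PDE discretizations and cellular automata share.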

So we can ask, how is *quantum* mechanics different from fluid mechanics? Here we may hope to elicit some enjoyable differences of opinion! 🙂

To keep it short, here are three comparative assertions about fluid-versus-quantum mechanical systems … and I will argue that all three can be plausibly rejected.

RESOLVED: Fluid systems cannot carry out computations, but quantum systems can.

Here I would argue the negative, on the heuristic grounds that proving that turbulent Navier-Stokes flows are not carrying out (encoded) computations would be a big step toward winning the third Clay Millennium Prize! It is true that *most* fluid systems are not carrying out computations … and precisely the same is true of *most* quantum systems.

RESOLVED: Fluid systems have a natural reduced-order state-space (namely, finite fluid elements), but quantum systems don’t.

Here I would again argue the negative. The practical experience of the larger science and technology community is that multilinear algebraic varieties provide a natural reduced-order state-space for quantum systems (and I refer folks to the highly readable articles of Gregory Beylkin and Martin Mohlenkamp on this subject). Pretty much every quantum research discipline since the 1920s has used multilinear varieties as a preferred state-space, but it seems that no two communities call them by the same names.
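As a toy illustration of such a reduced-order state-space (my own sketch, not the Beylkin-Mohlenkamp machinery; the test state and Schmidt rank are arbitrary), one can compress a bipartite pure state by truncating its Schmidt (SVD) decomposition:

```python
import numpy as np

# Sketch: low-rank (Schmidt/SVD) compression of a bipartite quantum state.
# Keeping few Schmidt terms is a miniature "multilinear variety" ansatz.
def truncate_state(psi, dimA, dimB, rank):
    """Keep only the largest `rank` Schmidt coefficients of psi."""
    m = psi.reshape(dimA, dimB)
    u, s, vt = np.linalg.svd(m, full_matrices=False)
    s_trunc = np.zeros_like(s)
    s_trunc[:rank] = s[:rank]          # zero out the small coefficients
    approx = (u * s_trunc) @ vt
    return (approx / np.linalg.norm(approx)).reshape(-1)  # renormalize

# a weakly entangled 2-qubit state is well approximated at Schmidt rank 1
psi = np.array([1.0, 0.0, 0.0, 0.1])
psi /= np.linalg.norm(psi)
psi1 = truncate_state(psi, 2, 2, rank=1)
fidelity = abs(np.vdot(psi, psi1)) ** 2   # close to 1 for weak entanglement
```

The compressed representation stores `rank * (dimA + dimB)` numbers instead of `dimA * dimB`, which is where the practical savings come from.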

RESOLVED: Generic fluid mechanical systems can be simulated in polynomial space and time, but generic quantum mechanical systems can’t.

Here for the third time I would argue the negative … provided “generic” is understood to mean “noisy and/or open.” Based on the preceding two principles, the key to practical quantum simulation efficiency evidently is to choose a noise (and/or measurement) unravelling that allows the simulation to use compressed state-representations (like the above algebraic varieties) as the fundamental computational elements.
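A minimal sketch of what such a noise unravelling looks like (a standard quantum-jump toy model of my own choosing, not any specific method from the literature; gamma, the time step, and the trajectory count are illustrative): pure dephasing of one qubit, simulated with pure-state trajectories whose average reproduces the density-matrix coherence decay exp(-2*gamma*t).

```python
import numpy as np

rng = np.random.default_rng(0)

# Quantum-jump unravelling of pure dephasing, jump operator sqrt(gamma)*sigma_z.
# Each trajectory is a pure state (2 complex amplitudes) rather than a 2x2
# density matrix -- the "compressed representation" idea in miniature.
gamma, dt, steps, ntraj = 1.0, 0.01, 200, 500
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
psi0 = np.array([1.0, 1.0]) / np.sqrt(2.0)   # |+> state, <sigma_x> = 1

sx_vals = np.zeros(ntraj)
for k in range(ntraj):
    psi = psi0.copy()
    for _ in range(steps):
        # for this jump operator the jump probability is state-independent
        # (sigma_z^dag sigma_z = identity), and the no-jump step is trivial
        if rng.random() < gamma * dt:
            psi = sz @ psi                    # phase-flip jump
    sx_vals[k] = 2.0 * (np.conj(psi[0]) * psi[1]).real

mean_coherence = sx_vals.mean()
# master-equation prediction: exp(-2*gamma*T) with T = steps*dt = 2.0
```

For one qubit the savings are trivial, but for n qubits the trajectory stores 2^n amplitudes versus 4^n density-matrix entries, and a well-chosen unravelling can keep each trajectory in a compressed (e.g. low-rank) form throughout.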

So the direct answer to your question, Michael, is that with the proper choice of formalism, quantum systems are generically *easy* to simulate (meaning, not particularly harder than fluid systems … or 3D+time general relativity … which in practice means “mighty tough” to simulate).

My main regret in the above assertions is that the resulting overall picture is wholly consistent with the quantum orthodoxy of textbooks like (e.g.) Nielsen and Chuang.

But fortunately there are plenty of non-orthodox questions left over to speculate about. For example, we know that atoms are the building blocks of fluid mechanics … but do we really *know* that (linear) Hilbert spaces are the building blocks of quantum mechanics?

John,

You said that “[i]nterpretations of quantum mechanics vary hugely in their practical simulation efficiency . . .” Could you please explain in simple terms 🙂 what you’re referring to? Thanks.

Personally, I’ve never really understood what induces people to stay on the fence about these things even after evidence has piled up for a decade or more. Why *shouldn’t* there be another type of matter, or a vacuum energy? Both not only fit the data, but are perfectly compatible with everything else we know about the laws of physics (unlike many alternative ideas that people seem willing to entertain).

Following the Announcement of Opportunity in late 2008, the science investigation will be selected in 2009, with a project start that same year. The launch of JDEM is planned for 2014-2015, and a nominal science operations lifetime of three years is assumed, although not required.

I confirmed when he phoned me yesterday (on an unrelated matter) that Michael Salamon (NASA HQ / SMD / Astrophysics) is in charge of the mission on the NASA side. But given that he was recently heard on National Public Radio and quoted in the Science section of the New York Times, I’d prefer not to say anything that could be politically uncomfortable for him and his colleagues.

Until that data pours in, let a thousand theoretical flowers bloom.

Re: DMDEI (dark matter, dark energy, inflation)

I hear what you are saying, but these might prove to be “I can calculate, therefore it happened” bandwagons. Careful modeling of galactic dynamics and cosmic evolution produces results at odds with observation. The only thing certain is that something is wrong. Maybe we don’t understand the physics; maybe we are missing a major player, whatever. The natural response is to postulate a hidden ingredient (particle, force, phase transition, whatever) that fixes things. In some cases parameter fiddling produces a prospect (DM, DE, I, …) that is consistent with observations. The fun part is figuring out which are real and which are artifices. Early adopters think the contest is over, but at this point direct evidence is rather scant. While certainly not a DMDEI expert, I can summarize a skeptical viewpoint.

Dark matter. Under GR, the mass we see is insufficient for the observed dynamics of galaxies and galaxy clusters. The simplest explanation is undetectable massive particles that make things add up. Other horses in the race: corrections to GR that don’t show up at solar-system scale, lots of mini black holes and/or Jupiters floating about, etc. Evidence is growing that DM actually exists; gravitational-lensing analysis of background galaxies seems to show DM outrunning the light matter in cluster collisions. Odds for DM: pretty good.

Dark energy. Measuring the change in the rate of expansion of the universe is hardly Lab 101. A long chain of models, calculations, and simplifications (among the more suspect: “wouldn’t it be nice if all the bright supernovas were the same”) suggests a total rethinking of cosmology. On one hand, there does seem to be a fairly robust signal. On the other hand, astronomy.com reports a biggest-yet hypernova, gamma-ray burst, or whatever, about once a month, so the key “uniform supernovas” ansatz might prove to be wishful thinking. Odds for DE: maybe 50/50.

Inflation. Integrating the differential equations of cosmic evolution backwards from (t = today, what you see) to (t = 0, whatever) runs into problems. Likewise if you integrate forward from (t = 0, some guess) to (t = today, what you see). Not a surprise, given that GR is way singular at infinite density and no one has any idea about the physics in this parameter range. Anyway, this integration uncovers the “smoothness problem” (the universe today appears too smooth for certain guesses about the smoothness near t = 0), among others. When a differential equation develops problems at singularities, one looks for an ad hoc band-aid to patch things up. This works fine for shock waves, etc. The inflation phase transition allows one to start an integration slightly closer to t = 0 and finesse the smoothness gap. Odds that inflation actually happened: anyone’s guess; sounds like wishful thinking.
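The kind of forward integration being described can be sketched in a few lines (a flat FRW toy model of my own; the density parameters are illustrative, and this integrates only the Friedmann equation, not the full coupled system):

```python
import numpy as np

# Sketch: age of a flat FRW universe from the Friedmann equation
#   (da/dt / a)^2 = H0^2 * (Om / a^3 + OL),
# integrating dt/da = 1 / (a * H(a)) from a ~ 0 up to a = 1 (today).
# Times are in units of 1/H0; Om and OL are illustrative parameters.
def age_of_universe(Om, OL, n=200_000):
    a = np.linspace(1e-6, 1.0, n)
    dtda = 1.0 / (a * np.sqrt(Om / a**3 + OL))
    # trapezoid rule by hand (np.trapz was removed in NumPy 2.0)
    return float(np.sum((dtda[1:] + dtda[:-1]) * np.diff(a)) / 2.0)

t_matter = age_of_universe(Om=1.0, OL=0.0)   # analytic answer: 2/3
t_lcdm = age_of_universe(Om=0.3, OL=0.7)     # dark energy -> older universe
```

Note the integrand blows up as a → 0, which is the numerical face of the singularity problem mentioned above: the closer to t = 0 you start, the more the answer depends on physics nobody knows.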

Postscript. While enjoying a few beers with David Schramm about 20 years ago, I asked if he actually believed in inflation, or just enjoyed the theorizing. This led to a discussion of how easy it is to believe in whatever you are working on. He said he thought inflation was more likely than not.

“The theorems of Kolmogorov-Chaitin complexity guarantee that not just quantum systems, but *any* system that encompasses an exponentially large number of measured outcomes (loosely speaking), necessarily will encompass randomness in those measured outcomes.”

And major remorse for not posing the debatable topic:

RESOLVED: The description of quantum measurement in terms of von Neumann-style projection operators should be abolished from undergraduate courses in quantum mechanics.

I’m happy to argue the affirmative … and hopefully this respects Scott’s request that we “work with him!” 🙂

With regard to non-commutative probability, it’s on my list of things to learn, but not as high on that list as Scott’s quantum learnability theorems … but the plain fact is that at present I’m pretty completely ignorant of both.

When it comes to quantum randomness, it seems to me that the ideas of Kolmogorov-Chaitin complexity are reasonably satisfactory.

E.g., the interferometer in our MRFM experiments delivers a binary stream of bits at about 10^12 bits/second. Heck, *of course* these data records are incompressible in leading order (meaning random). They *have* to be, because the theorems of Kolmogorov-Chaitin complexity guarantee that not just quantum systems, but *any* system that encompasses an exponentially large number of measured outcomes (loosely speaking), necessarily will encompass randomness in those measured outcomes.
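The incompressibility claim is easy to demo on classical data (my own toy comparison, using zlib as a crude stand-in for Kolmogorov complexity; the record lengths are arbitrary):

```python
import os
import zlib

# Toy compressibility test: a genuinely random record barely compresses,
# while an equally long but highly patterned record compresses enormously.
random_record = os.urandom(100_000)      # stand-in for raw interferometer bits
patterned_record = b"01" * 50_000        # a record with obvious structure

random_ratio = len(zlib.compress(random_record, 9)) / len(random_record)
patterned_ratio = len(zlib.compress(patterned_record, 9)) / len(patterned_record)
# random_ratio is ~1 (incompressible); patterned_ratio is tiny
```

Of course zlib only finds the kinds of regularity it was designed to find; Kolmogorov complexity is uncomputable in general, which is why "incompressible in leading order" is the honest phrasing.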

So for us engineers, the question “How Big Are Quantum States” boils down to “How Compressible Are Quantum States?” And this means compressible not only in principle, but in practice. After all, the above-mentioned MRFM interferometer is a *machine* for creating compressible states … it is *designed* to create back-action that drags the quantum state into agreement with the measurement record … because that is how measurement *works* … measurement processes reflect *design* … whether that design is via cognition or via Darwinian evolution. 🙂

For most practical purposes—quantum computation being the main exception—it is highly desirable that quantum states be compressible, *and* that computations be feasible in the compressed representation. ’Cuz duh, “Whereof one cannot speak, thereof one must be silent” applies in science and engineering just as much as in philosophy.

That is why quantum physicists and quantum chemists and quantum system engineers spend pretty much all their time talking and/or computing with compressible quantum states that have computable representations.

This has created an embarrassment of riches: the empirical methods that people use in practical quantum computations are so numerous and various as to form a dense fog of literature that makes it hard to discern the outlines of the informatic subject that we are all *really* studying.

Therefore—from purely practical motivations—I am particularly interested in fog-dispelling informatic tools, and I am hopeful that Scott’s notions of quantum learnability can be added to humanity’s fog-dispelling tool-set.
