From another point of view, many opinions seem to take “simulation” to mean a state of affairs in which a simple model exists (a very general statement) and in which that model is discrete (a very restrictive statement). This conflates two completely different levels of rigidity: a very general one (a model is possible and effective) with a very strong one (the model is computable, or even discrete).

There are a lot of possibilities in between. Even an equation as simple as the classical wave equation has uncomputable solutions! The Bekenstein bound, which is a very clever calculation, is in tension with simple experiment: from a hammer blow to the LHC, the more energy we put in, the more degrees of freedom get excited. This pattern holds across roughly ten orders of magnitude of scale. If you say that a finite volume can contain only finite energy, that entropy is bounded (the black-hole argument), and that matter exists only as bosonic and fermionic states, then you get the bound. But what if it is a statement only about states of matter, and not about space itself? What if there are processes we do not know, by which in principle even more degrees of freedom can be excited? We are reasoning about something at the 10^(-23) scale based on observations at the 10^(-10) scale. That is a very weak hypothesis.
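For reference, the bound being criticized here is the Bekenstein bound: the entropy \(S\) of a system of total energy \(E\) that fits inside a sphere of radius \(R\) satisfies

\[
S \;\le\; \frac{2\pi k_B R E}{\hbar c},
\]

so finite energy in a finite volume implies a finite number of distinguishable states. The objection above is to extrapolating this bound to scales far below anything experimentally probed.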

The third thought is as follows. Suppose we have a very complicated system that we can simulate at such a detailed level that the simulation captures all details (that is, the model is exactly the same as reality, which is a very strong statement: at some level the model is indistinguishable from reality. In other words, there is a level of reality that is completely compressible into simple equations, and no other aspects of reality exist at that level). Then such a system (an “ultimate computing machine simulating complete reality at the ultimate level of complexity”) could itself be simulated. Would any compression be possible in such a second-level simulation? The answer is probably no (because if compression were possible, a further simplification of the model of reality would be possible, so the first simulation would not be exact reality but only a simplified model of it, just as for a pendulum). So we would have a system for which the simulation and the reality sit at the same level of complexity and require the same level of resources. From a practical point of view, then, if something can be simulated, eventually there must be a level at which the simulation and its model are indistinguishable in complexity.
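As a loose illustration of this incompressibility argument (an information-theoretic sketch, not physics): a data stream with simple underlying structure compresses well, while a stream that is already at maximal complexity does not shrink further, just as the hypothesized ultimate simulation admits no second round of compression.

```python
import os
import zlib

# A kilobyte with simple underlying structure ("a world governed by a
# simple law") compresses dramatically...
structured = bytes(1024)  # 1024 zero bytes

# ...while a kilobyte of algorithmically random data (standing in for an
# already-maximally-compressed description) does not shrink at all.
random_like = os.urandom(1024)

print(len(zlib.compress(structured)))   # a few dozen bytes at most
print(len(zlib.compress(random_like)))  # about 1024 bytes, or slightly more
```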

https://people.eecs.berkeley.edu/~christos/classics/Feynman.pdf

A different approach would be to drop this assumption and seek physical laws that are artifacts of the simulation, or convenient optimizations that limit the required computational resources. Some of the unresolved mysteries might be jitter, wander, rounding, and aliasing errors from the simulation.

For example: the universe is Newtonian, and quantum mechanics and general relativity are just an elegant way to add distance fog.

http://www.bizjournals.com/sanjose/news/2017/03/28/rigetti-quantum-computing-y-combinator-a16z.html

**Ethan** (#159) “All matter is quantum matter.”

Ethan’s principle extends to “All *computational* matter is quantum *electrodynamical* matter”, and this comment will argue that the extension is both natural and inspirational.

Specifically, this extension invites us to “parse carefully and thoughtfully” (per #157) the introduction to Nielsen and Chuang’s ground-breaking textbook *Quantum Computation and Quantum Information* (2000), which inculcates the following quantum worldview:

1.1.1 History of quantum computation and quantum information… What is quantum mechanics? Quantum mechanics is a mathematical framework or set of rules for the construction of physical theories. For example, there is a physical theory known as quantum electrodynamics which describes with fantastic accuracy the interaction of atoms and light. Quantum electrodynamics is built up within the framework of quantum mechanics, but it contains specific rules not determined by quantum mechanics. The relationship of quantum mechanics to specific physical theories like quantum electrodynamics is rather like the relationship of a computer’s operating system to specific applications software — the operating system sets certain basic parameters and modes of operation, but leaves open how specific tasks are accomplished by the applications.

The rules of quantum mechanics are simple but even experts find them counterintuitive, and the earliest antecedents of quantum computation and quantum information may be found in the long-standing desire of physicists to better understand quantum mechanics.

Seventeen years have passed since these words appeared, and at QIP 2017 significant new readings were in evidence. To appreciate these new readings, let’s parse the above three paragraphs in reverse order, starting with “the long-standing desire … to better understand quantum mechanics”.

To begin the parsing, there’s no shortage of literature establishing that the Nielsen and Chuang-asserted “mystery” of quantum mechanics reflects (at least partially) deficiencies in our understanding of classical mechanics. E.g. Jeremy Butterfield’s “On symplectic reduction in classical mechanics” (*Philosophy of Physics* 2005), and the references therein, provide a start in understanding the mathematical framework and the historical evolution of the essays that Terry Tao’s weblog groups under the tags ‘Navier-Stokes equations’, ‘Euler equations’, and ‘finite time blowup’.

Moreover, the mathematical toolset that Butterfield and Tao apply to classical dynamical systems has become essential to ongoing research in quantum gravity, emergent spacetime, and many other topics of interest to *Shtetl Optimized* readers.

In particular, the Butterfield/Tao work invites us to regard the (zero viscosity) Euler equations as homologous to (qubit-informatic) quantum mechanics, and the (finite viscosity) Navier-Stokes equations as homologous to (condensed matter) quantum electrodynamics.

In this light, the second paragraph of Nielsen and Chuang extends to a form that more aptly describes the range of research presented at QIP 2017:

The relationship of quantum electrodynamics (QED) to idealized physical theories (like quantum information theory) is rather like the relationship of a computer’s operating system to specific applications software — the operating system sets certain basic parameters and modes of operation, but leaves open how idealized computational capacities are to be demonstrated (or not) in practice.

To extend the metaphor, our universe’s QED “operating system” — which is the sole operating system that our experiments can “boot” into — imposes stringent and (at present) poorly-appreciated limits on the dimensionality and evolution of physically realizable dynamical trajectories.

In a nutshell, the Nielsen and Chuang “operating system” metaphor is more useful nowadays if we regard QED as the operating system that, perforce, is our sole platform for demonstrating Quantum Supremacy.

Appreciated in this light, the body of research presented at QIP 2017 provides a well-framed answer to a question that Scott posed back in his article “Multilinear formulas and skepticism of quantum computing” (arXiv:quant-ph/0311039, 2003), an article written shortly after the Nielsen and Chuang text appeared. This 2003 article asked:

“Exactly what property separates the quantum states we are sure we can create, from those that suffice for Shor’s factoring algorithm?”

Numerous works at QIP 2017, including in particular Garnet Chan’s plenary lecture “Simulating quantum systems on classical computers” (as discussed in #157), provide a QED-dependent answer:

Mathematically, the separatory property is small tensor rank; the physical mechanism is that QED unravellings restrict to small-rank varietal state-spaces.
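As a toy illustration of small tensor rank as a separating property (a numpy sketch, not the varietal machinery itself): the Schmidt rank of a bipartite pure state is the matrix rank of its reshaped amplitude vector, and product states have rank 1 while entangled states do not.

```python
import numpy as np

def schmidt_rank(state, dim_a, dim_b, tol=1e-12):
    """Count the significant Schmidt coefficients of a bipartite pure state."""
    singular_values = np.linalg.svd(state.reshape(dim_a, dim_b),
                                    compute_uv=False)
    return int(np.sum(singular_values > tol))

# Product state |0>|0>: Schmidt rank 1, cheap to represent classically.
product = np.kron([1.0, 0.0], [1.0, 0.0])

# Bell state (|00> + |11>)/sqrt(2): Schmidt rank 2, genuinely entangled.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

print(schmidt_rank(product, 2, 2))  # 1
print(schmidt_rank(bell, 2, 2))     # 2
```

States of low Schmidt (tensor) rank are exactly those that tensor-network methods simulate cheaply on classical hardware.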

Stated concisely and bluntly, an emerging and all-too-credible skeptical answer to Scott’s question postulates that Quantum Supremacy fails in QED universes because the Extended Church-Turing Thesis is true.

Two notable virtues of this nuanced, skeptical, QED-centric worldview (as it seems to me) are: (1) it inspires us (students especially) to read deeply and integratively in the marvelous STEAM literature of our age, and (2) it inspires us to appreciate, with equal respect, the works of a cadre of visionaries who have conceived the notion of Quantum Supremacy, concomitantly with the works of a cadre of visionaries who are providing explicit, diverse, and cumulatively strengthening reasons to appreciate why the infeasibility of Quantum Supremacy would be comparably marvelous to its feasibility.

If the universe has N particles in it, what’s the upper bound on how many qubits can be realized? (I would guess that a significant portion of the resources would have to be dedicated to error correction, etc.)

**Scott** imagines (in #156) “A universe with a finite-dimensional Hilbert space … with \(|x,a\rangle \to |x,a\oplus f(x)\rangle\), where \(f(x)\) encodes whether Turing machine \(x\) halts”

This is a universe in which Quantum Supremacy holds paradigmatically; so let’s ask all the questions we can about it.

**Q** What is the Hamiltonian of this universe?

For concreteness, imagine a universe of fixed dimension, supporting (say) \(10^9\) qubits, whose unitary evolution in some fixed finite time realizes the Halting Oracle. We conceive these qubits as a cubical array, \(10^3\) qubits on a side, as a Nature-supplied substance that we will call “pythium” (after Pythia, who was the oracular High Priestess of the Temple of Apollo at Delphi *circa* 700 BC).

By a finite computation, given the specified unitary evolution of the pythium during the specified oracular computation-time, we can calculate pythium’s Hamiltonian.
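To make that finite computation concrete in miniature (a toy sketch: the 2×2 unitary is an illustrative stand-in, not the actual \(10^9\)-qubit pythium), any unitary evolution \(U\) over time \(t\) determines a Hermitian generator \(H\) with \(U = e^{-iHt}\), recoverable via a matrix logarithm:

```python
import numpy as np
from scipy.linalg import expm, logm

# Toy stand-in for the specified oracular evolution: a single-qubit
# rotation plays the role of pythium's unitary U over a fixed time t.
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = 1.0

# Since U = exp(-i H t) for some Hermitian H, a matrix log recovers H:
H = 1j * logm(U) / t

print(np.round(H, 3))  # Hermitian generator; expm(-1j*H*t) reproduces U
```

The catch, as the following paragraphs note, is that nothing guarantees the recovered Hamiltonian has the local, sparse, physically realizable form that Nature's Hamiltonians possess.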

By arguments that this comment will not belabor, we appreciate that the pythium Hamiltonian \(H_P\) resembles Chaitin’s constant: it’s a well-defined mathematical entity whose properties we can imagine, but not concretely inspect or physically realize.

The properties of \(H_P\) stand in striking contrast to Nature’s locally-interacting photon-radiating gauge-invariant locally-relativistic informatically-compressible QED Hamiltonians, specifically in that \(H_P\) is (as far as we know) not sparse, not radiative, not gauge-invariant, not relativistic, and not informatically compressible.

In a nutshell, the same mathematical traits that make pythium’s Hamiltonian well-suited to demonstrations of Quantum Supremacy, are traits that Nature’s QED Hamiltonian conspicuously lacks. This helps us appreciate why (in QED universes) Quantum Supremacy seems (at present) to be exceedingly difficult to demonstrate (even effectively), while in contrast the Extended Church-Turing thesis (at present) seems entirely feasible to demonstrate (at least effectively).

For an in-depth description of effective demonstrations of the Extended Church-Turing thesis — valid solely for the locally-interacting photon-radiating gauge-invariant locally-relativistic informatically-compressible QED Hamiltonians that Nature supplies — see (for example) Garnet Chan’s plenary lecture at QIP 2017 “Simulating quantum systems on classical computers” (slides here, video here).

As the session chair Peter Shor said (sagely, as it seems to me) in his introduction to Chan’s QIP 2017 lecture

It’s my honor today to be able to introduce Garnet Chan, from Caltech. Professor Chan is one of the world’s experts on simulating quantum chemistry and quantum condensed matter on classical computers.

So in some sense that makes him our competitor, because what I think a lot of you would like to do is find algorithms on quantum computers that are better and faster than Professor Chan’s.

But we’re both working toward the same ultimate goal, which is understanding how quantum matter works.

Peter Shor’s remarks, and QIP 2017’s honoring of Garnet Chan with a plenary lecture, outstandingly exemplify (for me at least) a principal principle of quantum informatic discourse: that Quantum Supremacists and Quantum Skeptics should *both* focus their attention and energies upon the *best* attributes of the *entire* body of quantum informatic research.

That is why distilling the *best* attributes of Scott’s remarks here on *Shtetl Optimized* (or indeed anyone’s remarks) requires the excision of snark, the redaction of sardonicism, and the ignoring of rhetorical flourishes.

In particular, rhetorical flourishes like “accept no substitutes” (in regard to demonstrations of Quantum Supremacy) are most valuable when we parse them exceedingly carefully and thoughtfully, so as not to injudiciously undervalue contributions like Garnet Chan’s (or indeed DWAVE’s).

That is why (for me at least) Peter Shor’s introductory remarks at QIP 2017, and Garnet Chan’s subsequent lecture, hit the sweet spot in respect to our shared 21st century quest to appreciate Quantum Supremacy.
