Intuitively, P≠NP seems true. If P=NP, then it seems you could, in effect, have a fixed set of equations that ‘solve’ creativity, thus reducing cognition to pure mathematics.

P≠NP seems to imply the existence of what I call the ‘cognitive arrow of time’, whereby creativity can’t be reduced to fixed equations but always needs imperfect heuristics that can evolve. This makes cognition distinct from pure mathematics (heuristics vs. fixed equations).

Applying my famous super-human intuition, I guess that the proof is closely related to geometry … computational complexity is in a sense a special case of geometry? It’s a ‘landscape’ of some sort? Am I close? Once I put my mind to proving P≠NP, I’m pretty sure I can wrap it all up in a weekend or two, Scott!

Incidentally, get out your red marker pen and check out this brief Quanta magazine article about the top-7 complexity classes:

https://www.quantamagazine.org/a-short-guide-to-hard-problems-20180716/

Here is the paper: https://arxiv.org/abs/1801.04043

The Japanese Super-Kamiokande experiment told us that protons are more stable than some theories predicted, but free neutrons are definitely not stable. Most atoms will also be less stable than protons, even if they are extremely stable compared to free neutrons. But I guess your point is that they still only decay into particles from which heavier atoms can be created again under realistic circumstances. If they simply decayed into pure energy (plus some light particles like neutrinos), then there would be no way for atoms to still exist today.

On the other hand, the theory of inflation seems to imply that space and matter were created in a short time frame by an instability of some kind, which created things (while you worry about an instability destroying things).

Now “being almost exactly correct” certainly means being more correct than “12 decimal places”, and I don’t worry about that kind of accuracy. But an accuracy of “100 decimal places” would seem excessive to me (for non-QFT QM). And I doubt that such an accuracy (of QM) would even be relevant for quantum computing. But for discussing that question, it is important what is meant by QM and its accuracy.

I am not an expert by any means (but I had a reasonably good mathematical and scientific education).

I’m not sure Scott wants a big discussion here, but if I could make a small point and maybe correct something I said above.

I like to think about the stability of atoms since their formation in the early universe; that stability depends on the linearity (unitary evolution) of quantum mechanics being almost exactly correct.

So I do not really find that part of quantum mechanics questionable.
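To put a toy number on “almost exactly correct” (all figures below are invented purely for illustration), here is a sketch of why stability over cosmological time demands near-perfect unitarity: even a one-part-in-10^12 loss of norm per “step” of evolution, compounded over a cosmological number of steps, would destroy a state entirely.

```python
import math

# Toy sketch (made-up numbers): a tiny per-step violation of unitarity,
# compounded over many steps, wipes out a state's survival amplitude.
decay_per_step = 1e-12          # hypothetical non-unitarity per evolution step

# After a million steps the amplitude has barely moved...
amp = 1.0
for _ in range(10**6):
    amp *= (1 - decay_per_step)
print(amp)                      # ~0.999999

# ...but over ~1e18 steps (a cosmological number), it underflows to zero.
survival = math.exp(10**18 * math.log1p(-decay_per_step))
print(survival)                 # 0.0
```

The point is just the compounding: any fixed per-step deviation, however tiny, gets multiplied by an astronomically large step count.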

But when I said the Born Rule was the point of uncertainty for noise proponents, I think I was incorrect.

Later, Ginsberg visited UT Austin and I had a very enjoyable lunch with him. He’s led an extremely interesting and varied life.

I confess that it’s basically impossible for me to read a novel that revolves around P vs. NP and get into it *as a novel*. Like, all I’m going to be doing is hovering over it with a red pen, looking for cringey aspects of the CS exposition. For this reason, I’m one of the worst people in the world to try to evaluate such a thing as fiction … so I won’t.

I’ll mention, though, that when I had lunch with Ginsberg, I learned that he’s almost theologically committed to the belief that P=NP—like, not just as a conceit for his novel, but for real—and nothing I said could budge him from that opinion. Needless to say, I take a different view.

But maybe I’d even go further. I think there’s genuine human drama in the quest to prove P≠NP: people throwing their minds against the ultimate limits of what could be calculated in the lifetime of the universe, trying (in some sense) to comprehend the space of all possible computational shortcuts so well as to be able to make the universal statement that not one of them can automate all mathematical creativity, and thereby to complete the unfinished business of Gödel and Turing from the dawn of the digital age. It wouldn’t be easy to write an engaging novel about this drama, but maybe Rebecca Goldstein or one or two others could do it. A P=NP novel, you might say, is the *easier* way out: the way with a deus ex machina, a whiz-bang conceit that then lets you tell a fairly standard story about a reclusive genius in possession of a trillion-dollar secret who’s being hunted by sinister government agents. It reminds me of the many sci-fi writers who, rather than trying to grapple with the immense human implications of millennia-long journeys between the stars, simply postulate a warp drive.

If “Quantum Mechanics” is interpreted in the sense of Werner Heisenberg, then QFT shows that even linearity is not very precise, at least not for the Maxwell equations in vacuum (where one can experimentally reach the intensities for which the non-linearity is non-negligible). A case where the inaccuracy of non-QFT “Quantum Mechanics” might be easier to understand is that of composite bosons, like alpha particles, helium atoms, or Cooper pairs (of electrons). The amplitude for an exchange of an internal neutron (of two alpha particles or of two helium atoms) is just so extremely small (for the energies relevant in chemistry and solid state physics) that those bosons internally composed of (an even number of) fermions simply are perfect bosons, in that context. (Ironically, even though this inaccuracy is easier to understand in theory, it is not an inaccuracy in practice.)

If “Quantum Mechanics” is interpreted in the sense of a generalized probability theory, then it becomes harder to understand what exactly is meant by its accuracy. If one focuses on the Hilbert space aspect, then it probably means that all the unitary transformations or orthonormal bases one might want to use can be considered to be perfectly unitary (and linear) or perfectly orthonormal, without introducing any relevant inaccuracies. (So if you found a unitary transformation one might want to use for some quantum algorithm for which the inaccuracies are relevant, then you would have refuted the assumption of accuracy. But since quantum computing is typically done at low energies and low temperatures, you must assume even lower energies than for Heisenberg’s interpretation.)
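As a purely illustrative sketch of what a “relevant inaccuracy” in unitarity could mean (the function names `unitarity_defect` and `rx` are my own invention, not anyone’s standard API): one can measure how far a single-qubit gate is from being perfectly unitary by the largest entry of U†U − I. An ideal rotation gives essentially zero; a hypothetical slightly lossy map gives a defect equal to its loss parameter.

```python
import math

def matmul2(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(A):
    """Conjugate transpose of a 2x2 matrix."""
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def unitarity_defect(U):
    """Largest entry of |U†U - I|; zero exactly when U is unitary."""
    P = matmul2(dagger(U), U)
    I = [[1, 0], [0, 1]]
    return max(abs(P[i][j] - I[i][j]) for i in range(2) for j in range(2))

def rx(theta):
    """Ideal single-qubit rotation about the X axis."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -1j * s], [-1j * s, c]]

gamma = 1e-3                                  # hypothetical loss parameter
lossy = [[1, 0], [0, math.sqrt(1 - gamma)]]   # slightly non-unitary map

print(unitarity_defect(rx(0.7)))  # ~1e-16: unitary up to float rounding
print(unitarity_defect(lossy))    # ~1e-3: the defect equals gamma
```

Whether a defect of a given size matters for a quantum algorithm is exactly the question of accuracy being discussed above.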

(I was way too bothered by it mixing up key size and level of security, thus claiming that 128-bit numbers are hard to factor…)
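For what it’s worth, here’s a sketch of why 128-bit *numbers* (as opposed to 128-bit *security levels*, which correspond to roughly 3072-bit RSA moduli) are no challenge at all. The modulus below is deliberately unbalanced (one small prime factor) so that even pure-Python Pollard rho finishes almost instantly; balanced 128-bit moduli with two 64-bit primes fall in seconds to standard factoring tools anyway. All names and parameters here are my own illustration.

```python
import math
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin primality test (probabilistic)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def next_prime(n):
    """Smallest (probable) prime greater than n."""
    n += 1
    while not is_probable_prime(n):
        n += 1
    return n

def pollard_rho(n):
    """Pollard's rho: expected ~sqrt(p) steps to find the smallest factor p."""
    while True:
        c = random.randrange(1, n)
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n          # tortoise: one step
            y = (y * y + c) % n          # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:
            return d

p = next_prime(2**20)    # small prime factor, ~21 bits
q = next_prime(2**107)   # large cofactor, so n is 128 bits
n = p * q
f = pollard_rho(n)
print(n.bit_length(), f)  # a 128-bit number, factored in milliseconds
```

The expected work for rho scales with the square root of the *smallest* prime factor, which is why the unbalanced choice makes this a fair toy demonstration but not a model of RSA key cracking.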

Of course, as I never tire of pointing out, that position doesn’t yield any argument against moving full speed ahead with QC research, quite the opposite in fact.

OK, then if (and when) QC fails, perhaps getting a precise answer to the Sure-Shor separator problem will replace the unification of gravity and QFT as the most important problem in physics. And all of the failed quantum computers will become the instruments for experimenting with this problem.