### Research (by others) proceeds apace

Wednesday, January 27th, 2021

At age 39, I already feel more often than not like a washed-up has-been in complexity theory and quantum computing research. It’s not intelligence that I feel like I’ve lost, so much as two other necessary ingredients: *burning motivation* and *time*. But all is not lost: I still have students and postdocs to guide and inspire! I still have the people who email me every day—journalists, high-school kids, colleagues—asking this and that! Finally, I still have this blog, with which to talk about all the exciting research that *others* are doing!

Speaking of blogging about research: I know I ought to do more of it, so let me start right now.

- Last night, Renou et al. posted a striking paper on the arXiv entitled Quantum physics needs complex numbers. One’s immediate reaction to the title might be “well duh … who ever thought it didn’t?” (See this post of mine for a survey of explanations for why quantum mechanics “should have” involved complex numbers.) Renou et al., however, are interested in ruling out a subtler possibility: namely, that our universe is secretly based on a version of quantum mechanics with real amplitudes only, and that it uses extra Hilbert space dimensions that we don’t see in order to *simulate* complex quantum mechanics. Strictly speaking, such a possibility can *never* be ruled out, any more than one can rule out the possibility that the universe is a classical computer that simulates quantum mechanics. In the latter case, though, the whole point of Bell’s Theorem is to show that *if* the universe is secretly classical, then it also needs to be radically nonlocal (relying on faster-than-light communication to coordinate measurement outcomes). Renou et al. claim to show something analogous about real quantum mechanics: there’s an experiment—as it happens, one involving three players and two entangled pairs—for which conventional QM predicts an outcome that can’t be explained using any variant of QM that’s both local and secretly based on real amplitudes. Their experiment seems eminently doable, and I imagine it will be done in short order.

- A bunch of people from PsiQuantum posted a paper on the arXiv introducing “fusion-based quantum computation” (FBQC), a variant of measurement-based quantum computation (MBQC) and apparently a new approach to fault-tolerance, which the authors say can handle a ~10% rate of lost photons. PsiQuantum is the large, Palo-Alto-based startup trying to build scalable quantum computers based on photonics. They’ve been notoriously secretive, to the point of not having a website. I’m delighted that they’re sharing details of the sort of thing they hope to build; I hope and expect that the FBQC proposal will be evaluated by people more qualified than me.

- Since this is already on social media: apparently, Marc Lackenby from Oxford will be giving a Zoom talk at UC Davis next week, about a quasipolynomial-time algorithm to decide whether a given knot is the unknot. A preprint doesn’t seem to be available yet, but this is a big deal if correct, on par with Babai’s quasipolynomial-time algorithm for graph isomorphism from four years ago (see this post). I can’t wait to see details! (Not that I’ll understand them well.)
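To make the first item concrete: the standard way to simulate complex QM with real amplitudes and extra dimensions is to encode a state ψ ∈ ℂⁿ as the real vector (Re ψ, Im ψ) ∈ ℝ²ⁿ, and a unitary U = A + iB as the real orthogonal block matrix [[A, −B], [B, A]]. Here’s a minimal NumPy sketch of that encoding for a single system (function names are mine); Renou et al.’s point is that no *local* version of this trick can reproduce the statistics of their three-player experiment.

```python
import numpy as np

def realify_state(psi):
    """Encode a complex state in C^n as a real vector in R^{2n}."""
    return np.concatenate([psi.real, psi.imag])

def realify_unitary(U):
    """Encode a complex unitary U = A + iB as the real orthogonal
    block matrix [[A, -B], [B, A]] acting on the doubled space."""
    A, B = U.real, U.imag
    return np.block([[A, -B], [B, A]])

# Example: apply a Hadamard, then the phase gate S, to |0>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
S = np.diag([1, 1j])
psi = np.array([1, 0], dtype=complex)

complex_result = S @ H @ psi
real_result = realify_unitary(S) @ realify_unitary(H) @ realify_state(psi)

# The doubled real evolution tracks the complex one exactly:
assert np.allclose(real_result[:2], complex_result.real)
assert np.allclose(real_result[2:], complex_result.imag)
```

The encoding works because U ↦ [[A, −B], [B, A]] is a homomorphism: products of complex unitaries map to products of the corresponding real orthogonal matrices, so any complex circuit can be tracked in the doubled real space.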