My Y Combinator podcast

Here it is, recorded last week at Y Combinator’s office in San Francisco.  For regular readers of this blog, there will be a few things that are new—research projects I’ve been working on this year—and many things that are old.  Hope you enjoy it!  Thanks so much to Craig Cannon of Y Combinator for inviting me.

Associated with the podcast, Hacker News will be doing an AMA with me later today.  I’ll post a link to that when it’s available.  Update: here it is.

I’m at STOC’2018 TheoryFest in Los Angeles right now, where theoretical computer scientists celebrated the 50th anniversary of the conference that in some sense was the birthplace of the P vs. NP problem.  (Two participants in the very first STOC in 1969, Richard Karp and Allan Borodin, were on a panel to share their memories, along with Ronitt Rubinfeld and Avrim Blum, who joined the action in the 1980s.)  There’s been a great program this year—if you’d like to ask me about it, maybe do so in the comments of this post rather than in the AMA.

32 Responses to “My Y Combinator podcast”

  1. Craig Cannon Says:

    Thanks for making time, Scott! Great meeting you after running into your blog so often 🙂

  2. TR Says:

    I’m also attending STOC and it is great. Still, one cannot help thinking that 30-40-50 years ago was a golden age different from ours. If they were Renaissance Florence, are we doing at least cubism/expressionism, or have we already reached the Andy Warhol/Damien Hirst stage?

  3. Douglas Knight Says:

    Can we ask questions here instead?

    It is almost tautological that quantum computers can compute chemistry. Is it plausible to design a robust quantum computer that would compute chemistry, even if it turns out that quantum computers can’t compute BQP?

    In other words, if the noise conspires to prevent quantum computers from factoring, then it also conspires to make chemistry easier, but can we exploit that without knowing what the conspiracy actually is? Of course, we’d like to know how the noise actually behaves, but we’d also like to make powerful computers without doing the basic science of understanding why quantum computers don’t work the way we expect.

  4. adviceneeded Says:

    Breakthroughs on near horizon?

  5. adviceneeded Says:

    What does ‘if the noise conspires to prevent quantum computers from factoring, then it also conspires to make chemistry easier’ mean?

  6. Scott Says:

    TR #2: I don’t know enough about art history to answer your question! But one obvious disanalogy between the cases is that, with painting, people could achieve arbitrary degrees of photorealism and so forth hundreds of years ago, so then there was no place to go except to explore new realms (cubism, Impressionism, etc.) primarily because they were new. Whereas in TCS, there were many questions raised near the beginning that we still don’t have the answers to: P vs NP, the exponent of matrix multiplication, you name it. As long as those questions are open, I feel like TCS will remain alive.

    Of course I wasn’t born for STOC’71, so I can’t speak to the experience of hearing Cook first publicly articulate the P vs NP problem and the concept of NP-completeness, and compare it to attending STOC now. But, like, were you at the talk where they proved the first simply exponential upper bound on the number of stable matchings, settling one of Knuth’s problems from the 1970s? Or the talk that settled the complexity of the art gallery problem (speaking of art 🙂 )? Or Murray and Williams’ separation between NQP and ACC? Or Sanjeev Arora’s tutorial talk where he discussed his proposed solution to the “overfitting paradox” of deep learning? I actually felt like there was some pretty awesome stuff this year.

  7. Scott Says:

    Douglas #3: I completely agree with you that, if quantum computers are fundamentally impossible, then Nature itself is easier to simulate than we thought, so in any case it’s almost tautological that chemistry can be efficiently simulated on a computer. (I say “almost” because we’re still assuming the possibility of a single, universal simulating machine, rather than an infinite collection of different machines—something that’s true for classical computing, quantum computing, and other “reasonable” models of computation, but that might not be true for some arbitrary intermediate model.)

    But the whole enterprise of simulating chemistry on a computer involves mapping one system to another system with completely different underlying physics, with the only tether between the two being a mathematical isomorphism rooted in our understanding of physical law. So it sort of beggars belief to me that one could make this work absent a scientific understanding of what the underlying rules are.

  8. John Sidles Says:

    In regard to the efficient quantum simulation of condensed matter physics, the Simons Foundation’s Flatiron Institute presently is advertising three full-time cutting-edge research positions at their Center for Computational Quantum Physics (CCQ, description here), as well as multiple research positions relating to computational biology.

    Best wishes are extended to any and all Shtetl-Optimized readers who apply! 🙂

    Also worth a look, the lectures for the recent Simons Foundation workshop “Challenges in Quantum Computation” (June 11-15) are now on-line; particularly commended are John Martinis’ “Quantum Supremacy: Checking a Quantum Computer with a Classical Supercomputer” and Scott’s own lecture “Quantum Supremacy and its Applications” (lectures here).

    A unifying take-home message (for me) is that the mathematically natural pullback of noisy QED systems onto low-dimensional algebraic varieties yields computational simulations that are accurate and efficient. Here “noisy QED systems” encompasses any system that continuously leaks information, via photons and/or phonons, into a vacuum and/or event horizon and/or water bath.

    In practice (and needless to say) the class of noisy QED systems encompasses pretty much all experimentally accessible (to date anyway) condensed matter systems, including all biological systems. Perhaps Google’s qubit devices (the ones that John Martinis’ lecture describes) will prove to be an exception. Or perhaps not! 🙂

  9. William Hird Says:

    @John #8

    Let’s see: Levin, Goldreich, and Kalai all think that QCs will not work. Seems like a pretty strong team to me! If it turns out that they are right, all is not lost: QCs will make perfect random number generators 🙂

  10. Rational Feed – deluks917 Says:

    […] Y Combinator Interview by Scott Aaronson – Scott is interviewed by Y combinator: Quantum Computers, P vs NP, how to channel AI growth but not weaponize it, career advice. […]

  11. mjgeddes Says:

    I don’t agree with you on Twitter, Scott. It’s true there are roving ‘outrage mobs’, but you can totally control what appears in your feed (and in the event of being targeted by a mob, one can simply make one’s profile private for a couple of months until it blows over).

    I’ve actually found Twitter a great news resource: you can rapidly access all sorts of interesting snippets regarding what’s happening in the world. For example, you can make up lists (‘feeds’) on specific topics and really keep up to date with loads of interesting science and tech stuff. I just want to give you a sample, by linking to a couple of my Twitter news feeds. Have a look:

    Sci/Tech Feed:
    https://twitter.com/zarzuelazen/lists/sci-techfeed

    TechBiz Feed:
    https://twitter.com/zarzuelazen/lists/techbizfeed

    Crypto-currency Feed:
    https://twitter.com/zarzuelazen/lists/crypto-currrencyfeed

    AI Feed:
    https://twitter.com/zarzuelazen/lists/aifeed

    Besides, all the top rationalists are regularly tweeting all sorts of interesting tidbits…legends such as Hanson, Goertzel, Sandberg, etc.

  12. Craig Says:

    I don’t understand this: https://www.zmescience.com/science/news-science/quantum-entanglement-record-0432/

    How is 18-qubit entanglement a world record? Doesn’t IBM already have a working 20-qubit machine?

  13. fred Says:

    Douglas #3
    “if the noise conspires to prevent quantum computers from factoring, then it also conspires to make chemistry easier, but can we exploit that without knowing what the conspiracy actually is?”

    It’s like saying that you can stack three hundred billion pennies to reach the moon (nothing in the laws of physics prevents this, but it’s a hard engineering feat) because… you can also throw the same three hundred billion pennies in a random pile, and that pile would be just as hard to reproduce *exactly* as the neat stack… so the two are equivalent.
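
    As a quick sanity check on those numbers (a back-of-the-envelope sketch in Python; the assumed inputs are a mean Earth-Moon distance of ~384,400 km and a US penny thickness of ~1.52 mm):

```python
# Rough check of the penny-stack analogy (assumed values, not measurements):
moon_distance_m = 384_400e3    # mean Earth-Moon distance, in meters
penny_thickness_m = 1.52e-3    # US penny thickness, in meters

pennies_needed = moon_distance_m / penny_thickness_m
print(f"{pennies_needed:.2e} pennies")   # ~2.53e+11, i.e. a few hundred billion
```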

  14. Craig Says:

    Doug 3 and Scott 7,

    If QM were only true to, say, 12 decimal places, but after the 12th decimal place the rest of the digits were random, then how would things look? I would think that simulating things on a digital computer would be easy, except that in some situations you might get different answers due to the randomness of the remaining digits, right?

  15. Scott Says:

    Craig #12: I don’t know. Can’t you give me a link to the actual paper? I despise having to decode popular articles that claim an experimental breakthrough without answering the duh-obvious questions about it, and people asking me to do that is one of the commonest and least favorite occurrences of my life (not that this comment will stop anyone).

    FWIW, my guess would be that 18 qubits is indeed some sort of record for photonics. My guess would also be that larger entangled states have been made with superconducting qubits, trapped ions, optical lattices, and maybe other systems, but on the other hand, there might not have been a measurement of an entanglement witness to directly demonstrate it (was there in this case?).

  16. Scott Says:

    Craig #14: The idea that QM is only “accurate to 12 decimal places” has long been (trivially) ruled out by experiment. If you polarize 1000 photons each at 45 degrees, you’ll get amplitudes of order 2^(-500) each, but we know the predictions of QM are correct for such systems—or even for superpositions of several such systems (as shown by e.g. the buckyball double-slit experiment). You may say “that doesn’t count, because those systems either don’t have entanglement or else have very limited forms of entanglement.” If you did, though, that would be an after-the-fact objection, not (clearly) something you thought of at the time. Which brings up a deeper point: these examples already suffice to show that the “Sure/Shor separator,” as I called it way back in 2003, if it exists, can’t possibly be anything as simple as precision in amplitudes. It would have to be some more complicated criterion related to the “complexity” of the state—something that we don’t have any workable example of, but that nevertheless, the experimentalists will be all too happy to try to rule out in the coming years.
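
    To spell out the arithmetic behind that number, here's a minimal Python sketch (it just tracks base-2 exponents, since 2^(-500) underflows ordinary floating point):

```python
# Each photon polarized at 45 degrees has amplitude 1/sqrt(2) on each
# computational-basis outcome, so n independent photons give every one of
# the 2^n outcomes an amplitude of (1/sqrt(2))^n = 2^(-n/2).
n = 1000
log2_amp = -n / 2
print(f"each amplitude ~ 2^{log2_amp:g}")            # 2^-500

# Sanity check: 2^n outcomes, each with probability 2^(2 * log2_amp) = 2^-n,
# so the probabilities still sum to 1.
log2_total_prob = n + 2 * log2_amp
print(f"total probability = 2^{log2_total_prob:g}")  # 2^0 = 1
```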

  17. Craig Says:

    Scott 16,

    For the polarization of photons at 45 degrees, it depends on which basis one chooses whether the amplitude is exponentially small or not. If one chooses to measure the photons in the diagonal direction, one gets an amplitude of one. I would guess that this is the criterion for the Sure/Shor separator, but I am probably wrong, given that you have thought about this more and did not find a clear answer.

  18. Scott Says:

    Craig #17: It’s easy to give examples of many-particle entangled states for which we have excellent experimental evidence—e.g., “cluster states” of the sort that occur in optical lattices and also in condensed-matter systems—for which there’s no local change of basis that avoids exponentially small amplitudes.
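
    For concreteness, here's a minimal numerical sketch of that claim (an illustration, not a proof: random local bases won't find the optimal product-state overlap, though for cluster states even that optimum is known to be exponentially small):

```python
import numpy as np

def cluster_state(n):
    """1D cluster state: |+>^n followed by CZ on each neighboring pair."""
    psi = np.ones(2 ** n) / np.sqrt(2.0 ** n)
    for q in range(n - 1):
        for idx in range(2 ** n):
            if (idx >> q) & 1 and (idx >> (q + 1)) & 1:
                psi[idx] *= -1.0   # CZ flips the sign when both bits are 1
    return psi

n = 10
psi = cluster_state(n)
print(np.unique(np.abs(psi)))      # every amplitude has magnitude 2^(-n/2)

# Apply an independent random single-qubit unitary to each qubit (a "local
# change of basis") and look at the largest amplitude that results.
rng = np.random.default_rng(0)
state = psi.reshape([2] * n).astype(complex)
for q in range(n):
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    u, _ = np.linalg.qr(z)         # a random 2x2 unitary
    state = np.moveaxis(np.tensordot(u, state, axes=([1], [q])), 0, q)
print(np.abs(state).max())         # still tiny, nowhere near 1
```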

  19. James Gallagher Says:

    It’s not very precise to talk about quantum mechanics being precise to x number of decimal places: precise in what respect? The linearity is considered almost exactly correct, after many experiments.

    Things that are considered possible sources of imprecision are: the accuracy of the Born Rule, the possibility of discreteness in time evolution, and the possibility of discreteness in space.

    Only the first two of these would impact QC.

    The first one, the accuracy of the Born Rule, is essentially what the quantum-noise proponents are suggesting is uncertain.

    The second one, discreteness of time evolution, has no conclusive experimental evidence for or against it, but it too would impact QCs.

    The third one, discreteness of space, is not really relevant to QC, since quantum mechanics doesn’t operate in real “space”: it operates in a complex-valued Hilbert space, and whether we have six small dimensions and three large, or whatever consistent variants there are, doesn’t matter.

  20. Yaqub Says:

    Hi Scott, is it logically possible that P≠NP even if time travel is possible?
    Also, would you consider doing a book review of Supernormal by Dean Radin, which covers parapsychology research, including precognition? I figured computational complexity should help in assessing the astonishing claims made in it.

  21. Craig Says:

    I looked up cluster states. It seems that all of the basis vectors have amplitudes of the same magnitude, exponentially small, except some are negative and some are positive. But how can anyone know this? If one measures the state many times, one observes only uniform randomness. One would have to apply unitary transforms to these states to tell that some amplitudes are positive and some are negative. Have they done this?

  22. Scott Says:

    Yaqub #20: Yes, P≠NP is a mathematical conjecture, which could hold regardless of what’s true about the physical world. People constantly confuse it with a related but different conjecture, namely that NP-complete problems require superpolynomial resources in the physical world. The latter is the conjecture that would be seriously threatened by the existence of closed timelike curves: for more, see the time travel section of Why Philosophers Should Care About Computational Complexity.

    Sorry, but I don’t have any great interest in reading books on parapsychology right now. I liked the take in this Slate article: namely, the fact that researchers like Daryl Bem were able to get “strong evidence for ESP” while following all the correct statistical procedures of psychology was actually an early hint of the replication crisis that we now know to infect thousands of even mainstream psychology studies. I.e., the thing that was broken was not the known laws of physics, but rather the norms for conducting and publishing psychology studies.

  23. Scott Says:

    Craig #21: Yes, of course you don’t just measure the states in the computational basis; you measure in other bases (i.e. “apply a unitary transform first”). And yes, the relevant experiments have been done, on cluster-state-like condensed matter systems—so, not systems where one has the amount of control that one hopes to have with a universal QC, but the results (e.g., measurements of magnetic susceptibility) were consistent with the hypothesis of cluster-state-like entanglement, and ruled out various physically reasonable hypotheses where one wouldn’t have such entanglement. See for example the 2003 Nature paper by Ghosh et al.
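
    As a toy version of what such experiments check, here's a minimal numpy sketch (an illustration only, not the actual Ghosh et al. measurement): for a 3-qubit cluster state, the computational-basis statistics look uniformly random, but the stabilizer observables (equivalently, measurements made after suitable single-qubit rotations) all return +1, whereas they would average 0 on the maximally mixed state.

```python
import numpy as np

X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])
I2 = np.eye(2)

def op_on(ops):
    """Tensor product with ops[q] acting on qubit q (bit q of the index)."""
    out = np.array([[1.0]])
    for op in reversed(ops):
        out = np.kron(out, op)
    return out

# 3-qubit 1D cluster state: |+++>, then CZ on pairs (0,1) and (1,2).
psi = np.ones(8) / np.sqrt(8.0)
for a, b in [(0, 1), (1, 2)]:
    for idx in range(8):
        if (idx >> a) & 1 and (idx >> b) & 1:
            psi[idx] *= -1.0

print(np.round(psi ** 2, 4))   # computational-basis probabilities: all 1/8

# Cluster-state stabilizers (X on a qubit, Z on its neighbors): expectation
# +1 on the cluster state, but 0 on the maximally mixed state (Paulis are
# traceless), so measuring them distinguishes the two.
for name, ops in [("X0 Z1", [X, Z, I2]),
                  ("Z0 X1 Z2", [Z, X, Z]),
                  ("Z1 X2", [I2, Z, X])]:
    print(name, round(float(psi @ op_on(ops) @ psi), 6))   # +1 each
```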

  24. Craig Says:

    Scott 23,

    OK, then if (and when) QC fails, perhaps getting a precise answer to the Sure/Shor separator problem will replace the unification of gravity and QFT as the most important problem in physics. And all of the failed quantum computers will become the instruments for experimenting with this problem.

  25. Scott Says:

    Craig #24: That would be great!

    Of course, as I never tire of pointing out, that position doesn’t yield any argument against moving full speed ahead with QC research, quite the opposite in fact.

  26. Conrado Says:

    Have you read Factor Man? (Since you’re a character in it…) What did you think about it?

    (I was way too bothered by it mixing up key size and level of security, thus claiming that 128-bit numbers are hard to factor…)

  27. gentzen Says:

    James Gallagher #19: Discussions about “Quantum Mechanics being precise to x number of decimal places” between a naive person initiating the discussion and an expert feel somehow strange to me. At least for me, “Quantum Mechanics” could mean different things here. For Scott Aaronson and other computer scientists, it essentially means that classical probability theory gets replaced with a generalized probability theory allowing negative probabilities or complex amplitudes. For Werner Heisenberg, it essentially meant the well-worked-out part of quantum physics including Dirac’s equation, chemistry, and solid state physics, but excluding quantum field theory. For a modern physicist, it could mean the experimentally well-verified part of quantum physics, including quantum field theory, the standard model, superconductivity, and all of the older applications, but excluding string theory, loop quantum gravity, and even the relatively well understood QFT in curved spacetime.

    If “Quantum Mechanics” is interpreted in the sense of Werner Heisenberg, then QFT shows that even linearity is not very precise, at least not for the Maxwell equations in vacuum (where one can experimentally reach the intensities at which the non-linearity is non-negligible). A case where the inaccuracy of non-QFT “Quantum Mechanics” might be easier to understand is composite bosons, like alpha particles, helium atoms, or Cooper pairs (of electrons). The amplitude for an exchange of an internal neutron (of two alpha particles or of two helium atoms) is just so extremely small (for the energies relevant in chemistry and solid state physics) that those bosons internally composed of (an even number of) fermions simply are perfect bosons, in that context. (Ironically, even though this inaccuracy is easier to understand in theory, it is not an inaccuracy in practice.)

    If “Quantum Mechanics” is interpreted in the sense of a generalized probability theory, then it becomes harder to understand what exactly is meant by its accuracy. If one focuses on the Hilbert space aspect, then it probably means that all the unitary transformations or orthonormal bases one might want to use can be considered perfectly unitary (and linear) or perfectly orthonormal, without introducing any relevant inaccuracies. (So if you found a unitary transformation one might want to use for some quantum algorithm for which the inaccuracies are relevant, then you would have refuted the assumption of accuracy. But since quantum computing is typically done at low energies and low temperatures, you must assume even lower energies than for Heisenberg’s interpretation.)

  28. Scott Says:

    Conrado #26: I didn’t read the whole novel, but the author (Matt Ginsberg) sent me a draft to make sure that I was OK with being used as a character, so I did read the parts that involved me. I basically shrugged and said sure, it’s fine, others have done a lot worse with my name! 🙂

    Later, Ginsberg visited UT Austin and I had a very enjoyable lunch with him. He’s led an extremely interesting and varied life.

    I confess that it’s basically impossible for me to read a novel that revolves around P vs. NP and get into it as a novel. Like, all I’m going to be doing is hovering over it with a red pen, looking for cringey aspects of the CS exposition. For this reason, I’m one of the worst people in the world to try to evaluate such a thing as fiction … so I won’t.

    I’ll mention, though, that when I had lunch with Ginsberg, I learned that he’s almost theologically committed to the belief that P=NP—like, not just as a conceit for his novel, but for real—and nothing I said could budge him from that opinion. Needless to say, I take a different view.

    But maybe I’d even go further. I think there’s genuine human drama in the quest to prove P≠NP: people throwing their minds against the ultimate limits of what could be calculated in the lifetime of the universe, trying (in some sense) to comprehend the space of all possible computational shortcuts so well as to be able to make the universal statement that not one of them can automate all mathematical creativity, and thereby to complete the unfinished business of Gödel and Turing from the dawn of the digital age. It wouldn’t be easy to write an engaging novel about this drama, but maybe Rebecca Goldstein or one or two others could do it. A P=NP novel, you might say, is the easier way out: the way with a deus ex machina, a whiz-bang conceit that then lets you tell a fairly standard story about a reclusive genius in possession of a trillion-dollar secret who’s being hunted by sinister government agents. It reminds me of the many sci-fi writers who, rather than trying to grapple with the immense human implications of millennia-long journeys between the stars, simply postulate a warp drive.

  29. James Gallagher Says:

    Hi gentzen #27

    I am not an expert by any means (but I had a reasonably good mathematical and scientific education)

    I’m not sure Scott wants a big discussion here, but I’d like to make a small point and maybe correct something I said above.

    I like to think about the stability of atoms since their formation in the early universe: that stability depends on the linearity (unitary evolution) of quantum mechanics being almost exactly correct.

    So I do not really find that part of quantum mechanics questionable.

    But when I said the Born Rule was the point of uncertainty for noise proponents, I think I was incorrect.

  30. gentzen Says:

    James Gallagher #29: I guess Scott doesn’t mind discussions in the comments. And you won’t get a big discussion with me anyway; I am simply not such a prolific writer, and have much too long response times for the typical online discussion. By the way, I am no real physics expert either. However, I would still like to understand what people have in mind when they use certain words like QM or accuracy.

    The Japanese Super-Kamiokande experiment told us that protons are more stable than some theories predicted, but neutrons are definitely not perfectly stable. Most atoms will also be less stable than protons, even if they are extremely stable compared to neutrons. But I guess your point is that they still only decay into particles from which heavier atoms can be created again under realistic circumstances. If they just decayed into pure energy (plus some light particles like neutrinos), then there would be no way for atoms to still exist today.

    On the other hand, the theory of inflation seems to imply that space and matter were created in a short time frame by an instability of some kind, which created things (while you worry about an instability destroying things).

    Now “being almost exactly correct” certainly means being more correct than “12 decimal places”, and I don’t worry about that kind of accuracy. But an accuracy of “100 decimal places” would seem excessive to me (for non-QFT QM). And I doubt that such an accuracy (of QM) would even be relevant for quantum computing. But for discussing that question, it is important what is meant by QM and its accuracy.

  31. Craig Says:

    Scott 15,

    Here is the paper: https://arxiv.org/abs/1801.04043

  32. mjgeddes Says:

    How about I write a novel where the characters produce a correct proof that P≠NP, and that proof is actually given in the book?

    Intuitively, P≠NP seems true. If P=NP, then it seems you could indeed, in effect, have a fixed set of equations that ‘solve’ creativity, thus reducing cognition to pure mathematics.

    P≠NP seems to imply the existence of what I call the ‘cognitive arrow of time’, whereby creativity can’t be reduced to fixed equations, but always needs imperfect heuristics that can evolve. This makes cognition distinct from pure mathematics (heuristics vs. fixed equations).

    Applying my famous superhuman intuition, I guess that the proof is closely related to geometry… computational complexity is in a sense a special case of geometry? It’s a ‘landscape’ of some sort? Am I close? Once I put my mind to proving P≠NP, I’m pretty sure I can wrap it all up in a weekend or two, Scott!

    Incidentally, get out your red marker pen and check out this brief Quanta magazine article about the top-7 complexity classes:

    https://www.quantamagazine.org/a-short-guide-to-hard-problems-20180716/