Ten updates

If you like quantum, complexity, etc., then please read to the end! I’ve gotten a bunch of emails lately of the form “why haven’t you ever blogged about such-and-such?,” when it turned out that I damn well did blog about it; it was just somewhere down in a multi-item post.

1. Like many of you, I watched the US midterm election results with (mostly…) disappointment and dismay.  I think that history will judge us harshly for not totally and unequivocally rebuking everything Trump stands for and every politician associated with him.  But that’s not what I wanted to blog about today.

2. There was a breakthrough in communication complexity by Arkadev Chattopadhyay, Nikhil Mande, and Suhail Sherif: the first exponential separation between randomized communication complexity and log approximate rank for a total Boolean function f.  This falsifies the longstanding conjecture that these measures are polynomially related (though it doesn’t resolve the original log rank conjecture).  For those of you keeping score at home, the quantum communication complexity of f is sandwiched in between randomized CC and log approximate rank.  So, at least one of the following must now be true: either randomized CC is exponentially separated from quantum CC, or else quantum CC is exponentially separated from log approximate rank.  My money’s on the latter.
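
Schematically, writing R and Q for randomized and quantum communication complexity, \widetilde{\mathrm{rk}} for approximate rank, and M_f for the communication matrix of f, and suppressing constant factors and additive terms, the sandwich looks like this:

    \[ \log \widetilde{\mathrm{rk}}(M_f) \;\lesssim\; Q(f) \;\lesssim\; R(f) \]

So an exponential gap between the two ends of the chain has to open up across at least one of the two links; the new result doesn’t tell us which.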

3. Ewin Tang, who achieved fame with a quantum-inspired classical algorithm for recommendation systems (which I blogged about in July), is now back with quantum-inspired classical algorithms for principal component analysis and supervised clustering.  Well, with the announcements of such algorithms; details of the analysis are to come later.

4. A bunch of people asked me about the paper by Sergey Bravyi, David Gosset, and Robert Koenig, Quantum advantage with shallow circuits.  tl;dr: it’s great!  And it was deservedly a highlight of the QIP conference back in January!  That’s why it confused me when everyone started asking about it a couple weeks ago.  The resolution is that the paper was just recently published in Science magazine, which led to popular coverage like this, which in turn led to people asking me whether this result unconditionally proves P≠BQP (that is, quantum computers can solve more problems in polynomial time than classical computers), and if not why not.  The answer is no: the paper proves an unconditional separation, but one that’s a long long way from P≠BQP, or anything else that would entail solving the central open problems of complexity theory like P vs. PSPACE.  Basically, it shows there are problems solvable in constant time with a quantum computer that aren’t solvable in constant time classically, for suitable meanings of “problem” (namely, a relation problem) and “in constant time” (namely, NC0 circuits, in which each output bit depends on only a constant number of input bits).  I understand that a stronger separation has since been achieved, between quantum NC0 and classical AC0, in work that’s not yet on the arXiv.  The problems in question, however, are all easy to solve in P, or even in classical logarithmic time, given a polynomial number of parallel processors.

5. A bunch of people also asked me about the paper by Xun Gao and Luming Duan, Efficient classical simulation of noisy quantum computation.  This paper tries to formalize something that many of us have suspected/feared for years, namely that random quantum circuits (the whole thing is specific to random circuits) can tolerate only a tiny amount of noise and decoherence before they become efficiently simulable classically.  If true, this has obvious implications for the sampling-based quantum supremacy experiments that Google and others are planning for the next few years: namely, that all the engineering effort they’ve already been investing anyway to push down the noise rate will actually be necessary!  However, correspondence with the authors revealed that there’s a significant gap in the analysis as it currently stands: namely, the current proof applies only to closed quantum systems, which would (for example) rule out all the techniques that people eventually hope to use to achieve quantum fault-tolerance—all of which are based on constantly measuring subsets of the qubits, doing essentially error-free classical computation on the measurement outcomes, throwing away noisy qubits, and pumping in fresh qubits.  Gao and Duan say that they’re currently working on an extension to open systems; in my personal view, such an extension seems essential for this interesting result to have the informal interpretation that the authors want.

6. My friend Bram Cohen asked me to announce that his company, Chia, has launched a competition for best implementation of its Verifiable Delay Functions (VDFs), with real money rewards.  You can find the details at this Github page.
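
For readers who haven’t seen VDFs before: the core operation is a long chain of sequential squarings in a group of unknown order, which conjecturally can’t be parallelized, together with a short proof that makes the output quickly verifiable.  Here’s a toy Python sketch of just the evaluation step, using an RSA-style modulus purely for illustration; the actual contest is about class groups of imaginary quadratic fields, and the proof part is omitted entirely:

    # Toy VDF evaluation: T sequential squarings in a group of unknown order.
    # Illustrative only -- not the contest's class-group construction, and
    # with no proof of correct evaluation attached.
    def vdf_eval(x: int, T: int, N: int) -> int:
        y = x % N
        for _ in range(T):
            y = (y * y) % N  # each squaring needs the previous one: inherently sequential
        return y

    # e.g. y = vdf_eval(5, 10**6, N) for some large N of unknown factorization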

7. The second Q2B (“Quantum 2 Business”) conference, organized by QC Ware Corp., will be held this coming December 10-12, at the Computer History Museum in Mountain View.  There will be two keynote addresses, one by John Preskill and the other by me.  I hope I’ll get a chance to meet some of you there!

8. Longtime colleague and friend-of-the-blog Ashwin Nayak asked me to announce that the 2019 Conference on Computational Complexity, to be held July 18-20 in exciting New Brunswick, NJ, is now accepting submissions.  I hope to be there!

9. OK, what the hell: the 21st annual, and nearly impossible to capitalize correctly, SQuInT (Southwest Quantum Information and Technology) workshop will be held in February 2019 in Albuquerque, NM.  UT Austin is now a node of the SQuInT network, and I’ll hopefully be attending along with a posse of students and postdocs.  The deadline for abstract submission is coming up soon: Monday, November 12!

10. I went to morning Shabbat services in Austin this past weekend, exactly one week after the tragedy in Pittsburgh.  There was massively increased security, with armed guards interrogating us, Israeli-style, about why we had no membership sticker on our car, whether we knew the name of the rabbi, etc.  Attendance was maybe a factor of three higher than usual.  About thirty priests, ministers, and Christian seminary students, and one Muslim, came up to the bima to say a prayer of solidarity with Jews.  The mayor of Austin, Steve Adler, was also on hand to give a speech.  Then the rabbi read a letter to the synagogue from Sen. Ted Cruz denouncing antisemitism (well, parts of it; he said the letter was really long).  There were murmurs of disapproval from the crowd when Cruz’s name was mentioned, but then everyone quieted down and listened.  The thing is, the US and large parts of the world are now so far outside the norms of liberal democracy, in territory so terrifyingly uncharted since the end of World War II, that “shooting up synagogues is bad” is actually something that it’s better than not for powerful people to affirm explicitly.  Anyway, while I’m neither a believer nor much of a synagogue-goer, I found the show of unity moving.

98 Responses to “Ten updates”

  1. Geoffrey Irving Says:

    For (5), how much of that is similar to “graph automorphism is polynomial time on random graphs” in the sense that it’s about how easy random circuits are rather than how weak quantum computation is in general?

  2. Scott Says:

    Geoffrey #1: I don’t think we know yet. If the Gao-Duan result can be extended to open systems (i.e., circuits with intermediate measurements), then I think it really will have the implication that simulating a noisy quantum system is “easy on average (under the uniform distribution over circuits) even though it’s hard in the worst case.” Of course, this would only apply to noisy circuits. For noiseless random circuits, results over the last few years by Lijie Chen and myself, Bouland-Fefferman-Nirkhe-Vazirani, and others have given pretty strong evidence that they’re just as hard to simulate classically as you’d naïvely expect.

    If, on the contrary, the Gao-Duan result can’t be extended to open systems, then I’d say that it’s mostly just telling us about the weakness of noisy closed systems, analogous to this much earlier result by Aharonov et al.

  3. Edan Maor Says:

    RE: 10, what was the security like before that? I’ve been to my wife’s synagogue in Argentina, and there is pretty solid security there too, at least on the special occasions I’ve been there (including concrete blocks in front of the building to prevent cars coming close, and guards along half the street.) In Argentina it’s a reaction to a huge synagogue bombing that happened in 1998, I believe.

  4. Scott Says:

    Edan #3: Before they’d stop your car at the gate, but if they saw that you were, e.g., a family with kids, they’d just wave you past.

  5. Randall K McRee Says:

    The current political climate reminds me of a rule that my wife and I came up with concerning fights: silence means you agree with what was just said. Silence means “yes”. This forces you to respond to the other person’s point. Well, it seems like Trump stays silent when folks are gunned down. Wrong on both sides, Mr. Trump!?

    Yes, we need, all of us, to state the obvious, since our leaders will not–we condemn this violence. Silence at this juncture means you agree with the violence!

  6. Bram Cohen Says:

    To further plug the Chia competition: We’ve got class groups! $100,000 in prize money! Algorithms similar to factoring! Mathematical code optimization!

  7. Jelmer Renema Says:

    Scott, can you elaborate on why it’s so important for the Gao-Duan result to extend to open systems? I always thought that one of the key selling points of quantum advantage demonstrations was that you might hope to do something that didn’t involve error correction, thereby drastically cutting back on the number of physical qubits needed as compared to a scheme that implements error correction. Even if the Gao & Duan result doesn’t generalize to situations where you’re allowed to do error correction, surely there is some importance in the fact that it shows that you need error correction in order to do anything non-simulable, thereby pushing up the number of qubits needed for a QA demonstration by orders of magnitude?

  8. Think Says:

    ‘Like many of you, I watched the US midterm election results with (mostly…) disappointment and dismay. I think that history will judge us harshly for not totally and unequivocally rebuking everything Trump stands for and every politician associated with him. But that’s not what I wanted to blog about today.’ This is what I thought. With all the talk of a blue wave, only close to 40 flips? The crucial states look reddish. Will 2020 be the same?

  9. Scott Says:

    Jelmer #7: Because if you read their paper, they talk repeatedly about how quantum fault-tolerance is able to evade their result only because of its use of extremely non-random circuits. But I don’t see how to justify that interpretation without an extension of their result to open systems. If the result didn’t extend in that way, then it might have nothing to do with the randomness of the circuit (and so, e.g., with sampling-based quantum supremacy compared to any other kind of quantum computation); really it might just be about the system being closed. Of course another relevant question here is whether one could extend their result to arbitrary quantum circuits with a constant rate of noise and no other interaction with the environment (the Aharonov et al paper did something in that direction back in 1996, but they assumed a constant rate of noise per qubit per time step, rather than per gate).

    We’ve understood for years that, if you want to get quantum supremacy without fault-tolerance (and, say, 50-70 qubits), then it’s going to be an incredibly demanding numbers game, and there’s no reason to expect the QC to be able to recover from even a small number of errors. So, even if the Gao-Duan result can’t be extended in the above ways, yes, it helps to support the working assumption that Google and the others have been operating under.

  10. Dan Staley Says:

    How dismayed should we really be about the election? Of the races whose outcomes are decided (as of me writing this comment), Democrats won 225 out of 422 house races and 23 out of 32 Senate races. State-level Democrats and progressive ballot measures also did quite well.

    Sure, they didn’t win the Senate, but that’s more to do with the fact that so few of the seats up for re-election were Republicans. I don’t think history can judge us too harshly here.

  11. Scott Says:

    Dan #10: Conditioned on living in the sort of world where Trump could become president in the first place, Democrats did OK. Relative to normal expectations for a liberal democracy, it’s horrible.

  12. Jay Says:

    #11 Relative to normal expectations for a liberal democracy, it’s horrible.

    smart dude: Well said!

    snark dude: Maybe it’s about time we update these expectations?

  13. Job Says:

    This paper tries to formalize something that many of us have suspected/feared for years, namely that random quantum circuits (the whole thing is specific to random circuits) can tolerate only a tiny amount of noise and decoherence before they become efficiently simulable classically.

    It’s interesting (but not surprising in this context) that noise can actually make simulation easier. From a classical perspective, that’s really counterintuitive.

    It suggests that it’s easier to lower the complexity of a quantum circuit via random mutations than it is to increase it. You have to carefully craft a difficult circuit, and it’s easier to tear it down?

    On the other hand, if we take an instance of an NP-hard problem and randomly mutate it, it’s not going to get any easier, unless the noise is not uniformly distributed (in which case it might). Why is it different with random quantum circuits?

  14. Scott Says:

    Job #13: “Mutating” a quantum circuit C—in the sense of randomly changing a few of the gates to get a new circuit C’—is not the same thing as adding noise to C. In particular, the mutated circuit C’ would still map pure states to pure states, whereas a noisy circuit will map pure states to mixed states. Another way to express the difference is that with “mutating,” you make a small random change to C, but one that’s specific and known, whereas “adding noise” means averaging over many different small changes to C.

    The intuitive reason why adding noise makes a (random) quantum circuit easier to simulate classically, is that the noise averages out all the bumps and wiggles in the output distribution, and causes it to converge to just the uniform distribution. Indeed, that’s exactly what Gao and Duan prove; that’s how their result works (at least their Theorem 1).

    By contrast, I indeed wouldn’t expect “mutating” to usually make quantum simulation problems easy, just like it doesn’t usually make NP-complete problems easy—in both cases, mutating would only make the problem easy if the hard instances were extremely special and rare.
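
    To make the flattening intuition concrete, here is a minimal NumPy toy example (not anything from the Gao-Duan paper; all parameters are purely illustrative): it compares the output distribution of a small random circuit against the uniform distribution, with and without per-qubit depolarizing noise applied after each layer.

        import numpy as np

        rng = np.random.default_rng(0)
        n, depth, p = 4, 10, 0.05      # qubits, layers, depolarizing rate per qubit per layer
        d = 2 ** n

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        CZ = np.diag([1, 1, 1, -1]).astype(complex)

        def embed(op, i):
            """Embed an operator acting on qubits i, i+1, ... into the full n-qubit space."""
            k = int(round(np.log2(op.shape[0])))
            return np.kron(np.kron(np.eye(2 ** i), op), np.eye(2 ** (n - i - k)))

        def haar_1q():
            """Haar-random single-qubit unitary, via QR of a complex Gaussian matrix."""
            A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
            Q, R = np.linalg.qr(A)
            return Q @ np.diag(np.diag(R) / np.abs(np.diag(R)))

        def random_layer():
            """Random single-qubit gates on every qubit, followed by a row of CZs."""
            U = np.eye(d, dtype=complex)
            for i in range(n):
                U = embed(haar_1q(), i) @ U
            for i in range(n - 1):
                U = embed(CZ, i) @ U
            return U

        def depolarize_all(rho):
            """Single-qubit depolarizing channel with rate p, applied to every qubit."""
            for i in range(n):
                Xi, Yi, Zi = embed(X, i), embed(Y, i), embed(Z, i)
                rho = (1 - p) * rho + (p / 3) * (Xi @ rho @ Xi + Yi @ rho @ Yi + Zi @ rho @ Zi)
            return rho

        circuit = [random_layer() for _ in range(depth)]

        def output_distribution(noisy):
            rho = np.zeros((d, d), dtype=complex)
            rho[0, 0] = 1.0                      # start in |00...0>
            for U in circuit:
                rho = U @ rho @ U.conj().T
                if noisy:
                    rho = depolarize_all(rho)
            return np.real(np.diag(rho))         # Pr[each output bitstring]

        uniform = np.full(d, 1 / d)
        for label in ("noiseless", "noisy"):
            dist = output_distribution(label == "noisy")
            print(label, "TVD from uniform:", round(0.5 * np.abs(dist - uniform).sum(), 3))

    With settings like these, the noiseless distance stays at a constant of order 1, while the noisy one comes out much smaller and keeps shrinking as the depth or the noise rate grows.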

  15. Vincent Says:

    I started reading the Quantum Inspired Classical Algorithm paper and got confused over the following sentence in the introduction: “We, therefore, urge caution when claiming exponential speedups over classical algorithms.”

    The confusion is this: did the quantum algorithm that the paper dequantizes claim an exponential speed-up over the then-best classical algorithm?

    If yes, then the current paper presents a classical algorithm that gives an exponential speed-up over the fastest classical algorithm used today. In a world where user-friendly quantum computers are still a distant dream, but thousands of people are using classical computers to do PCA on bigger and bigger datasets every day, this would be HUGE. I would expect this exponential speed-up within the classical world to be the focus of the paper rather than the ‘quantum inspired’ part.

    But on the other hand, if the authors of the quantum algorithm did NOT claim an exponential speed-up over the best known classical algorithm, then why is the above quoted sentence in the paper in the first place?

    A shorter version of my question is: how does the new classical algorithm presented here compare to the existing classical algorithms? It seems that this is not directly addressed in the paper (but perhaps I am overlooking it).

  16. Vincent Says:

    A stupid technical question: I just wrote a comment here regarding your point 3. Now, the previous time I commented here (some months ago), my comment appeared after submission as visible only to me, with the additional info ‘your comment is awaiting moderation’. This time (5 minutes ago), after submitting, the comment just disappeared. Is this due to some updates in the commenting system that were made in the meantime, or did something go wrong and you never received my previous comment?

  17. Job Says:

    I forgot that quantum noise does more than add random gates; it also adds more qubits.

    That would be analogous to mutating an instance of size n to one of size m, where m is arbitrarily larger than n.

    And I think in that case, on average, you would get an easier instance. At least I would expect that for an NP-Complete problem, as n goes up, the ratio of difficult instances (relative to some solver) goes down.

  18. Scott Says:

    Job #17: No, the type of noise we’re talking about does not add any more qubits. Please reread comment #14; the issue is what I explained there.

  19. Scott Says:

    Vincent #15: Yes, it’s not out of the question that these new classical algorithms could be useful! However, it’s crucial to understand that, while the new algorithms run in polylogarithmic time,

    (1) they only work for low-rank matrices,

    (2) they require the input to be preprocessed in a binary search tree data structure, and

    (3) they return only limited statistical information about the solution.

    Now, since the quantum algorithms have exactly these same limitations, the classical algorithms do indeed prove that the quantum algorithms aren’t getting exponential speedups! But these issues will somewhat limit the applicability — by how much, I’m not sure.

    I also don’t remember exactly what language the earlier papers used in presenting the quantum algorithms, but they at least would’ve claimed exponential speedups over any known classical algorithm (which would’ve been true at the time…)
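
    For concreteness, here is a minimal sketch (illustrative only, not taken from any of the papers) of the kind of data structure meant in point (2): a binary tree over the squared magnitudes of a vector’s entries, supporting entry lookup, squared-norm queries, and length-squared sampling, each in time logarithmic in the dimension.

        import random

        class SampleQueryVector:
            """Binary tree over |v_i|^2: O(log n) entry lookup, squared-norm
            queries, and sampling of i with probability |v_i|^2 / ||v||^2.
            Illustrative sketch only."""

            def __init__(self, values):
                self.n = len(values)
                size = 1
                while size < self.n:
                    size *= 2
                self.size = size
                self.vals = list(values)
                # tree[1] = total squared norm; leaves live at indices size .. 2*size-1
                self.tree = [0.0] * (2 * size)
                for i, v in enumerate(values):
                    self.tree[size + i] = v * v
                for i in range(size - 1, 0, -1):
                    self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

            def query(self, i):
                """The i-th entry of the vector."""
                return self.vals[i]

            def squared_norm(self):
                return self.tree[1]

            def sample(self):
                """An index i drawn with probability |v_i|^2 / ||v||^2."""
                node = 1
                r = random.random() * self.tree[1]
                while node < self.size:
                    left = self.tree[2 * node]
                    if r < left:
                        node = 2 * node
                    else:
                        r -= left
                        node = 2 * node + 1
                return node - self.size

        v = SampleQueryVector([3.0, 1.0, 0.0, 2.0])
        print(v.squared_norm())   # 14.0
        print(v.sample())         # 0 w.p. 9/14, 1 w.p. 1/14, 3 w.p. 4/14

    The quantum-inspired algorithms assume this kind of sample-and-query access to the input; building the tree in the first place still takes time roughly linear in the number of nonzero entries.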

  20. John Stricker Says:

    @ Randall K McRee #5

    You write “it seems like Trump stays silent when folks are gunned down”, and “we need, all of us, to state the obvious, since our leaders will not–we condemn this violence. Silence at this juncture means you agree with the violence!”.

    I am stunned.

    It is my suggestion that your hatred of Trump has, somehow, warped your mind. Have you even bothered to do a minimal amount of research?! Please allow me:

    Trump: Pittsburgh Synagogue Shooting ‘An Assault On Humanity’ NBC News:
    https://www.youtube.com/watch?v=IiDb-l_dBYg

    Pittsburgh rabbi details meeting with President Trump (CNN):
    https://www.youtube.com/watch?v=3FkW1zUGGrU#t=1m17s

    Now, I understand I am a guest on this blog; also I am familiar with the political position of our esteemed host. But I would like to implore everyone here: Please do not add to the already immense division of your country by believing false narratives, no matter how plausible they seem to you.

    Peace be with you all.

  21. Scott Says:

    Vincent #16: All comments go first into my moderation queue—it’s been that way for years, because of some bad experiences with unmoderated comments. I normally approve comments within a few hours when there’s a new post up, and within a few days otherwise. If I miss a comment, feel free to shoot me an email.

  22. Job Says:

    In particular, the mutated circuit C’ would still map pure states to pure states, whereas a noisy circuit will map pure states to mixed states.

    Is that because the noise is effectively measuring the state of the system?

    Isn’t noise basically entangling the system with the environment? If the environment is describable by a quantum circuit, then why isn’t the noisy circuit just a larger circuit?

  23. Scott Says:

    Job #22: Yes, the noise physically corresponds to interaction between the QC and additional degrees of freedom outside the QC. But crucially, we don’t keep track of those additional degrees of freedom—only the effect of the decoherence on the QC. If we did have to keep track of the additional degrees of freedom and their entanglement with the QC, the task of simulation would be much much harder, and I’m guessing that Gao and Duan’s result would be false. (Or if true, it would certainly require a more elaborate proof.)

  24. Job Says:

    But crucially, we don’t keep track of those additional degrees of freedom—only the effect of the decoherence on the QC.

    OK, I understand now.

    And it’s interesting that the quantum supremacy experiments rely on random circuits, because now error correction is required.

    How many years does this add to a quantum supremacy result? I’ve looked at some of the implementation details for error-correction and my impression was that it was still far away.

    Also, where does this put NISQ devices?

  25. Scott Says:

    John #20: Indeed, I don’t agree with the attempts to hold Trump responsible for this shooting, or for a supposed failure to denounce it—as you say, the facts support neither charge. (The fact that the murderer considered Trump an immigrant-coddling “Jewish puppet” is a chilling reminder that, as awful as things are right now, there are those who want them to be even worse.)

    I do hold Trump more responsible than any other individual for the terrifying breakdown in civil and democratic norms that the US is currently undergoing. He’s not solely responsible, there are even those on the SJW left who bear some responsibility, but they all seem like amateurs compared to him. And, while of course this issue predates the rise of Trump, I hold all Republican leaders morally responsible for their refusal to rein in the weapons that allowed this, the California shooting this morning, and so many other tragedies to happen.

    I don’t blame the relatives of the murder victims who chose to meet Trump, but nor do I blame those who chose not to.

  26. Scott Says:

    Job #24: As I said before, the efforts of Google and others, toward near-term quantum supremacy experiments based on sampling problems, were already proceeding on the assumption that without fault-tolerance, you can handle almost no errors in your circuit at all if you want to see a speedup. Indeed, heuristic considerations already made it clear years ago that the signal you’re looking for, in this type of experiment, will fall off like at least 1/exp(k) if you have (say) k bit-flip errors. So, while Google and others crunched the numbers and concluded that they had a shot, every informed person knew (and knows) that getting the physical error low enough with 50-70 integrated qubits would be incredibly hard, and that there was no guarantee of success. In that respect, the Gao-Duan result hasn’t changed the experimental outlook. Instead, it provides better theoretical grounding for the view that heroic engineering efforts like the ones now being undertaken are actually needed if you want near-term quantum supremacy.
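
    As a rough back-of-the-envelope version of that numbers game (with purely illustrative numbers): if each of m gates independently fails with probability ε, then the fraction of runs with no error at all, which by the standard heuristic is roughly the fraction of the signal that survives, is

        \[ (1-\varepsilon)^m \;\approx\; e^{-\varepsilon m}, \qquad \text{e.g. } \varepsilon = 10^{-3},\ m \approx 10^{3} \;\Longrightarrow\; e^{-1} \approx 0.37. \]

    Each extra factor of a few in m or in ε therefore eats exponentially into whatever signal is left.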

  27. John Stricker Says:

    @ Scott #25

    Thank you for your response, much appreciated.

  28. Sniffnoy Says:

    I do hold Trump more responsible than any other individual for the terrifying breakdown in civil and democratic norms that the US is currently undergoing.

    Yeah? How about Newt Gingrich? :-/

  29. Job Says:

    In that respect, the Gao-Duan result hasn’t changed the experimental outlook. Instead, it provides better theoretical grounding for the view that heroic engineering efforts like the ones now being undertaken are actually needed if you want near-term quantum supremacy.

    But it pushes back what would otherwise be a compelling quantum supremacy experiment, right?

    I mean, my understanding was that the plan for demonstrating QS involved running random circuits without any kind of error correction.

    I don’t recall the details, but I don’t see how it would be compelling anymore. If it ever was, frankly it felt like a cop-out to me. And if error correction is required, then it really reschedules the whole thing.

    At that point, why even bother with random circuits? Sure, it’s more efficient in terms of gate count, but I’d rather see a half-size version of Simon’s algorithm than random circuits of questionable complexity.

  30. Gil Kalai Says:

    I also found the paper by Xun Gao and Luming Duan very interesting.
    Regarding noisy random circuits in the NISQ regime, what I would expect, based on the analogy with my 2014 paper with Guy Kindler on noisy Boson Sampling, is the following:

    a) For a very large amount of noise, the outcome distribution will be close to uniform.

    b) For smaller amounts of noise, the outcome distribution will represent a very low-level complexity class, easily simulable by classical computers. (We refer to the class as LDP, for “Low Degree Polynomials”, and it is well inside AC^0.) The outcome distributions will be very far from the noiseless distribution. (Technically, only low-level Fourier contributions will “survive”.)

    c) For even smaller levels of noise, in a wide range of noise rates, the outcomes will be far from the noiseless distribution and will strongly depend on the fine parameters of the noise. In other words, the outcomes would be chaotic. (Technically, intermediate-level Fourier contributions will “survive” but will be largely affected by fine parameters of the noise.)

    d) For a tiny amount of noise, quantum supremacy will be demonstrated.

    Scott is correct in regarding the effort needed to reach step d) as “heroic”. I expect that the current experimental efforts even for a small number of qubits (9-20) represent the b)-c) noise levels and are far from what is needed to get close to the noiseless distributions.

    I am intrigued by the fact that the Gao-Duan result asserts that the distributions will be very close to uniform, namely that even the low-level Fourier contributions are very small.

    As some people may know, I regard the analysis of our paper (and the conjecture that the results are widely extendable) as giving a good argument, of a computational-complexity nature, for why we cannot expect the heroic effort for reaching the low level of noise needed for quantum supremacy to succeed. (I also expect that this argument extends to the efforts to create stable qubits based on quantum error-correction, and will have further interesting consequences regarding noisy quantum systems.) As Scott said, the fact that a small amount of noise is devastating is not surprising. My interpretation, however, is still not widely understood or accepted, and I myself will be happy to see further relevant theoretical and experimental results.

    The old important result by ABIN linked to in comment #2 asserts that reversible (“closed”) circuits can be simulated by log depth quantum circuits. Those circuits (and even constant depth quantum circuits) still allow quantum supremacy so the results are of different nature.

  31. Scott Says:

    Job #29: Again, we all knew from the beginning that you can’t scale to arbitrary sizes without error-correction. That’s not a surprise. That’s not new information. I said it on this blog to anyone who would listen.

    There’s hope that, in the near term, even without error-correction, it might be possible to do something that’s unequivocally a “quantum speedup,” by a factor of millions or billions (i.e., something clearly dependent on quantum mechanics, solving a well-defined problem, and not a close call). If so, then I think that would be of enormous scientific interest, if for no other reason than that it would refute Gil’s predictions above! 🙂 But everyone agrees that ultimately you need error-correction.

    An analogy: everyone agrees that, for sufficiently long human space missions, ultimately you’d need a way to grow your own food in space. But the fact that the Apollo program never even tackled the problem of growing food in space, doesn’t imply that it didn’t get to the moon.

  32. Job Says:

    An analogy: everyone agrees that, for sufficiently long human space missions, ultimately you’d need a way to grow your own food in space. But the fact that the Apollo program never even tackled the problem of growing food in space, doesn’t imply that it didn’t get to the moon.

    I understand that point and I agree.

    Here’s what seems absurd to me about demonstrating QS using random circuits.

    We already have access to quantum systems, and we already have evidence that they are difficult to simulate. Building a QC does not change that.

    It’s the fact that a QC is a programmable quantum system, which can be precisely manipulated, that’s ground breaking.

    Running random circuits has got to be the most tenuous demonstration of programmability that’s possible under the circumstances.

    It’s an important step along the way, but does it live up to the Quantum Supremacy hype? Not for me.

  33. Scott Says:

    Job #32: Sure, there are many quantum systems in nature that seem hard to simulate. But with none of them has there been a clean separation enforced between

    (1) the reasons why it’s hard to simulate that involve quantum computational complexity (i.e., interference among exp(n) amplitudes), and

    (2) all the other reasons why a physical system might be hard to simulate in practice, but which also apply to classical systems (large constant overheads, discretization and truncation errors, lack of full knowledge of the Hamiltonian…)

    With none of these systems would it make much sense to say that they’d failed to solve the problem they were given: by definition, every physical system succeeds perfectly at simulating itself! But in my book, a thing that can never output a wrong answer, even in principle, is not a “computer.” 🙂

    Nor, with these systems, has one had any obvious way to generate a huge collection of instances of arbitrary sizes. Nor has the assumed hardness of simulating these systems been based on clear hardness conjectures in theoretical computer science: it’s always been strictly empirical.

    The point of using a small-scale programmable QC for quantum supremacy experiments is that, if it works, it should address every one of the issues above.

  34. fred Says:

    Great updates, Scott! Thanks!

    – Regarding the elections, I guess people had to weigh voting Dem as a rebuke of Trump (through punishing the GOP) vs voting for the right local person, regardless of their party.

    – When it comes to QC, what seems clear is that it’s all going to be about the practicality of error correction?
    For example, I’ve read a few papers using von Neumann’s analysis of error correction for classical computers as the basis for the feasibility of error correction for QC, but the thing is that von Neumann’s scheme was never implemented in practice, or even needed (because individual transistors are near perfect).

    – A totally off-topic question for physicists/cosmologists:
    regarding the edge of our observable universe (i.e. the future light cone since the big bang, centered on us) and the apparent accelerating expansion rate of the universe: the further a galaxy is from us, the faster it’s drifting away, eventually exceeding the speed of light and leaving our future light cone… so that there will be a day when the only matter we see in our observable universe will be our own galaxy…
    Isn’t this suggesting that the edge of our observable universe is very much like a black hole event horizon? And that the acceleration of distant objects could be caused by that black hole gravity? (it’s an inverted black hole… a bit like if you draw a circle on 2D sphere, it’s a circle whether a point is inside it or outside it).
    If that’s true, we should see galaxies that cross our light cone as “frozen in time” (just like it happens for objects crossing an event horizon).
    Running time backwards, it would make it look like we’re at the center of a black hole, with everything coming down from the outside, crossing its event horizon, and converging in to the singularity that is the big bang.

  35. Vincent Says:

    Scott #21: yes I’m all in favor of moderation queues, it is just that I liked the old system, where the site would tell me that my comment was in the moderation queue, better than the new system where the comment just disappears (and then re-appears after it is out of moderation). But from your comment 21 I infer that this is not a change you made consciously. Perhaps it is some WordPress-wide phenomenon?

  36. fred Says:

    About mass shootings, I’m getting the impression that this has now clearly become a “fact of life” in the public American psyche.
    One can argue that since the Sandy Hook Massacre didn’t make a serious dent, nothing else will (what’s more horrible than killing 20 seven-year-olds?)

    Five days after the Las Vegas shooting, no one talked about it; very little attention was given to the victims. (Although a recent edition of “60 Minutes” had a piece about what some of the victims are going through)… and it seems that the conclusion was that the only practical way to deal with those mass shootings is to make “trauma kits” (with tourniquets to stop heavy bleeding from wounds caused by assault rifle bullets) as available as possible.

    It’s really like traffic accidents – ~100 people are killed every day in the US (37,461 deaths in 2016).
    It’s never in the news unless some bus crashes and some 20 people die at once. Then it’s newsworthy. Society has just blocked this out of its mind.

    Deaths by guns are at the same level (33,636 deaths in 2013, including gun suicides).
    They also only register in the news when there are more than 10 victims at once.
    So I think people are just getting fatalistic about it, just like for traffic accidents.
    Of course mass shootings in particular have a psychological aspect, often targeting specific victims (kids, whites, non-whites, Jews, Catholics, Muslims, non-Muslims… not true for Las Vegas)… so that aspect tends to “even out” across all instances (not for the victims, but for society in general… the insanity is across the board).

  37. fred Says:

    Scott #26

    “heroic engineering efforts like the ones now being undertaken are actually needed if you want near-term quantum supremacy.”

    I’ve been thinking a lot about that “heroic” part.

    Specifically, I was wondering if we were in the same situation in the early days of computing, or possibly even worse (from “this is a long shot” point of view).

    Like, Turing and co had to deliver a working computer to break the Nazi codes. That’s one type of pressure and heroism. They had no time to worry about scalability beyond the task at hand I guess.

    After that, when things quiet down (ww2 was over), I’m trying to picture what it was like to work with the ENIAC and its 20,000 vacuum tubes: “Several tubes burned out almost every day, leaving ENIAC nonfunctional about half the time”.
    One wonders whether the engineers dealing with those failures were really that optimistic that things could eventually be scaled up.

    But it’s the fact that those issues have been solved that’s driving the engineers working on QC… even though the issues seem qualitatively very different.

  38. Vincent Says:

    Scott #19: Thanks, that is a nice clarification. Inspired by your comment I looked up the original Quantum PCA paper and it seems they do indeed claim an exponential speedup but only for a very specific problem. All in all their claim rings close to the phrase “something clearly dependent on quantum mechanics, solving a well-defined problem, and not a close call” taken from your (unrelated) comment 31. Except that now, with the new classical algorithm of Ewin Tang, it has become a close call after all. Thanks, I think that this was the perspective I was missing. Also, for my own PCAs I will, for the time being, stick to the older classical algorithms and await further developments.

  39. John K Clark Says:

    I agree that given the disastrous nature of the Trump presidency the election should have gone better, but on the other hand the democrats wrested 7 governorships from the republicans, captured 323 Republican-held seats in state legislative bodies, and gained control of the House of Representatives in the largest democratic increase since 1976. The only setback, and it’s a big one, is the democrats lost at least 2 Senate seats and probably 3 or 4; but that’s only because of our screwy way of picking senators. If you add up all the votes in senatorial elections democratic candidates got 11.8 million more votes than the republican candidates.

    John K Clark

  40. fred Says:

    Clark #39

    “but that’s only because of our screwy way of picking senators. If you add up all the votes in senatorial elections democratic candidates got 11.8 million more votes than the republican candidates”

    There’s no reason why this couldn’t also work in the favor of democrats though (of course whoever is in charge will redraw voting boundaries in their favor?).
    That’s just the way things work in a federal system. It’s a trade-off.

  41. Scott Says:

    fred #40: Actually, the fact that every state gets two Senators systematically favors low-population states and hence rural voters. And in some sense, it was explicitly designed that way from the beginning: the slave states were worried that if the representation were proportional, they’d be dominated by free states like New York. The compromise—a rather messy and complicated one, certainly in hindsight!—was also to have a House of Representatives.

  42. Sniffnoy Says:

    fred #40:

    There’s no reason why this couldn’t also work in the favor of democrats though (of course whoever is in charge will redraw voting boundaries in their favor?).

    Um, you can’t just redraw the boundaries of the Senate, dude…

    (Also ideally, for the House, where the districts can be redrawn, this would be done in a neutral way — say, by a pre-agreed-upon algorithm — and not be within human control to be gerrymandered at all. That it can be biased towards either side rather than just one is hardly a recommendation!)

    That’s just the way things work in a federal system. It’s a trade-off.

    It’s actually not particularly clear that federalism requires this in any way. Nothing about federalism requires any of:
    1. Human-drawn district boundaries (for the House)
    2. For that matter, the use of a voting method that has the vote-splitting problem, thereby driving the country towards a two-party equilibrium where gerrymandering is easily accomplished and seems like a natural thing to do
    3. The existence of a Senate, with boundaries based on states and no adjustment for population
    4. That states should elect their House members via districts, rather than general ticket (which is not a constitutional requirement, just federal law as of 1967; prior to that general ticket was used — although personally I am wary of general ticket for its dependence on political parties, and would prefer one of the prop-rep system that doesn’t rely on parties)
    5. That House members should be associated to states at all (with the guarantee of at least one per state distorting the proportionality), rather than dividing up districts independently of states (or using a nationwide general ticket, although, again, I’m wary of that)
    6. Again, the use of a voting method that isn’t prop-rep and so makes gerrymandering a concern at all

    So, suppose we abolished the Senate and the idea of congressional districts, and elected the House via a nationwide prop-rep general ticket method (that would probably rely on parties, ugh, but the point of this example isn’t to be what I want). What, then, would remain of federalism? Answer: Basically all of it! What makes it federal is how the states make their own laws and set up their own government as they please, and that the federal government is limited in what it can tell them to do and how it can restrict them; what makes it federal isn’t how the federal government’s legislators are selected in a state-based manner.

  43. Sniffnoy Says:

    Scott #41: From the Northern point of view, of course, the compromise was to also have a Senate, and (let’s not forget the other compromise) to agree, rather than discounting slaves entirely from House allotment (despite their inability to vote), to just count them less. Giving the slave-owners both a method that isn’t based on population at all, and a method that effectively counts their slaves partially towards themselves.

  44. James Vogan Says:

    fred @37, re vacuum-tube computers, just as a piece of anecdotal historical information, in the 1960’s, my department at GE was doing a huge amount of computing on an IBM 704 mainframe which used water-cooled vacuum tubes and I don’t recall ever hearing about it being off-line during working hours. (We typed programs and data onto punched cards with key-punch machines and sent them to the mainframe via hourly couriers, who brought back newspaper-sized printouts, rolled up in plastic bags. I used to arrive early in the morning and use a shopping cart to bring all the printouts from the drop-off site back to my office and distribute them among the cubicles. Great times.)

    As a more general note, engineered devices start simple and evolve to more complex. The best example I saw of this was in an oil painting at Cooper Energy Services, who invented probably the first farm tractor in 1854. They built steam-engines which were dragged on wagons, along with a wagon full of coal, by about 16 draft horses, out to fields to drive thresher machines. Some engineer had the idea to add a bevel gear between the drive shaft of the steam engine and the rear axle of the wagon, and the oil painting proudly showed the result: only two draft horses were now needed at the front, for steering. Steering wheels came later. As did water-cooling for vacuum tubes.

  45. fred Says:

    Scott #41.

    It might favor low-population states, but in practice it still seems that the Dems held the House more often than the GOP… 49/92? Still pretty balanced.

    https://history.house.gov/Institution/Party-Divisions/Party-Divisions/

  46. fred Says:

    One thing to add:
    the US is basically a two-party system… so ideally you really want something like an average 50/50 alternation of the parties. Just because the Dems have a few million more voters on average, would you really want the Dems to be in power all the time?! Might as well be like China then…

    If the country had 10 different parties, that’d be a different story, you would want some sort of more proportional representation of each party.

  47. Scott Says:

    fred #46: I’m no one’s idea of a doctrinaire leftist. I’d be very happy if the US had a principled conservative, libertarian, or classical liberal party that could serve as a check on the Democrats and that was actually viable in elections. If such a party existed, I’d probably vote for some of its candidates sometimes. But the existing Republican party? It should be burned to a crisp.

  48. Sniffnoy Says:

    fred #45:

    It’s the Senate that was being discussed as favoring low-population states, not the House! I mean, the House obviously does to some extent as well, but it’s a much smaller effect and not what was being talked about. Stop getting these mixed up!

    (And note that when we’re talking about the whole North-vs-South thing as the constitution was framed, the key point about the South wasn’t so much that it was low in population, so much as that it was low in free population…)

    But also, as for your historical comparison… what time period are you looking at here? You can’t just look at today’s party divisions and transpose those to the past as if things were the same then. It’s true that right now there’s a substantial divide of urban Democrats vs rural Republicans, but things weren’t always split up that way. The parties weren’t even always so ideological — for a long time they were more geographic than anything. And when they were ideological those ideologies were not always the same as today. Remember, the Democrats were the party of the South for a long time. “Republican” and “low-population state” were not always things that went together.

    (But seriously, man, you might want to start by getting straight what house of Congress is being discussed.)

  49. John K Clark Says:

    Unlike the Senate, the elections in the House are supposed to correspond with the popular vote, but they don’t. In 2014 in House elections the democrats got 2,456,929 more votes than the republicans but the republicans gained 8 House seats and the democrats lost 8 seats. The cause of this is republican gerrymandering: voters are supposed to choose their representatives, but instead representatives choose their voters.

    John K Clark

  50. John Stricker Says:

    Vincent #35

    Seconded.

  51. Job Says:

    The point of using a small-scale programmable QC for quantum supremacy experiments is that, if it works, it should address every one of the issues above.

    There’s also this implied requirement that the QS experiment outperform even the largest classical computers available, brute force or not.

    On the surface it seems reasonable to demand that, but it stretches the experiment to such an extent that the outcome becomes way too statistical. And for what?

    With 52 qubits you could run Simon’s algorithm on 25-bit functions. It tests large-scale interference and the results can be conclusively verified. If a machine can do that, it’s a quantum computer.

    You can even stick a random classical circuit in front of the input function, it will still work.

    But I guess 2^25 is easily within reach of classical computers, so somehow it’s not compelling even though asymptotically the classical algorithm is provably much slower?

    I don’t understand that.

  52. Michael Says:

    @Scott #41 - No, the original purpose of the Senate was to favor small states, not slave states. Wikipedia has a map of the 1792 election:
    https://en.wikipedia.org/wiki/United_States_presidential_election,_1792
    The number of electors roughly corresponds to the population of the states. Both the North and the South have small and large states- the North has New York with 12 and Pennsylvania with 15, the South has Virginia with 21 and North Carolina with 12. And similarly, when it comes to small states, the North has Vermont with 3 and the South has Georgia with 4.
    Nor was the Senate intended to favor rural states. As late as 1800, only 6% of the population of the United States was urban:
    https://www.seniorliving.org/history/1800-1990-changes-urbanrural-us-population/
    The Senate was originally intended to favor small states- it was needed to get the smaller states to sign on to the Constitution. There WERE parts of the Constitution that favored the slave states- the Three Fifths Compromise, for example- but the Senate wasn’t one of them.

  53. Sniffnoy Says:

    Michael #52:

    (Your comment wasn’t in reply to me, but I also made this mistaken claim, so…)

    It seems you’re right. I thought I remembered reading something arguing that despite conventional wisdom it was actually about slavery, but, um, it seems I just completely misremembered and that reference does not exist. Looks like the conventional wisdom is correct here. My mistake.

  54. Sniffnoy Says:

    Oh! I think I know what mistake I was making. I was getting the senate confused with the electoral college. It’s commonly said that the electoral college is about helping small states, but in reality it actually had more to do with helping slave states. The senate, though, really was about the small states, contrary to what I said above. Oops.

  55. Jelmer Renema Says:

    fred 46: that’s not how politics works. The optimal positioning in a two-party system is epsilon to the left and right of wherever the mean voter is. So if you change the rules to shift that center, the parties will just shift along with it.

  56. Andrei Says:

    Who is/was David J. Bruton? I see that you are David J. Bruton Centennial Professor of Computer Science, but why is a professorship named after that person? I first thought he was some sort of Knuth-level computer scientist I had never heard of somehow, but the only thing I can find on Google is references to you.

  57. Scott Says:

    Andrei #56: David J. Bruton made a substantial donation to UT (isn’t that the most common way to get your name on a professorship?). I believe there are several other David J. Bruton Professors on campus.

  58. Scott Says:

    Slave-vs-small-staters: I stand corrected, thanks!

  59. the problem of the gatkeepers Says:

    On #10, obviously you and the American Jewish community have my deepest sympathies. Antisemitism, and in particular murderous Antisemitism, has no place in my worldview and has no place in a civilized society. I must note that the killer in the Pittsburgh massacre was a Trump hater and that Trump has a daughter who converted to Judaism because of her husband and gave Trump three Jewish grandchildren. Trump has also been the best friend Israel has had in the White House in decades, finally having the courage to move the US embassy from Tel Aviv to Jerusalem. I am not sure why the massacre should be blamed on Cruz or Republican politicians when you have the Democratic establishment that has not fully distanced itself from avowed Jewish hater Louis Farrakhan.

    On #1, I think that Democrats overplayed their hand with Kavanaugh. Until then, the midterms of 2018 were headed toward being a classic “the party not holding the White House mobilizing its base to show their opposition at the polls”. The treatment Kavanaugh received -that we now know included two women who are on record acknowledging that they made up the charges- prompted many Republicans who were not particularly excited about the midterms to vote. The result is the midterm election with the highest voter turnout since 1966: a mixture of pissed off base Democrats (that was expected) and pissed off base Republicans (many of whom, including yours truly, would not have come out to vote had it not been for the vicious treatment Kavanaugh received). I also find it very ironic that many Democrats were celebrating the historic expected turnout in 2018 before the night of the election, expecting that the high participation was going to be a “blue wave”, and instead got pissed off Republicans who got out to vote to show their displeasure with “the Resistance”. These Democrats would be well served by watching outlets different than CNN. Even MSNBC is fairer these days than the CNN echo chamber.

  60. Douglas Knight Says:

    Sniffnoy #48:

    the key point about the South wasn’t so much that it was low in population, so much as that it was low in free population

    You seem to have retracted your comments based on this factual claim in response to evidence that doesn’t actually contradict it (although some that renders it moot). But the claim is false. In the 1790 census, Virginia had the largest free population (although below MA+ME).

    54, In what sense does the electoral college help slave states? It’s just a compromise between the House and the Senate. Those institutions do what they do, but the College doesn’t add anything to it. [I assume you mean electoral votes, rather than Electors. I wish people would say which aspect they mean, rather than complaining about the EC in toto.]

  61. Andrei Says:

    Scott #57: Thank you for your quick answer!

  62. fred Says:

    Sniffnoy,

    I’m aware of the differences between the House and the Senate.

    But my point about an ideal 50/50 alternation in a two party system holds for the Senate as well.

    https://en.wikipedia.org/wiki/Party_divisions_of_United_States_Congresses#Partisan_control_of_Congress

    Party        Senate   House   Presidency
    Democratic   50       57      44
    Republican   42       36      45

    I think that this 50/50 alternation is not so much a result of the specific rules as a reflection of history being cyclical: we have generational resets, and the results aren’t all that different based on who’s in power.

  63. Peter Michael Gerdes Says:

    But why today? Surely we shouldn’t have substantially raised our prob that a given synagogue would be shot up any day.

    Uggh, I really hated that aspect of living in Israel. Not particularly the security (it’s rare outside airports) but the willingness of everyone to just come up and challenge your right to be somewhere. I kinda liked the willingness to butt in and offer unsolicited advice, but when it’s realized as a challenge to your belonging, it’s the ugly part.

    Of course native Israelis didn’t even seem to blink when some religious zealot yelled at them on the street for a skirt above the knee, so this is probably just my cultural background speaking up.

  64. Sniffnoy Says:

    Douglas Knight #60:

    You seem to have retracted your comments based on this factual claim in response to evidence that doesn’t actually contradict it (although some that renders it moot). But the claim is false. In the 1790 census, Virginia had the largest free population (although below MA+ME).

    Virginia was largest, yes, but in total the South was smaller by a substantial margin.

    54, In what sense does the electoral college help slave states? It’s just a compromise between the House and the Senate. Those institutions do what they do, but the College doesn’t add anything to it. [I assume you mean electoral votes, rather than Electors. I wish people would say which aspect they mean, rather than complaining about the EC in toto.]

    Well, so the first question here is, what are we contrasting it to? We have to look at what other options were considered at the time — one of which was direct popular vote. Why was it picked over direct popular vote? Because it was thought the slave states wouldn’t go for that. But with the electoral college, the 3/5 compromise not only counted towards how many representatives they got in the house, but also towards how many electoral votes they got (since these are the same number, plus or minus two).

    If you’re contrasting it to having one of the houses of Congress elect the president, then yes, it looks like a compromise between them — but the reason they didn’t go with having Congress elect the president doesn’t seem to have much to do with which states would win or lose in such an arrangement. In any case, like I said, I was contrasting against a popular vote.

  65. Douglas Knight Says:

    Sure, they considered direct vote, but that’s not representative of what they considered. Also, as I understand it, direct popular vote was rejected in favor of an electoral college (of as yet undecided weight and provenance) not in the interest of slave states, but mainly because it was unwieldy. How do you get a list of candidates to vote among? FPTP is an invitation to political parties, although they didn’t see it coming. Instead they were afraid of deadlock between local candidates. A College allowed people to get together in one room and negotiate.

  66. David Speyer Says:

    @Sniffnoy “one of the prop-rep system that doesn’t rely on parties” Reference for these? I’ve heard of liquid democracy, but that sounds too complex for any normal population of people. I thought that prop rep otherwise always needed a body in charge of creating party lists.

  67. David Speyer Says:

    @fred “Just because the dems have a few millions more voters on average, would you really want the Dems to be in power all the time?!”

    Both parties have strong incentives to appeal to the voter who will tip control one way or the other — the median in simple proportional representation, or the median voter of the median state in our current system. If the rules of representation were changed, I am confident that the Republican party would still win 50% of the elections, it would just shift its positions to the left to do so.

  68. Douglas Knight Says:

    David 66,

    Appropriate systems for electing multiple representatives per district have most of the advantages attributed to proportional representation. The key is that once a faction uses their votes to elect one representative, their votes are used up and so a majority can’t control all the seats. This generally requires ranked choice to avoid wasting votes. The party simplifies the process of creating the ranking. I wouldn’t call it proportional representation, because what is proportional to what? I guess representation is proportional to support of hypothetical informal factions, proto-parties. You probably have experience with such a system: Cambridge City Council, which does use the term proportional representation.
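
    For concreteness, here is a toy sketch of the kind of count being described: a simplified single-transferable-vote tally with fractional surplus transfers. The quota, tie-breaking, and elimination rules are deliberately crude, and this is not Cambridge’s actual procedure; it just shows how a faction’s ballots get “used up” once they elect someone.

        from collections import defaultdict

        def stv(ballots, seats):
            # ballots: list of (weight, [candidates ranked best-first]); returns winners.
            # Toy single-transferable-vote count with fractional (Gregory-style) surplus transfer.
            votes = [[float(w), list(prefs)] for w, prefs in ballots]
            quota = sum(w for w, _ in votes) / (seats + 1)          # continuous Droop quota
            elected, eliminated = [], set()

            def top(prefs):
                # a ballot counts for its highest-ranked candidate still in the running
                return next((c for c in prefs if c not in elected and c not in eliminated), None)

            while len(elected) < seats:
                tallies = defaultdict(float)
                for w, prefs in votes:
                    c = top(prefs)
                    if c is not None:
                        tallies[c] += w
                if not tallies:
                    break                                           # no continuing candidates left
                leader = max(tallies, key=tallies.get)
                if tallies[leader] > quota:
                    # the leader is elected; the ballots that elected them keep only the surplus
                    # fraction, so one cohesive faction can't also sweep the remaining seats
                    keep = (tallies[leader] - quota) / tallies[leader]
                    for v in votes:
                        if top(v[1]) == leader:
                            v[0] *= keep
                    elected.append(leader)
                else:
                    eliminated.add(min(tallies, key=tallies.get))   # nobody reached quota: drop the weakest
            return elected

        # A 60-vote faction and a 40-vote faction competing for two seats: the larger faction
        # elects A1, its ballots are mostly used up, and B1 (not A2) takes the second seat.
        print(stv([(60, ["A1", "A2", "B1"]), (40, ["B1", "A1"])], 2))   # ['A1', 'B1']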

  69. Jr Says:

    I have been to synagogues in Europe as a tourist or on a field trip with school. They have all had pretty tight security, and I think almost all synagogues in Europe do. It is a symptom of our times that the US, which seemed to have at least mostly moved past antisemitism, now also has guards at synagogues.

  70. Andrew Krause Says:

    Unrelated to your list above, but vaguely related to “Why haven’t you blogged about…?”: I was wondering whether there have been any updates regarding the Coffee Automaton paper? I remember reading some time ago that there were some technical issues that needed further consideration, but the story of the paper has definitely inspired a bit of my own thinking.

    Anyway, I hope you are well. I deeply appreciate the efforts you take to share insights and experiences on your blog. 🙂

  71. Jelmer Renema Says:

    The assumption underlying this discussion (namely that political parties are bad) seems to me unjustified. Historically, mass parties are the best way for disadvantaged people (e.g. the newly enfranchised) to engage with the political process. The assumption that political parties must be avoided is peculiarly American and seems to me to be an artifact of your constitutional history.

  72. Sniffnoy Says:

    Hoo, OK, trying to get back to some of this quickly:

    David Speyer #66:

    Douglas Knight has already basically answered this. I think “proportional representation” in this more general setting means something like: it degenerates to proportional representation when people do vote party-line. As mentioned, STV is one such method (bad as it is for single-winner elections, it seems to be OK for this?); I think there are others.

    Douglas Knight #65:

    Sure, they considered direct vote, but that’s not representative of what they considered.

    Perhaps so, but my point is that, among the alternatives, a key reason it was rejected was concern that slave states wouldn’t go for it.

    As for why I say that, well, I’ll just pull from Wikipedia here, which quotes Madison:

    There was one difficulty however of a serious nature attending an immediate choice by the people. The right of suffrage was much more diffusive in the Northern than the Southern States; and the latter could have no influence in the election on the score of Negroes. The substitution of electors obviated this difficulty and seemed on the whole to be liable to the fewest objections.

    Jelmer Renema #71:

    An artifact of our constitutional history? It’s older than the constitution. There’s a relation, in that it comes out of the same philosophy that produced the constitution, but it’s not a causal one like you say.

    I’ll skip the actual argument for this position as it would basically be the standard one. 😛 (I do agree that no parties at all is not a realistic goal and that parties are useful enough that you don’t want to eliminate them entirely… so long as they’re, y’know, only powerful enough to get done the useful parts while minimizing the bad parts. Ideally they’re numerous and competitive, as opposed to the duopoly we have now.)

  73. Sniffnoy Says:

    Actually, although I called parties a necessary evil above, maybe you could get rid of them in a delegative democracy? Probably not, but…

  74. Peter David Jones Says:

    Any thoughts on Mikhail Dyakonov’s dismissal of your whole field?

    https://blog.adafruit.com/2018/11/15/the-case-against-quantum-computing-by-mikhail-dyakonov-ieeespectrum/

  75. Scott Says:

    Peter #74: You can see my 2013 responses to Dyakonov here. Is there any new argument in this latest piece, which would require a new response?

    He’s right that the applications of QC have often been grotesquely overhyped in the press (but any reader of this blog surely knew that). He’s right that building a scalable QC is unbelievably hard. He’s wrong in almost everything he says about fault-tolerance, with the most egregious howler being that no one has any idea how to account for imprecision in preparing the initial states and applying the gates. He apparently still doesn’t understand that the fault-tolerance theorem handles such imprecisions in exactly the same way it handles environmental decoherence—it all gets folded into the same error budget—and the implication that no one in QC would’ve noticed that gates are imperfect is laughable.
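
    For a rough quantitative sense of that single error budget, here is a minimal back-of-the-envelope sketch using the standard concatenated-code estimate, in which a total per-gate error rate p (decoherence plus control imprecision, lumped together) gives a logical error rate of roughly p_th·(p/p_th)^(2^L) after L levels of concatenation. The numbers are purely illustrative, not taken from any particular experiment.

        # Assumed, illustrative numbers; only the doubly-exponential suppression is the point.
        p_th = 1e-2   # threshold error rate
        p    = 1e-3   # total physical error per gate: decoherence + gate imprecision, one budget
        for L in range(4):
            logical = p_th * (p / p_th) ** (2 ** L)
            print(f"concatenation level {L}: logical error per gate ~ {logical:.1e}")
        # prints roughly 1.0e-03, 1.0e-04, 1.0e-06, 1.0e-10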

    (Also, he’s badly out of date when he says the best current QC experiments are with 5 qubits: has he not heard that IBM has a 20-qubit device currently available for public use? That Google, IonQ, and others have likewise gotten good results with 20+ qubit experiments? What about the 50+ qubit experiments of Misha Lukin and others with optical lattices?)

    Truthfully, though, in the years before quantum computational supremacy first gets convincingly demonstrated, I’m happy to have as many Dyakonovs as possible confidently predicting that it’s impossible. What I worry about more right now, is all the people who will say it was completely obvious all along that quantum supremacy was possible, it’s just 1926 QM, so what was the point of doing it? 🙂

  76. Jim Graber Says:

    So I was also going to ask you about Dyakonov’s latest. In particular about the paragraph about Intel, IBM and Google (49, 50, 72 qubit devices). Aren’t they quite overdue by now?
    A year or two ago, I was very optimistic due to the successes of the 19 and 20 qubit devices. Now I am getting more and more pessimistic due to the delayed success of these bigger devices. Your take? TIA

  77. Scott Says:

    Jim #76: It’s unfortunate when the public’s information about quantum computing comes from, on the one hand, breathless popular articles that uncritically repeat companies’ PR, and on the other hand, skeptics who can’t be bothered to learn the basic facts about fault-tolerance or about experiments that have already been done. There’s a truth of the matter, which neither of those sources is going to tell you. As far as I can tell, the truth is:

    (1) Getting 50-70 superconducting or trapped-ion qubits to work well together is damn hard. And the experimenters, being human, do indeed tend to be systematically overoptimistic about how long it will take—just like the planners of just about every other science and engineering and construction project on the planet.

    (2) Contra Dyakonov, absolutely nothing has been discovered to suggest that it’s impossible. You can laugh about delays and missed milestones, just like people in the 1890s laughed at the delays in getting heavier-than-air flight, and people in the 1990s laughed at the repeated failures to make a commercially viable version of what we now call the smartphone. But absent some deep new scientific discovery that explains why the technology (or the existing approaches to it) can’t work, this is only a temporarily safe position, right until it’s no longer right.

  78. Sniffnoy Says:

    Scott, your site’s SSL certificate expired again…

  79. John Stricker Says:

    @Sniffnoy #73

    It would seem you are in esteemed company:

    lawsandsausagescomic.com/comic/701

    A most excellent webcomic on the US political system and its history, by Zach Weinersmith (of Saturday Morning Breakfast Cereal fame) and his brother Greg Weiner, a political science professor.

  80. Scott Says:

    Sniffnoy #78: Indeed! I just called Bluehost tech support and got them to fix it. Alas, they say the problem is going to recur in May of 2019, and there’s nothing I can realistically do about it now (that is, nothing that doesn’t involve steps that I have not the slightest clue how to take). All I can do is place another desperate call to tech support then. I’ve been in this hell, with this Sword of SSL Damocles over my head, ever since I accepted many readers’ urgings to set up the SSL support in the first place.

  81. John Sidles Says:

    Scott asserts (circa #77) “Absolutely nothing has been discovered [recently] to suggest that it [scalable quantum computing] is impossible.”

    Fresh support for Scott’s assertion is provided by this week’s article “Quantum Error Correction Decoheres Noise”, by Beale, Wallman, Gutierrez, Brown, and Laflamme (Phys. Rev. Lett. and arXiv:1805.08802 [quant-ph]). The starting point of their analysis is:

    Most studies of the performance of quantum error-correcting codes only consider probabilistic Pauli errors because they are easy to simulate … However, in real systems, it is likely that other noise will also be present. … Determining the performance of an error-correcting code at the logical level under general noise is complicated because such noise is harder to simulate.

    Here “general noise” encompasses the entire class of physical mechanisms that are generically associated to quantum electrodynamical superradiance/subradiance (which is a very large class indeed). The authors assert:

    In this paper, we have shown that for generic local noise, coherent errors are decohered by syndrome measurements in error correcting stabilizer codes.

    In a nutshell — and subject to my own potentially wrong understanding of this article — successive rounds of quantum error correction serve two purposes: (1) decoherent errors are corrected, and (2) coherent errors are converted to decoherent errors … with the latter errors being correctable by subsequent rounds of error-correction.
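
    A tiny single-qubit sanity check of purpose (2), my own illustration rather than the authors’ construction, is the standard Pauli-twirl picture: averaging a small coherent Z-rotation over conjugation by the Paulis leaves a purely stochastic (decohered) Z error, which is roughly the effect that repeated syndrome extraction is claimed to have.

        import numpy as np

        I = np.eye(2, dtype=complex)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)

        theta = 0.3                                             # small coherent over-rotation angle
        U = np.cos(theta / 2) * I - 1j * np.sin(theta / 2) * Z  # exp(-i*theta*Z/2)

        def coherent(rho):
            return U @ rho @ U.conj().T

        def twirled(rho):
            # average the coherent error over conjugation by the Pauli group
            return sum(P @ coherent(P @ rho @ P) @ P for P in (I, X, Y, Z)) / 4

        rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|, maximally sensitive to Z errors
        stochastic_Z = np.cos(theta / 2) ** 2 * rho + np.sin(theta / 2) ** 2 * (Z @ rho @ Z)
        print(np.allclose(twirled(rho), stochastic_Z))           # True: coherent error -> stochastic Z error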

    For me, the idea of dual-purpose quantum error-correction was both entirely new and absolutely essential to the (eventual) practical feasibility of scalable quantum computing … and so I regard this article’s ideas as being absolutely seminal and wonderful.

    On the other hand, superradiant electrodynamic noise is sufficiently general and sufficiently subtle — and in particular sufficiently nonlocal — that the article’s conclusions in regard to the error-correction of “generic local noise” may not suffice for the error-correction of (real-world) “generic non-local noise” (e.g., Hilbert-space analogs of rowhammer).

    In view of these considerations, it is mathematically plausible — and even physically likely (in my view) — that the skeptical predictions of Gil Kalai’s ICM 2018 Plenary Lecture “Noise stability, noise sensitivity and the quantum computer puzzle” will prove, in the long run, to be well-grounded in superradiant quantum electrodynamics … despite this week’s ingenious new insights relating to the dual purposes of quantum error-correction.

  82. Sniffnoy Says:

    John Stricker #79:

    I’m confused as to why you are grouping me together with Weiner (or in this case Azari) there? I don’t seem to be agreeing much there. Well — perhaps agreeing on the facts, but not on the emphasis of those facts. 😛 I’m saying that parties are basically bad but I guess can’t be done away with entirely, while Azari is saying that parties are pretty helpful but have their bad points. Well, OK, maybe that’s not so much of a disagreement! But Azari doesn’t mention a number of what I consider to be key downsides to parties. In particular, I disagree with her assessment of “partisanship” (in the particular sense that she defines it) as the key problem. As to just what those problems are, well, I don’t want to start on about that here. But suffice it to say that I think the problems with parties are more extensive than she describes, and when she says “It’s this sort of thing that people tend to object to when they object to political parties,” I say, no, I object to much more than that!

    Well, OK, I will say one more thing. Azari seems to implicitly contrast parties against a notional state of unity and agreement. E.g., she states, “Parties formed pretty quickly … because people had genuine disagreements”. But I’m not contrasting parties to agreement and unity — that sounds worse! You want (effective) one-party rule? I don’t. I’m contrasting parties to individualism. I say, think for yourself! Obviously, parties form because people agree on a particular thing — duh! And there’s nothing wrong with agreement, but there is with turning that agreement into ossification, which is what parties do.

    OK, fine, make that two more things. I also think Azari’s implicitly conflating multiple levels here. What we want is individualism and disagreement on the object level issues, but respect and good faith as the meta-level environment — everyone A. thinking for themselves, and B. arguing honestly and assuming good faith. I’m not sure Azari is observing this distinction between these levels.

  83. fred Says:

    Scott #75
    “how to account for imprecision in preparing the initial states and applying the gates. He apparently still doesn’t understand that the fault-tolerance theorem handles such imprecisions in exactly the same way it handles environmental decoherence—it all gets folded into the same error budget”

    Those imprecisions you’re talking about: if not handled, would they cause imperfect cancellation of all the wrong/unwanted solutions in Shor’s algorithm?

    But those unwanted solutions grow exponentially, right?
    It seems amazing that fault-tolerance that’s not exponentially expensive could take care of this (because the original fault-tolerance for classical computers didn’t have to deal with the exponential explosion of the quantum state space, no?).

  84. Gil Kalai Says:

    Scott (#31, #77) and John Sidles (#81).

    Thanks for the kind mention of my work, John. First, a point of very elementary logic. Instead of

    “You can laugh about delays and missed milestones, … . But absent some deep new scientific discovery that explains why the technology (or the existing approaches to it) can’t work, this is only a temporarily safe position, right until it’s no longer right. ”

    I think it makes more sense to say at the end “… this is only a temporarily safe position, right until it’s no longer right, or until some deep new scientific discovery that explains why the technology (or the existing approaches to it) can’t work is made”

    (Maybe this was what Scott meant all along.)

    As for deep discoveries with the potential to explain why QC cannot work: one such possible discovery is quantum error correction itself. (Not brand new, but relatively new.) Here, think about the analogy with non-deterministic computation, and imagine a universe where, for a couple of decades after non-deterministic computation was discovered, people mainly tried to build efficient algorithms for non-deterministic computation, and only then realized, rather sadly, that non-deterministic computation is important because it cannot be efficiently achieved. Similarly, a discovery that local quantum systems do not support quantum error correction could be (while sad) quite important in various contexts.

    Another potentially relevant discovery is the notion of noise stability. A general law suggested by my study, and mentioned in the video that John linked to, is that noiseless (or almost noiseless) quantum states and evolutions (in “local physics”) are “noise stable” in the sense of my 1999 work with Benjamini and Schramm. I should mention that extending the scope of noise stability/noise sensitivity to various physical systems is by itself potentially a deep mathematical task, of independent interest.

    There are various potential insights and consequences from failure-in-principle of quantum fault-tolerance. (Even for limited or temporary principles.) Consequences for the fascinating quantum gadgets related to topological quantum computing are certainly of much interest. There is an analogy, which I would love to learn more about, between “Majorana states” in condensed matter physics (which are related to topological quantum computing) and Majorana fermions in high energy physics. (I should say that the analogy itself is somewhat controversial among experts.) Obstacles to demonstrating stable “Majorana states” in condensed matter physics of the kind required for topological quantum computing may have some relevance for understanding the role of Majorana fermions in high energy physics and, in particular, the hypothesis that neutrinos are (in a certain sense) Majorana fermions.

  85. John Stricker Says:

    @Sniffnoy #82

    By “esteemed company” I actually meant Thomas Jefferson, in the second comic panel! Which I realize I didn’t make clear at all, for which I apologize. I did find your further analysis very interesting, so maybe I should be unclear more often! Nah, that would be rude… 😉

    For further reference, the extended quote by Jefferson:

    “I never submitted the whole system of my opinions to the creed of any party of men whatever in religion, in philosophy, in politics, or in anything else where I was capable of thinking for myself. Such an addiction is the last degradation of a free and moral agent. If I could not go to heaven but with a party, I would not go there at all.”

    Which fits in very well with your points about individualism and thinking for oneself. The notion of ‘party’ in the quote seems to be more general than just political, anyway.

  86. fred Says:

    Is it just me, or does it seem that Scott is slowly getting less and less confident that quantum noise isn’t gonna ruin the QC cake?

    “every informed person knew (and knows) that getting the physical error low enough with 50-70 integrated qubits would be incredibly hard, and that there was no guarantee of success.”

    “He’s right that the applications of QC have often been grotesquely overhyped in the press (but any reader of this blog surely knew that). He’s right that building a scalable QC is unbelievably hard.”

    “Getting 50-70 superconducting or trapped-ion qubits to work well together is damn hard.”

    Ok, it’s gonna be incredibly, unbelievably, damn hard…

  87. Job Says:

    Here, think about the analogy with non-deterministic computation, and imagine a universe where, for a couple of decades after non-deterministic computation was discovered, people mainly tried to build efficient algorithms for non-deterministic computation, and only then realized, rather sadly, that non-deterministic computation is important because it cannot be efficiently achieved. Similarly, a discovery that local quantum systems do not support quantum error correction could be (while sad) quite important in various contexts.

    There is a difference though in that Quantum Computing does not fundamentally change anything.

    And the most we get from a QC is the ability to simulate the universe, which is something we largely take for granted anyway, outside of the quantum world.

    One problem I see with implementation-as-a-barrier to Quantum Computing is that those arguments don’t change if BQP = BPP.

    Basically, in a world where building an actual quantum computer is pointless, it’s still impossible to build one. It makes these objections completely orthogonal to quantum computing.

  88. John Sidles Says:

    Gil Kalai affirms (circa #84) “Extending the scope of noise stability/noise sensitivity to various physical systems is by itself potentially a deep mathematical task, of independent interest.”

    A pair of ICM 2018 lectures that jointly compose a student-friendly mathematical introduction to noise stability are Sanjeev Arora’s “The mathematics of machine learning and deep learning” (ICM Plenary Lecture #15), specifically Arora’s discussion of noise stability — starting at t=0h51m25s — which was followed by Gil Kalai’s above-mentioned “Noise stability, noise sensitivity and the quantum computer puzzle” (ICM Plenary Lecture #19).

    Shtetl Optimized readers who are fans of the Extended Church-Turing Thesis — these ECT fans definitely include me! — will be delighted that the summary conclusions of Sanjeev’s lecture extend naturally to the summary conclusions of Gil’s lecture. As Arora notes:

    We have little idea at an operational level of how humans think … in fact, machine learning is currently the best hope for learning how humans think …

    This is the hottest trend in academia: apply machine learning to discipline “X” …

    Machine learning speaks to age-old wonders and mysteries … it’s a new way of looking at the world, via complicated inexact descriptions that we embrace … a new frontier for knowledge and math …I am optimistic that deep-learning methods can be mathematically understood and/or simplified.

    As with machine learning, so too with the ECT … so much so, that Arora’s Plenary Lecture specifically draws attention (at time t=0h56m26s) to the overlapping mathematical framework of his research and Gil’s.

  89. Aula Says:

    Scott #80:

    All I can do is place another desperate call to tech support then.

    Why does the call have to be desperate? I mean, it’s not like the certificate is going to expire at some completely random moment which you can’t possibly know in advance. In fact, the current certificate will expire on February 17, 2019, so you could arrange to have it renewed on the 16th.

    Also, your link in comment #75 is broken.

  90. David Speyer Says:

    Even if Bluehost is incapable of making plans 15 months in advance, you could put a note on your Google calendar to call Bluehost on Feb 1, 2019.

  91. John Sidles Says:

    Gil Kalai observes (circa #84) “There are various potential insights and consequences from failure-in-principle of quantum fault-tolerance (even for limited or temporary principles). Consequences for the fascinating quantum gadgets related to topological quantum computing are certainly of much interest.”

    Gil’s observation is amply supported by ECT-focussed readings of the quantum literature, as follows.

    Quantum Heritage: Superradiant Memory, Sensing, and Imaging

    Three seminal articles from the 1950s “Golden Era” of quantum physics are (in chronological order): (1) “Coherence in spontaneous radiation processes” (Dicke, 1954, introduces the idea of “superradiance”), (2) “Radiation damping in magnetic resonance experiments” (Bloembergen and Pound, 1954, reinvents “superradiance”, however without using the word “super”), and (3) “Spin echo serial storage memory” (Anderson, Garwin, Hahn, Horton, Tucker, and Walker, 1955).

    The Garwin-led IBM effort to build a quantum memory failed utterly … but if they had operated this same device in an imaging mode, they would have invented MRI 25 years early!

    This appreciation informs our reading of articles like Barbara Terhal’s recent “Quantum supremacy, here we come” (Nature Physics, 2018), which is grounded in the physics that Terhal surveys in her outstandingly admirable monograph (as it seems to me) “Quantum error correction for quantum memories” (Rev. Mod. Phys., 2015).

    In a nutshell, this 1950s quantum literature teaches that modern-era, Kalai-predicted, ECT-grounded failures to demonstrate Quantum Supremacy can serve a valuable dual purpose, namely, fundamentally advancing technologies for quantum sensing and imaging.

    Quantum Achievements: Fault-Tolerant Metrology Triangles

    In regard to working examples of Gil’s “fascinating quantum gadgets related to topological quantum computing”, it’s tough to better the cutting-edge quantum technologies that are surveyed in (for example) Scherer and Camarota’s “Quantum metrology triangle experiments: a status review” (Measurement Science and Technology, 2012).

    In view of this week’s redefinition of the Système International base units, quantum metrology triangle (QMT) experiments are now formally redefined to be counting experiments, that is, experiments whose data are dimensionless ratios of cumulative phase cycles (or equivalently, frequency-ratios).

    Hmmm … so it’s mathematically natural (isn’t it?) to view proposals for fault-tolerant quantum computations also as proposals for QMT-type counting experiments, in which the fundamental objective is to measure, via error-corrective quantum protocols, the dimensionless ratio of successful gate-operations to total gate-operations. We can be confident that our scalable quantum memory/quantum computation technologies are working properly, if and only if this counting-ratio is precisely unity, for arbitrarily large numbers of gate-operations. Conversely of course, the same quantum errors that degrade QMT counting-precision also serve to frustrate demonstrations of Quantum Supremacy.

    In a nutshell, the ECT-compatible effort to classically simulate and experimentally demonstrate QMTs accurate to N-bit precision, at an entropy cost that is polynomial-in-N, is a natural intermediate milestone — a low-risk and technologically valuable milestone — relative to the less-certain, longer-term milestone of either demonstrating Quantum Supremacy or, alternatively, failing for well-understood  ECT-related reasons.

    Quantum Prospects: Counting on Entropy

    For ECT fans (like me), a particularly enthralling aspect of the modern Golden Era of quantum physics and engineering is the focus upon counting. In particular, any description of nature that includes a notion of entropy fundamentally involves the mathematical counting-of-states.

    In the 19th century, Boltzmann showed how to count-the-states of Newtonian-space dynamical systems. In the 20th century, von Neumann showed how to count-the-states of Hilbert-space dynamical systems. Now in the 21st century, we have the challenge of ECT-compatible state-counting on varietal dynamical systems … the mathematical challenge of learning how to do this quantum-varietal state-counting plays a central organizing role (as it seems to me) in making the present research-era “golden”.

    For student-friendly inspiration, the Introduction to Eisenbud and Harris’s textbook 3264 and All That: A Second Course in Algebraic Geometry is commended to Shtetl Optimized readers. And as a mathematical exercise, it’s fun to estimate how many of the ICM 2018 Plenary Lectures are entropically relevant to varietal state-counting … by my count, there are at least eleven such Plenary Lectures. This means, evidently, that even expert mathematicians, nowadays, aren’t sure what entropy-related/counting-related questions we should be asking.

  92. James Gallagher Says:

    What I worry about more right now, is all the people who will say it was completely obvious all along that quantum supremacy was possible, it’s just 1926 QM, so what was the point of doing it?

    Don’t worry Scott, even if that happens, you’ll be in good company, since Bohr and von Neumann thought the maser was impossible due to the Heisenberg uncertainty principle:

    Shortly after we built a second maser and showed that the frequency was indeed remarkably pure, I visited Denmark and saw Niels Bohr, the great physicist and pioneer in the development of quantum mechanics. As we were walking along the street together, he quite naturally asked what I was doing. I described the maser and its performance. “But that is not possible,” he exclaimed. I assured him it was. Similarly, at a cocktail party in Princeton, New Jersey, the Hungarian mathematician John von Neumann asked what I was working on. After I told him about the maser and the purity of its frequency, he declared, “That can’t be right!” But it was, I replied, and told him it was already demonstrated.

    http://churchandstate.org.uk/2011/06/how-the-laser-happened-adventures-of-a-scientist/

  93. Scott Says:

    Aula #89 and David Speyer #90: Indeed, the next time around I’ll try to head the problem off earlier. The difficulty is that, before the site actually goes down, according to the Bluehost Control Panel everything is fine and there’s nothing for me to do! So my guess was that tech support would see the same false thing.

  94. Joe Shipman Says:

    I see lots of remarks of the form “20 qubits achieved, 70 qubits hard” but I can’t relate to it in the way I could relate to a statement of the form “Shor’s algorithm can be implemented to factor N-bit integers TODAY”. Do we have actual performance stats for implementations of Shor’s algorithm for various N between, say, 6 and 60?

  95. JAMES GRABER Says:

    Hi Scott, another “what do you think of this?” type question:

    Is this surprising (to me) result correct?

    http://iopscience.iop.org/article/10.1088/0256-307X/35/11/110303

    Exact Equivalence between Quantum Adiabatic Algorithm and Quantum Circuit Algorithm

    https://physicsworld.com/a/quantum-adiabatic-and-quantum-circuit-algorithms-are-equivalent-say-physicists/

    Wikipedia, in “Superconducting quantum computing,” says to the contrary: “More than two thousand superconducting qubits are in a commercial product by D-Wave Systems; however, these qubits implement quantum annealing instead of a universal model of quantum computation.”

    Is there a difference between Quantum Annealing and Quantum Adiabatic Algorithm?

  96. Scott Says:

    James #95: I haven’t read the paper, so I don’t know if it’s correct and (just as importantly) whether it makes any assumptions that are different from what I know. Is there a version that’s on the arXiv / not paywalled?

    In any case, it’s crucial to understand that standard and adiabatic QC have long been known to be polynomially equivalent—that’s a celebrated result of Aharonov et al. from 2003. So, this paper would be a tightening of Aharonov et al.

    It’s also crucial to understand that there are several reasons why D-Wave’s current devices couldn’t use such results to simulate arbitrary quantum computations (and indeed, D-Wave has never claimed they could). Most obviously, there’s the noise / nonzero temperature, which is certainly going to cause level crossings, given that the spectral gap in these universality constructions is 1/polynomial rather than constant. (This is the difference between “adiabatic QC” and “quantum annealing.”) But in addition, there’s also the fact that D-Wave has been limited to “stoquastic” Hamiltonians. Those suffice for encoding optimization problems, but non-stoquastic Hamiltonians are known to be necessary for universal QC unless there’s an unlikely inclusion of complexity classes (BQP⊆AM).
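
    To make the role of the spectral gap concrete, here is a minimal illustrative sketch (a toy 3-qubit stoquastic instance, not the Aharonov et al. universality construction) that tracks the gap of H(s) = (1-s)*H_B + s*H_P along the interpolation. In the universality constructions that minimum gap shrinks like 1/poly(n), which is exactly what makes nonzero temperature so dangerous.

        import numpy as np
        from functools import reduce

        n = 3
        I2 = np.eye(2)
        X = np.array([[0., 1.], [1., 0.]])

        def op(single, site):
            # the single-qubit operator `single` acting on qubit `site` of an n-qubit register
            return reduce(np.kron, [single if i == site else I2 for i in range(n)])

        H_B = -sum(op(X, i) for i in range(n))         # transverse-field driver Hamiltonian
        rng = np.random.default_rng(0)
        H_P = np.diag(rng.uniform(0, 1, 2**n))         # toy diagonal "problem" Hamiltonian

        gaps = []
        for s in np.linspace(0, 1, 201):
            evals = np.linalg.eigvalsh((1 - s) * H_B + s * H_P)   # sorted eigenvalues
            gaps.append(evals[1] - evals[0])
        # the adiabatic runtime (and the vulnerability to thermal excitation) is governed by this minimum
        print(f"minimum gap along the path: {min(gaps):.4f}")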

  97. JAMES GRABER Says:

    Wow! Much more complicated than I realized. Thank you very much for your detailed reply.
    Jim Graber

  98. Gil Kalai Says:

    (I hope this comment meets the comment policy.) Regarding noise stability: I was also pleasantly surprised by how the theme came into play in Arora’s ICM plenary lecture on deep learning. Let me mention the notion of robustness for biological systems, studied by Naama Barkai and others, as potentially related. On the mathematical side, here is a link to a talk by Jeremie Brieussel, “Noise sensitivity of random walks on groups,” who starts by mentioning the origin of noise stability/sensitivity in works by Shannon and describes a joint result with Itai Benjamini.

    Regarding ICM 2018: it was a very nice event and, for the first time, all (or most) lectures were videotaped and are on YouTube, which is a great resource. There were excellent sessions on combinatorics and the mathematics of computer science, and also quite a few other related talks. Connections to quantum information and computation also made their appearance in several talks.

    It was a very good ICM for the brand name Kalai. Besides my talk, there was a great talk by Yael Kalai on “Delegating computation via no-signaling strategies.” On top of that, a former recipient of the Kalai prize, Constantinos Daskalakis (Constantinos shared, in fact, the first Kalai prize), was awarded the Nevanlinna prize.