Merry Christmas! My quantum computing research explained, using only the 1000 most common English words

[With special thanks to the Up-Goer Five Text Editor, which was inspired by this xkcd]

I study computers that would work in a different way than any computer that we have today.  These computers would be very small, and they would use facts about the world that are not well known to us from day to day life.  No one has built one of these computers yet—at least, we don’t think they have!—but we can still reason about what they could do for us if we did build them.

How would these new computers work? Well, when you go small enough, you find that, in order to figure out what the chance is that something will happen, you need to both add and take away a whole lot of numbers—one number for each possible way that the thing could happen, in fact. What’s interesting is, this means that the different ways a thing could happen can “kill each other out,” so that the thing never happens at all! I know it sounds weird, but the world of very small things has been known to work that way for almost a hundred years.

So, with the new kind of computer, the idea is to make the different ways each wrong answer could be reached kill each other out (with some of them “pointing” in one direction, some “pointing” in another direction), while the different ways that the right answer could be reached all point in more or less the same direction. If you can get that to happen, then when you finally look at the computer, you’ll find that there’s a very good chance that you’ll see the right answer. And if you don’t see the right answer, then you can just run the computer again until you do.
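
If you want to see the adding and taking away of numbers at work, here is a tiny sketch in ordinary Python (stepping outside the thousand-word rule, and using numpy) of the simplest standard example, a Hadamard step applied twice: each of the two final outcomes can be reached along two paths, and the signed numbers carried by those paths either cancel or add up.

    import numpy as np

    # One quantum bit starts in the state |0>.
    state = np.array([1.0, 0.0])

    # The Hadamard step splits the state into two equally weighted paths.
    H = np.array([[1.0,  1.0],
                  [1.0, -1.0]]) / np.sqrt(2)

    # Apply the step twice.  Each final outcome can now be reached along two
    # paths, and the signed numbers (amplitudes) carried by those paths add.
    state = H @ (H @ state)

    # The two paths to outcome 1 carry +1/2 and -1/2 and cancel out, so
    # outcome 1 never happens; the two paths to outcome 0 both carry +1/2,
    # so outcome 0 happens with probability 1.
    print(state)               # [1. 0.]  (up to rounding)
    print(np.abs(state) ** 2)  # [1. 0.]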

For some problems—like breaking a big number into its smallest parts (say, 43259 = 181 × 239)—we’ve learned that the new computers would be much, much faster than we think any of today’s computers could ever be. For other problems, however, the new computers don’t look like they’d be faster at all. So a big part of my work is trying to figure out for which problems the new computers would be faster, and for which problems they wouldn’t be.
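
For anyone who wants to check that example, here is a short Python sketch (again outside the thousand-word rule) that breaks 43259 into its parts by trying every possible divisor in turn, which is exactly the slow kind of search the new computers are meant to beat on much bigger numbers:

    def trial_division(n):
        """Factor n by trying every candidate divisor up to sqrt(n).
        Fine for a 5-digit number; hopeless for the 600-digit numbers
        used in real cryptography, which is the whole point."""
        factors = []
        d = 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    print(trial_division(43259))  # [181, 239]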

You might wonder, why is it so hard to build these new computers? Why don’t we have them already? This part is a little hard to explain using the words I’m allowed, but let me try. It turns out that the new computers would very easily break. In fact, if the bits in such a computer were to “get out” in any way—that is, to work themselves into the air in the surrounding room, or whatever—then you could quickly lose everything about the new computer that makes it faster than today’s computers. For this reason, if you’re building the new kind of computer, you have to keep it very, very carefully away from anything that could cause it to lose its state—but then at the same time, you do have to touch the computer, to make it do the steps that will eventually give you the right answer. And no one knows how to do all of this yet. So far, people have only been able to use the new computers for very small checks, like breaking 15 into 3 × 5. But people are working very hard today on figuring out how to do bigger things with the new kind of computer.

In fact, building the new kind of computer is so hard, that some people even believe it won’t be possible! But my answer to them is simple. If it’s not possible, then that’s even more interesting to me than if it is possible! And either way, the only way I know to find out the truth is to try it and see what happens.

Sometimes, people pretend that they already built one of these computers even though they didn’t. Or they say things about what the computers could do that aren’t true. I have to admit that, even though I don’t really enjoy it, I do spend a lot of my time these days writing about why those people are wrong.

Oh, one other thing. Not long from now, it might be possible to build computers that don’t do everything that the new computers could eventually do, but that at least do some of it. Like, maybe we could use nothing but light and mirrors to answer questions that, while not important in and of themselves, are still hard to answer using today’s computers. That would at least show that we can do something that’s hard for today’s computers, and it could be a step along the way to the new computers. Anyway, that’s what a lot of my own work has been about for the past four years or so.

Besides the new kind of computers, I’m also interested in understanding what today’s computers can and can’t do. The biggest open problem about today’s computers could be put this way: if a computer can check an answer to a problem in a short time, then can a computer also find an answer in a short time? Almost all of us think that the answer is no, but no one knows how to show it. Six years ago, another guy and I figured out one of the reasons why this question is so hard to answer: that is, why the ideas that we already know don’t work.
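
To make the contrast between checking and finding concrete, here is a toy Python sketch (not part of the simple-words text) built around the subset-sum problem: checking a proposed answer takes one pass over it, while the obvious way of finding an answer tries every one of the 2^n subsets.

    from itertools import combinations

    def check(numbers, target, subset):
        # Checking an answer: one addition per chosen index -- fast.
        return sum(numbers[i] for i in subset) == target

    def find(numbers, target):
        # Finding an answer: with no better idea, try all 2^n subsets -- slow.
        for r in range(len(numbers) + 1):
            for subset in combinations(range(len(numbers)), r):
                if check(numbers, target, subset):
                    return subset
        return None

    nums = [3, 34, 4, 12, 5, 2]
    print(find(nums, 9))           # (2, 4), since 4 + 5 = 9
    print(check(nums, 9, (2, 4)))  # True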

Anyway, I have to go to dinner now. I hope you enjoyed this little piece about the kind of stuff that I work on.

44 Responses to “Merry Christmas! My quantum computing research explained, using only the 1000 most common English words”

  1. Vadim Says:

    John Sidles, meet Up-Goer Five Text Editor.

  2. rrtucci Says:

    That’s the difference between you and me. I don’t need no stinking editor to sound like a 5 year old. Let the D-wave wild rumpus begin.

  3. Pat Says:

    what is the work that “Six years ago, another guy and I figured out one of the reasons why this question is so hard to answer” refers to?

  4. Job Says:

    Scott, in one of your Q&As a while ago I asked a question but didn’t phrase it properly.

    In the spirit of prioritizing effective communication, let me retry.

    NP-Complete problems often ask about the existence of a value in an implicit set. That set might be, for example, the set of all subgraphs of an input graph G (e.g. for subgraph isomorphism).

    The implicit set is usually derived from an input value, of length n. For example, the problem might receive n numbers and ask whether a subset of those numbers has a specific property (e.g. sums to x).

    We can define similar problems that have much larger implicit sets. My question is, do you think such problems are still in NP or even NP-Complete?

    In more concrete terms, here’s a problem similar to subset sum which has a much larger implicit set:

    Given a number X, and a set of n numbers, S_0:
    – let S_1 be the set of all possible sums of numbers in S_0
    – let S_2 be the set of all possible sums of n or fewer numbers in S_1
    – let S_k be the set of all possible sums of n or fewer numbers in S_[k-1]
    …does S_k (for some constant, pre-defined k) contain X?

    IMO this class of problems, despite having a much larger implicit input set (i.e. greater than 2^n), is still in NP – the certificate is the set of numbers for each set, so no more than n^k.
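
    As a rough illustration (the nested-list encoding, the function name, and the assumption that repeats are allowed are just mine, to make the idea concrete), a check of such a certificate might look like this:

      def verify_membership(S0, level, value, cert):
          """Check a claimed certificate that `value` lies in S_level.

          A level-0 certificate is empty: `value` must literally be in S0.
          A level-j certificate (j >= 1) is a list of at most n pairs
          (v_i, cert_i): the v_i must sum to `value`, and each cert_i must
          in turn certify that v_i lies in S_{j-1}."""
          n = len(S0)
          if level == 0:
              return value in S0
          if not cert or len(cert) > n:
              return False
          if sum(v for v, _ in cert) != value:
              return False
          return all(verify_membership(S0, level - 1, v, sub) for v, sub in cert)

      S0 = [2, 7, 11]
      # Claim: 40 is in S_2, because 40 = 20 + 20 and 20 = 2 + 7 + 11 is in S_1.
      cert = [(20, [(2, []), (7, []), (11, [])]),
              (20, [(2, []), (7, []), (11, [])])]
      print(verify_membership(S0, 2, 40, cert))  # True

    For constant k the whole nested certificate has at most about n^k entries.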

    What are your thoughts on this?

  5. Scott Says:

    Pat #3:

      what is the work that “Six years ago, another guy and I figured out one of the reasons why this question is so hard to answer” refers to?

    Algebrization.

  6. Scott Says:

    Job #4: Can you please rephrase your question using Up-Goer Five?

    Sorry, just kidding. The answer is that, yes, the type of modifications you’re talking about still leave the problem in NP, and for precisely the reason you say: the witness is still polynomial-size.

  7. Scott Says:

    Vadim #1:

      John Sidles, meet Up-Goer Five Text Editor.

    LOL! Except that, with the Sidles Up-Goer Five Text Editor, you would have to conduct all human communication using only the words “STEM,” “21st-century,” “manifold,” “Thurston,” … 😀

  8. Scott Says:

    Incidentally, I got curious: of the 1000 words I was allowed, how many did I actually use? The answer, by my count, is 232. OK, now back to real work. 🙂

  9. Ronak M Soni Says:

    Is it really fair to use common words in uncommon meanings, like you did ‘bit’ and ‘state?’ Kinda beats the point, no?

  10. Gil Kalai Says:

    Let me try too:

    Many years ago people thought about what computers can do and what computers can check. They had two roads to go: The first was to believe that computers can do everything that computers can check. This could be great! The second was to believe that computers can not do everything that computers can check. This is not so great but very interesting! But this is what most people believe and try to understand. It will be great to understand it! But even this is very hard.

    A little later people thought about a new kind of computer that can do a lot of things that our real computers can not do. Again they had two roads to go: The first was to believe that this new kind of computer can be built. This could be great! The second was to believe that this new kind of computer can not be built. This would not be so great but again it would be interesting and it would be great to understand the reasons. This time many people took the wrong road and believed that the new kind of computer can be built. They took the wrong road but they gave us a good ride for our money. Maybe they took the wrong road because they believed some guy with beautiful thoughts. But this guy was joking or just wrong.

    Anyway, the road people took was the wrong road and my work is to explain why. My work has three parts. The first part is to explain what was stupid in what people thought: People missed that a computer is different from a car! The second part is to draw a different picture. This is hard and takes a lot of time. The third part is to explain why not to believe in these new computers: One reason is that the new computers are able to go back in time in such a way we never saw before.

    All this is very exciting!

  11. Scott Says:

    Ronak #9:

      Is it really fair to use common words in uncommon meanings, like you did ‘bit’ and ‘state?’

    Eh, if you wanted to nitpick, you could also complain about compound forms like “day to day life” and “open problem,” the use of “reason” as a verb, or the use of numerals. But rather than endlessly adjusting the bar, I decided to stick with a simple standard: namely, could I get the Up-Goer Five editor to accept my text, through correct English use of its thousand allowed words? Try it — it’s fun!

  12. Yuval Says:

    Scott #11 and Ronak #9:

    I agree with “reason” as a verb and “state”, but I actually think “bit” is fine. If you think of “bit” in its common definition of “little piece of,” the sentence not only makes sense, but means exactly what it should. It’s a neat coincidence that the more precise, esoteric word has the same spelling and pronunciation — but that doesn’t necessarily mean that using the common definition is wrong.

  13. Reza Says:

    Gil #10:
    “Maybe they took the wrong road because they believed some guy with beautiful thoughts. But this guy was joking or just wrong.”

    Some guy: Do you mean Richard Feynman from Caltech?

  14. Attila Szasz Says:

    I admit my following question is not very fair, since it’s Christmas on one hand, and on the other hand I’m pretty sure you’re getting lots of annoying questions about the topic, mine probably included; but still I was just wondering whether you’ve developed some insights regarding this Borel-determinacy-stuff approach of Gowers’ in connection with algebraization?
    Do his criteria for the suitable complexity measure have some chance or promise of opening up the black box wider, in the sense of being more powerful than just carrying through oracle extensions? (maybe like in 11.2 in your paper, suggesting the ability to capture non-obvious structure)

  15. Scott Says:

    Attila: Sorry, I simply haven’t studied Gowers’s ideas enough to be able to express an informed opinion. I think Gowers would be the first to tell you that

    (a) the analogy between Borel determinacy and circuit complexity was already made, by Sipser in the 1980s, and in that case it led to AC0 lower bounds but not further than that, and

    (b) Gowers has some ideas for how to maybe get beyond AC0, but they’re at an extremely speculative stage right now.

    I really admire Gowers for putting his speculative ideas (clearly labeled as such) out there—not everyone has the courage to do that, and I wish more people did. In return, I think, we should do him the favor of not treating his musings like published papers.

    Now regarding algebrization, I’d say that Ryan Williams’s NEXP⊄ACC result has already convincingly gotten around it, so we know that it’s not some impenetrable black box. Closer to the question at hand, the original PARITY∉AC0 results from the 1980s—the ones vaguely inspired by Borel determinacy—also got around relativization and algebrization, to whatever extent those barriers even make sense at such a low level. The problem was “merely” that those results fell prey to natural proofs. So, despite not understanding Gowers’s ideas, I suppose I’ve arrived at the tentative conclusion that the more relevant question is not how they get around relativization or algebrization, but rather how they get around natural proofs.

  16. Attila Szasz Says:

    Yeah, sorry, that’s what I meant by annoying; I totally get that those are rudimentary ideas, and he warns everyone not to do the harmful hype hysteria. I couldn’t resist, but actually, I had a more general question in mind, which is now immediate from your reply, so let me phrase it:
    How much can these barriers be independent of each other?
    I realize that approaching P vs NP from the smaller classes’ direction has the natural proofs barrier ‘by default’, and algebraization is more relevant for bigger class separations. So in the scenario where you somehow manage to be smart, say, on the level of NEXP vs P/poly, requiring non-algebrizing stuff, it seems reasonable that at this point you hope to improve on your method, which ‘automatically’ escapes estimating a function too directly, so you can take your shot at P vs NP. But given that this would be a separation result, and you’re also separating from the from-small-classes approach, aren’t you in fact bound to show this quite explicitly in the process?
    (Incidentally, by now I expect answers to this more or less either in the form “Dude, no one knows.” or one that points out I’m missing something crucial.)

  17. Rahul Says:

    Tried my hand at Up-Goer Five Text Editor….. 🙂

    The work these people do is saying what are all the great things we could do if we had a great great “new kind of computer”. In real life such a computer is not made yet. No one is even close to making it. Many people pretend we are close to such a computer. But we are not. There are many real problems in making such a computer. Most good people who study them understand this and agree. Other people want money or other things. So they lie and pretend to the world that this “new kind of computer” is right around the corner.

    I think what these people study is like a game. They try to see what’s the best they could do for us within this game. Games are fun. Knowing about our world is also important. But we have to decide how much time, money and attention we give these people. I think we are giving them too much right now. How much money would you give someone to make you sure that this “new kind of computer” can not be built? Maybe we will learn some interesting things along the way. But it might be easier to try to learn them in other ways.

    We have to learn to trust those who are good and do not lie about what their "new kind of computer" can do yet. We must also spend more on trying to actually make this "new kind of computer" and spend less on imagining what sorts of nice things we could do if only we had this "new kind of computer". Imagine if, a hundred years before the first car was actually built, hundreds of ten hundreds of people were given lots of time and money to tell you what sorts of great and fun things you could do if only you had a car.

    These people have brothers who actually try to make today’s computers do more things and faster by thinking of new ways to do these same things. I think these brothers do important things but they do not get so much attention. That is a little sad.

  18. J Says:

    Let N be a positive integer. Let $M=\lceil\sqrt{N}\rceil$, and suppose that $M+1=\lceil\sqrt{N}\rceil + 1$ does not divide N.

    Is the problem of deciding whether Z is $H! \bmod N$ for some 0<H<M in NP, or in the quantum version of NP?

    Can the problem be in quantum polynomial time?

  19. Scott Says:

    Rahul #17: I find your argument extremely persuasive—assuming, of course, that we’re both talking about Bizarro-World, the place where quantum complexity research commands megabillions and is regularly splashed across magazine covers, while Miley Cyrus’s twerking is studied mostly by a few dozen nerds who can all fit in a seminar room at Dagstuhl.

  20. Rahul Says:

    Scott #19:

    Is “megabillions” the new minimum standard for over-funding a scientific research area?

  21. Rahul Says:

    Scott:

    To be less frivolous, either:

    (a) You think it’s axiomatic that no legitimate research area can ever be considered over-funded (till it gets “megabillions”?)? Do you?

    (b) You agree that certain areas may at certain times be over-funded. In which case we can discuss which areas these are. Obviously, the assessment is subjective.

    I’ve no grudge if a rich millionaire were using his own purse to fund pretty much anything, even research on homeopathy or clairvoyance. But if it is public money I think allocation profile is fair game for debate.

    Unless, of course, you think funding money is a non-scarce resource. But I doubt that’s your view.

  22. Joshua Zelinsky Says:

    As long as we’re discussing algebraization, R. J. Lipton just had a post http://rjlipton.wordpress.com/2013/12/26/re-gifting-an-old-theorem/ about how every circuit has a corresponding circuit with the same behavior, but where the second circuit uses very few NOT gates and is only slightly larger than the first circuit. I wonder if something like this can be used to get past the algebraization barrier, since the second circuit’s small number of NOT gates could act effectively as a restriction on what the circuit’s algebraized form has to look like, so that they can’t just be generic polynomials.

  23. Fred Says:

    Hi Scott,
    when you say “If it’s not possible, then that’s even more interesting to me than if it is possible!”

    Would it still be really that interesting to you if the impossibility was purely from an engineering/economics perspective?

    For example if the cost of developing a practical quantum computer was prohibitively high for the practical benefits it would bring. Imagine that developing a non trivial QC would be on the scale of building the LHC – that’s clearly too costly if we can’t show beforehand that it could be used for something better than factoring large numbers or quantum annealing.

    But maybe we’re still at the stage where the engineering limitations aren’t clear enough? For the case of using nuclear fusion as a practical source of energy it seemed for a while that maybe it could be done cheaply with “cold fusion” and whatnot (but in that case the huge costs of trying to build an effective “Tokamak” reactor clearly don’t outweigh the potential long term benefits).

  24. Scott Says:

    Fred #23: What you’re talking about is not QC being impossible, in the sense that skeptics like Gil Kalai (see above), Leonid Levin, Michel Dyakonov, or Robert Alicki mean by “impossible.” (To ward off precisely this confusion, I usually say something like “fundamentally impossible” or “impossible in principle,” but those concepts are hard to express in Up-Goer Five. 🙂 ) You’re just talking about QC being sufficiently hard that people decide to give up. Yes, of course that would kind of suck. But at least we would’ve learned a lot about quantum mechanics and computation, and there would’ve been other spinoffs for science and technology (there always are, and there have already been some).

    In my personal opinion, proving that QC can work would be at least as interesting—purely for fundamental physics, setting aside all applications!—as the discovery of the Higgs boson was. And our civilization (OK, mostly the EU) decided that finding the Higgs boson was worth $11 billion. It’s hard for me to understand how people can simultaneously believe that that was justified, and that spending a comparable amount to prove the reality of beyond-classical computational power in our universe wouldn’t be. But maybe the Rahuls of this world were also against the LHC, in which case their position would at least be consistent. (Incidentally, other projects that strike me as LHC-complete include a sample-return mission to Mars, and cool space-based physics like gravity-wave detectors at the Lagrange points of the solar system.)

    Now regarding nuclear fusion, I think the real tragedy is that essentially everything that’s hoped for from fusion energy, can already be had right now from modern fission reactors. The reasons why we don’t get almost all of our power from fission are probably less technological than they are social, political, and psychological. (E.g., we’re unwilling to accept meltdowns that occasionally kill or poison dozens of people, even though we do accept fossil-fuel use that’s destroying the entire planet.)

  25. Gil Kalai Says:

    The very first words of Aram Harrow in our debate regarding quantum computers were:

    “There are many reasons why quantum computers may never be built. We may fail to bring the raw error rate below the required threshold while controlling and interacting large numbers of qubits. We may find it technically possible to do this, but too costly to justify the economic value of the algorithms we are able to implement. We might even find an efficient classical simulation of quantum mechanics, or efficient classical algorithms for problems such as factoring. The one thing I am confident of is that we are unlikely to find any obstacle in principle to building a quantum computer.”

    Aram’s sentiment is very similar to Scott’s comment #24. On this matter I disagree with Aram and Scott. One thing I am confident about is that if quantum computers are never built, then we will be able to understand the principle behind it. We will be able to find the principle (or the appropriate mathematical modeling) for it if it is a fundamental physics principle, and we will be able to find the principle (or the appropriate mathematical modeling) if it is “just” an engineering matter. As a mathematician, the challenge is to find the appropriate modeling, and it makes almost no difference one way or the other. (OK, of course it will be more pleasing to stumble upon something fundamental, but the mathematical challenge and interest is separate from this matter.)

    As Scott wrote, indeed I regard quantum computers as implausible and I think this is the way things will turn out, but as a mathematician I find the question of how to model quantum evolutions without quantum fault-tolerance fascinating, whether it represents the border between reality and fantasy or the border between present and future technology. Whether you are a skeptic or a believer, an explanation is needed for why nature makes quantum fault-tolerance and universal quantum computation so really hard.

    As an aside let me comment that the problem with the terms “obstacle in principle,” “fundamentally impossible” or “impossible in principle,” is not just that they are not expressed in Up-Goer Five. A more serious problem is that they don’t have a definite formal meaning.


  26. Raoul Ohio Says:

    1. I am not a fission power hater, but FE does have a LOT of problems, and certainly will not fulfill the “clean, too cheap to meter” fusion hype from the last 50+ years. The sad reality is that all power sources have a lot of problems, and are usually limited.

    2. I have no doubt that 100’s or 1000’s of scientists are sure that their specialty could do great stuff with $11G or so from the EU. But I think there is widespread agreement that physics looks into fundamental stuff that is at the bottom level. This kind of boils down to: "Are the QCD rules correct? Complete?" The LHC is not a "find the Higgs" machine, it is the biggest microscope yet, to see what is there, and if the rules apply. The first thing the LHC has found might be the Higgs. It also might not.

  27. Fred Says:

    One of the difficulties in this debate is that it’s not clear what sort of minimum quantum circuit would clearly qualify as a computer (a custom quantum circuit that factorizes 15 = 5×3 is more of a calculator).

    Early classical computers were limited (laughable by today’s standards) but still immediately useful as general-purpose computing systems (with an emphasis on programmability).
    The improvement of the hardware has been so amazing that we tend to forget that classical computers are limited in many ways – minimum energy thresholds, the speed of light, quantum noise (transistors leak and erode, etc), miniaturization (at best one transistor == one atom).
    Those aren’t just engineering difficulties but inherent theoretical limits.
    Today those limitations are pushing back in the other direction (no more free lunch), forcing software engineers to be aware of the underlying hardware (breaking the abstraction) and to rewrite their code to take advantage of hundreds of CPU cores, etc. (we see a resurgence of functional programming languages, which makes parallelization somewhat less painful).

    QC seems trickier because the approach is bottom-up from the start, using particle states to store qubits, rather than using large clumps of matter to store bits, then refining little by little.
    But in the end I wouldn’t be surprised if the two technological efforts meet, hitting the same physical limitations and solutions.

  28. GDS Says:

    tl;dr

    Now we need a tweet editor to reduce it to <=140 characters.

  29. GASARCH Says:

    Your post seems to imply that QC gives speedup for factoring but not much else. When I made the same claim in a recent post I was referred to

    Quantum Algorithms for algebraic problems
    by Childs and van Dam

    which has MANY algebraic problems with super-poly speedup.
    So— are there really MANY algebraic problems with super-poly speedup? I was also told that quantum simulations are an application of QC (I HAD mentioned this in my blog,
    but only briefly).

    Not sure I have a question here; just want you to comment on these other apps.

  30. Scott Says:

    GASARCH #29: Yes, there are quantum speedups for lots of number-theoretic, algebraic, and (abelian) group-theoretic problems other than factoring. Which is a difficult fact to express in the Up-Goer Five editor.

  31. Mark Thomas Says:

    Nice and clear explanation. A question: Could a subtle gravitational wave passing through the quantum computer decohere the superposition states?

  32. Scott Says:

    Mark Thomas #31: In principle, sure. In practice, I’m told that gravitational decoherence is so many orders of magnitude smaller than other, more “prosaic” decoherence sources that it’s never anything to worry about. (And of course, if the failure of your quantum computer to work led instead to the first direct experimental detection of gravity waves, I believe that would count as an experimental success! 😀 )

    Here I’m not talking about the exotic speculations of e.g. Roger Penrose, according to which gravitational effects would ultimately change quantum mechanics itself. If true, that could certainly affect quantum computing, but so far there’s not the slightest evidence for it. (Again, though, another reason to care about quantum computing is that, if there were anything that modified quantum mechanics, we’d like to push QM to its furthest limits and discover what that something is.)

  33. Gil Kalai Says:

    I certainly support making large investments in trying to demonstrate (or refute) beyond-classical computational power.

    In my opinion, it is a mistake to identify “scientific value” with “attention” and with “resources”. Both resources and attention (of various forms) are instruments rather than targets in science, and they depend (and should depend) on various other factors.

  34. Rahul Says:

    @Gil

    Can you expand on what you mean by your comment on value, attention & resources?

    It’s intriguing but I may not be understanding it fully.

  35. Gil Kalai Says:

    Rahul, when it comes to “resources,” besides “scientific value” a crucial ingredient is to what extent and in what way resources are useful to advance the project. When it comes to attention, there is a vast difference between various scientific projects in the way they appeal to large audiences (within and outside science).

  36. Fred Says:

    This new article quotes Scott, but I’m not sure if the quote is new
    http://m.washingtonpost.com/world/national-security/nsa-seeks-to-build-quantum-computer-that-could-crack-most-types-of-encryption/2014/01/02/8fff297e-7195-11e3-8def-a33011492df2_story.html

  37. Scott Says:

    Fred #36: Yeah, they interviewed me. The article itself is basically fine (except that, in the paragraph about “how QC works,” I wish it had said something about interference of amplitudes, rather than “avoiding unnecessary calculations.”). But the comments on the article just spoiled my day. They feature a guy who says that, since the article didn’t explain to his personal satisfaction how a QC would work, the idea must be bunk; a troll who replies to every comment referring to QC in future tense by saying that QCs already exist, and linking to the Wikipedia article about D-Wave as proof … OK, let me stop before I get even more depressed.

  38. Fred Says:

    Scott #37: haha, yeah, internet comments… you pretty much have to ignore them.

  39. Jr Says:

    Fred #38: With certain exceptions of course.

  40. Gil Kalai Says:

    All along, my goals in my skeptical pursuit of quantum computers were to understand the matter and also what other people’s thoughts are, and to describe via mathematical modeling and reasoning an alternate theory of quantum noise – a theory that will draw the “quantum fault-tolerance barrier,” explain why computationally superior quantum computers are impossible, and will have further major consequences for quantum physics as a whole.

    Looking at it now, from the perspective of this post (and #10), it looks like I was selling myself short. My newly expanded goal would be, in addition, to have all eleven of these words: smoothed, Lindblad, evolution, trivial, flaw, detrimental, quantum, noise, spontaneous, error, and synchronization, make it into the list of the 1000 most used English words! 🙂

    It was nice, Scott, to meet you again in Jerusalem. Thanks for your excellent lectures.

  41. Eric Cordian Says:

    Does the AdS/CFT correspondence, in which gravity in anti-de Sitter space is equivalent to a Yang-Mills quantum theory on the boundary, offer any insights as to whether quantum computing is actually more powerful than classical computing?

    Could this correspondence be used to map a quantum computer on the boundary into a classical computer on the higher dimensional curved multi-connected manifold?

  42. Scott Says:

    Eric Cordian #41: Good questions! I posed your questions to Daniel Harlow, a quantum gravity expert of my acquaintance, and he replied as follows:

      1) The bulk theory is also quantum, so the duality is between two quantum theories. The bulk theory is classical in a certain limit, but just in the usual way that the standard model is classical in a certain limit. The only nontrivial point is that the classical limit is one that you wouldn’t necessarily want to call classical from the boundary CFT point of view.

      2) The super Yang-Mills theory should be efficiently emulatable on an ordinary quantum computer, with overhead which is some low order polynomial in N (the N of SU(N)). This hasn’t been shown rigorously, but I expect it to be the case.
      I can’t be sure that there isn’t some crazy kind of algorithm involving throwing stuff into black holes, etc, but these two points make me think that AdS/CFT is unlikely to say much about the structure of complexity theory.

  43. Digressions loosely related to Scott Aaronson | The Daily Pochemuchka Says:

    […] Here are three short essays on quantum computing explained to a lay audience using only the ten hundred most commonly used words in English: […]

  44. My research explained, using only the 1000 most common English words | Short, Fat Matrices Says:

    […] by Scott, Afonso and Joel, using the Up-Goer Five Text Editor, which in turn was inspired by this xkcd. I […]