Quantum Computing Since Democritus Lecture 5: Paleocomplexity

From my inbox:

We simple folk out in the cold wastes of the internet are dying the slow and horrible death of intellectual starvation. Only you can save us, by posting the next installment of your lecture notes before we shuffle off this mortal coil. Will you help us, or will you say “Let them read slashdot”? Ok, seriously, I know you’re busy. Just wanted to make sure you knew people are enjoying the lecture notes.

And from my comments section:

You know you’ve made it, and then lost it, when you no longer publish notes on your course :)

Alright, alright, alright, alright, alright. Now that I’ve returned from my two-week world concert tour (which took me to Innsbruck, London, Yale, and U. of Toronto), and now that my girlfriend and I have settled into a lovely new apartment (complete with silverware, shower curtains, and a giant poster of complexity class inclusions above the fireplace), I finally have some time to resume your regularly-scheduled programming.

So won’t you join me, as I attempt to excavate the strange forgotten world of paleocomplexity, and relive an age when STOC and FOCS were held in caves and Diagonalosaurs ruled the earth?

42 Responses to “Quantum Computing Since Democritus Lecture 5: Paleocomplexity”

  1. Cheshire Cat Says:

    The Diagonalosaurs merely went underground, and will soon render Mt. Razborudich extinct.

  2. mick Says:

    Where can you get a complexity class poster? I think my apartment desperately needs one!

  3. scott Says:

    My friend Alex gave me the poster as a present. Here are some instructions for making your own:

    (1) Start with this inclusion diagram (or the earlier one by Chad Brebaker).

    (2) Print it out on a giant posterboard!

  4. mick Says:

    That’s hot!

  5. Anonymous Says:

    If I had to guess without any prior information which of the three items (silverware, shower curtains, or complexity class poster) you had in your apartment, I would have guessed the poster first.

  6. Preston Sturgeon Says:

    Your girlfriend should not be living with you out of wedlock. Such an arrangement weakens the basic structure of the family, which is important for human development.

  7. Anonymous Says:

    Scott, I want to request a special lecture on the Renormalization Grouparus, that dinosaur from the late Cenozoic era that so enthralls physicists and other junior complexitologists. Is it true that Renormalization Grouparus had feathers and is thought to be the common ancestor of some of the meanest dinosaurs? This is a favorite question asked by tykes who are future physicists, when they visit the dinosaur exhibit at the complexitology museum.

    rrtucci (a physicist)

  8. Anonymous Says:

    correction: Grouparus->Groupsaurus

  9. Greg Kuperberg Says:

    Thanks for the credit for that!

    Is there a picture of said girlfriend?

  10. scott Says:

cheshire cat: How do dinosaurs render a mountain extinct?

  11. scott Says:

    rrtucci: Sorry, no Renormalization Groupsaurus has ever been unearthed in the areas where I do my digs. However, my colleagues in Physwanaland tell me that R. Groupsaurus had terrifying fangs and a nauseating stench. Doubts remain, though, about whether R. Groupsaurus ever actually existed, or whether the alleged Groupsaurus specimens are just deformed juveniles of the hypothetical Planckosaur.

  12. Anonymous Says:

    “modulo an unsolved problem in analysis”

    can you give a reference for the lazy?

  13. scott Says:

    Greg: Here’s Kelly’s homepage.

  14. Greg Kuperberg Says:

    She looks really good, Scott. (But does she always wear a glass like that?) I cannot think of a better antidote than this to the dreary “Why Nerds are Unpopular” perspective.

    I hope it is not too politically incorrect to first comment on appearance. I followed the link to her paper and that looks good too.

  15. Scott Says:

    “modulo an unsolved problem in analysis”

    can you give a reference for the lazy?

    Sure. Try this article from Notices of the AMS.

  16. Osias Says:

    > By any objective standard, the theory of computational complexity
    > ranks as one of the greatest intellectual achievements of humankind
    > — along with fire, the wheel, and computability theory

    entirely agree!

> On the computers of the 1950s, it was
> hopeless^hopeless^…^hopeless.

Scott, you are the man! :D

    > there’s a function f such that TIME(f(n))=TIME(2^f(n))

    [mode Luke Skywalker on]
    [mode Luke Skywalker off]

    > Blum Speedup Theorem says that there really are problems that
    > admit no fastest algorithm

    [mode Luke Skywalker on]
    [mode Luke Skywalker off]

But, besides the function from the example, is there any “natural” example? I mean, has it already been proved that multiplication has no fastest algorithm?

    PS: Itakura is nice! And her research seems interesting to me. Doesn’t she have a blog like this?

  17. Scott Says:

    But, besides the function from the example, is there any “natural” example?

    osias: No, to my knowledge we don’t yet have any natural example of a problem proven to have no fastest algorithm. Interestingly, we do have nontrivial examples in the other direction: the theory of instance-checking tells us that Permanent and PSPACE-complete problems have fastest algorithms (up to a polynomial factor), even though we have no idea how fast those algorithms are.

  18. Scott Says:

    PS: Itakura is nice! And her research seems interesting to me. Doesn’t she have a blog like this?

    Why don’t you start one, Kelly? They seem to like you…

  19. Anonymous Says:

    RE puzzle 2, water and pain:
It seems to me that Kripke’s “rigid designator” view works for pain as well as water. Now H2O => water and, analogously, C-fibre firings => pain; but if we discover X-fibre => pain alongside C-fibre => pain, then we are forced to examine what “pain” means. If we say “pain” means certain sensations that cause certain behaviours (eg, emotional and physiological changes), then we have conceded to the Frege-Russell view, analogous with water when we say “water is some collection of molecules that exhibits the behaviour of wetness, freezing point etc”. If, however, we see the neuronal “physical” layer as being only a medium for generating a specific firing pattern “sfp”, where sfp => pain, then we have a valid Kripke view again. So it seems like almost anything can be held in a Kripke view given sufficient understanding of its nature, and there is no need for a “disanalogy”.

  20. Scott Says:

    Anonymous: While I’m personally a hydro-agnostic, let me try to defend Kripke just for fun. For him, the big difference is that water causes certain experiences (like being wet), whereas pain is an experience. So while something that’s not water could produce the same experiences as water, something that’s not pain couldn’t be the same experience as pain (since if it was the same experience, then it would be pain by definition). I.e. Kripke’s claim is that consciousness has a variable for pain, but only a pointer for water. :)

  21. Anonymous Says:

    Well, for every function f(n), TIME(f(n)) is contained in SPACE(f(n)). Why? Because a computer can access at most one memory location per time step.

This argument as described applies only to Turing machines. Exercise for the reader: construct an argument that applies to real computers, i.e., Random Access Machines (RAMs).
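The one-cell-per-step argument can be made concrete with a toy simulator. This is just a sketch with a made-up transition-table encoding (a dict mapping (state, symbol) to (new state, written symbol, head move)); it tracks the set of visited cells and shows it can never exceed the step count plus one:

```python
def simulate(delta, steps, state="q0"):
    """Run a single-tape machine for `steps` steps; return # of cells touched.

    delta maps (state, symbol) -> (new_state, write_symbol, move in {-1, +1}).
    Since the head moves at most one cell per step, at most steps+1 distinct
    cells can ever be visited: TIME(f(n)) is contained in SPACE(f(n)).
    """
    tape, head, touched = {}, 0, {0}
    for _ in range(steps):
        sym = tape.get(head, "_")
        # missing entries default to "write the same symbol, move right"
        state, write, move = delta.get((state, sym), (state, sym, +1))
        tape[head] = write
        head += move
        touched.add(head)
    return len(touched)

# A small zig-zag machine, purely for illustration.
delta = {("q0", "_"): ("q1", "1", +1), ("q1", "_"): ("q0", "1", -1)}
assert simulate(delta, 50) <= 51  # space never exceeds time + 1
```

The RAM version of the exercise is genuinely different: a RAM can touch widely separated addresses, so one has to count the number of *distinct* registers written, not the span of addresses.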

  22. Cheshire Cat Says:

    They are not dinosaurs, they are dragons. What you thought was the volcano erupting was merely the Diagonalosaurs breathing fire, enraged by their mortal enemies, the Combinataurs. Now that the Combinataurs are vanquished, the Diagonalosaurs can breathe easy, and the volcano will not “erupt” again…

    It’s all quite simple, really.

  23. Anonymous Says:

    Now, every function you’ll ever encounter in civilian life will be time-constructible.

    Really??? Do you have a reference for this, for example, a citation for a paper showing that all basic arithmetic functions including sin, cos, sqrt, log are time constructible?

  24. Scott Says:

    sqrt and log are obviously time-constructible. As for sin and cos, it only makes sense to ask about time-constructibility of functions f that grow at least as fast as log(n), since otherwise the f(n)-time machine can’t even read n. That caveat aside, I stand by my claim: the only non-time-constructible functions we know are the ones explicitly constructed by diagonalization. See Papadimitriou or any other intro text for details.

  25. rdv Says:

    He claims that it was a well-known “fun” problem even when he was young, but I first learned of the “write a program that can print itself” problem from Ken Thompson’s Turing Award lecture.
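For anyone who hasn't tried the puzzle, here's one standard solution in Python (a sketch of the classic trick, not Thompson's original): store a template string, then fill the template with its own repr() so the quoting reproduces itself. Comments are omitted because the program's output must be its source, character for character:

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints exactly those two lines.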

  26. Owain Evans Says:

As far as I know, philosophers who follow Kripke would say that “water is H2O” is a necessary truth, rather than a tautology.

A sentence of the form “A or not A” (where A is itself a sentence) is a tautology, because the sentence is true regardless of the meaning of the sentence A. For example, if A = “The sky is green”, then we get “The sky is green or the sky is not green”, which is true.

What form does “water is H2O” have? The standard way to represent the sentence in first-order logic is with the form “a=b”, where ‘=’ is the identity predicate, and a and b are constants which can be interpreted as any objects. But “a=b” is not a tautology. If we take ‘a’ to mean Peter Parker, and ‘b’ to mean Superman, then we get “Peter Parker is Superman”, which is false. Of course, when we take ‘a’ to mean water and ‘b’ to mean H2O we get a true sentence, but we don’t get true sentences regardless of the meaning of ‘a’ and ‘b’.

“Water is H2O” is not tautologically true, but it is (for Kripke) necessarily true, i.e. true in all possible worlds. Kripke thinks that when we say something like “a=b”, we are pointing out that the constants ‘a’ and ‘b’ pick out a single entity. Thus we think that the words “water” and “H2O” are names for the same entity. Consider a possible world where the words “water” and “H2O” have the same meaning as they do for us (so “water” does not mean chocolate ice-cream, etc.). In this world, assuming the entity that we call water or H2O exists, the sentence “water is H2O” must be true. If it weren’t true, then the single entity that the words “water” and “H2O” pick out would not be itself, which is absurd (nearly as absurd as saying that in some possible world “water is not water” or “H2O is not H2O”). So “water is H2O” is true in all possible worlds (worlds that are different from ours but where words have the same meanings), but it is not true in all possible assignments of meaning to the terms in it (e.g. it’s false if “water” means Peter Parker, and “H2O” means Superman).

    [In short, the above distinction is between logical truth and necessary truth. All philosophers think that logical truths are necessary (i.e true in all possible worlds), but some think that there are some necessary truths that aren't just logical truths.]
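The logical-truth side of Owain's distinction can even be checked mechanically. A toy sketch, using booleans as stand-ins for the interpretations of the constants:

```python
from itertools import product

# "A or not A" holds under every assignment to A: a tautology.
tautology = all(a or not a for a in (True, False))

# "a = b" does not hold under every assignment to 'a' and 'b':
# it fails whenever the two constants denote different things.
identity_always = all(a == b for a, b in product((True, False), repeat=2))

print(tautology, identity_always)  # → True False
```

The necessary-truth side, of course, is exactly the part that no truth table can capture.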

    Questions: Are there faster algorithms for any fragments of the theory of rings over the real numbers? Can the slow algorithms be of any use in deciding the truth of short formulas involving the real numbers? Are there bits of math for which faster algorithms exist (and if so have computers been of any use in mathematical research in those areas)? Will quantum computing have any special impact on mathematical research (impact you wouldn’t get from just having faster computers–which are obviously useful in various areas of mathematical research)?

  27. scott Says:

    owain: Thanks for the terminological correction!

    Answers to your questions: (1) yes, (2) yes, (3) yes, and (4) I don’t know.

  28. Anonymous Says:

    I’m confused. When you say “unsolved problem in analysis” are you referring to Schanuel’s conjecture?

  29. Greg Kuperberg Says:

    My answer to Owain’s question 4: I would suppose so!

  30. scott Says:

    When you say “unsolved problem in analysis” are you referring to Schanuel’s conjecture?


  31. Anonymous Says:

    As for sin and cos, it only makes sense to ask about time-constructibility of functions f that grow at least as fast as log(n), since otherwise the f(n)-time machine can’t even read n.

Obviously, what was meant here is algebraic combinations thereof, with the proper ceilings and floors sprinkled all over, e.g. floor(n*(sin(n)+sqrt(7))).

    Do we know such combinations to be time constructible? It would seem to me they are likely not. Space constructibility on the other hand seems almost assured.

  32. John Sidles Says:

    For Young Frankenstein fans only …

    Here’s a version of the complexity class poster that expresses how some folks feel (like me for example) when contemplating the “complexity of complexity”!

    Note: this PDF file is editable in Adobe Illustrator … please feel free to customize it to suit your own taste.

  33. scott Says:

    floor(n*(sin(n)+sqrt(7))) is certainly time-constructible…

  34. Anonymous Says:

    floor(n*(sin(n)+sqrt(7))) is certainly time-constructible…

Which implies you need to compute a good enough approximation to the function f(n) within time f(n). In other words, f(n) is computable in some sort of exponential time in the size of the output. Why is this so obviously true? It seems to me that the world is full of things that are harder to compute than that.

  35. john Says:

Why is it immediate that once you define integers (e^{pi * i}…) the decision problem becomes undecidable? As I understand it, Gödel’s theorem says that there are true statements that any sound proof system can’t prove. However, isn’t a program that decides this problem free to find a sufficiently powerful (& sound) proof system in which the statement is provable?

  36. Chris Dawson Says:

    Strange, the complexity class inclusion graph looks very familiar. Is it isomorphic to the London underground?

  37. John Sidles Says:

    Whoop! The Young Frankenstein post had the wrong URL; here’s the correct one. (no, the wrong one wasn’t some kind of lame recursive joke) :)

  38. scott Says:

    floor(n*(sin(n)+sqrt(7))) is certainly time-constructible…

    Which implies you need to compute a good enough approximation to the function f(n) within time f(n).

    No. When I said you have to run for f(n) steps, really I meant O(f(n)) steps. (That’s perfectly sufficient for proving a hierarchy theorem.)
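Scott's O(f(n)) point can be illustrated with a step-counting toy. This is only a sketch: `clocked_f` is a made-up name, double-precision `math.sin` is only honest for modest n (the real argument needs roughly log n extra bits of precision), and "steps" are charged at one per arithmetic operation. The idea is the standard clock construction: compute f(n) quickly, then count up to it, so the whole run takes O(f(n)) steps:

```python
import math

def f(n):
    return math.floor(n * (math.sin(n) + math.sqrt(7)))

def clocked_f(n):
    """Compute f(n) while acting as its own clock.

    First evaluate f(n) (charged as a constant number of unit steps),
    then idle until the total step count reaches f(n). The run therefore
    takes O(f(n)) steps overall -- the shape of a time-constructibility
    argument, not a rigorous one.
    """
    steps = 4          # pretend the mult, sin, sqrt, floor cost one step each
    value = f(n)
    while steps < value:
        steps += 1     # burn steps until exactly f(n) have elapsed
    return value, steps

v, s = clocked_f(1000)
assert s == max(v, 4)
```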

  39. scott Says:

    John: What I meant is that there’s no general algorithm that, given as input a statement about complex numbers with exponentiation, decides whether or not the statement is true. This follows from the fact that such sentences can talk about integers, and if you can talk about integers you can talk about Turing machines, and if you can talk about Turing machines you can formulate both Gödel sentences and the halting problem.

    (Unfortunately, the word “undecidable” has several senses — there’s Gödel’s sense of a theory that can formulate its own consistency statement, and there’s Turing’s sense of an algorithmically unsolvable problem. But I hope the above discussion clarifies what I meant.)

  40. Greg Kuperberg Says:

    At the risk of a bit of narcissism, I would enjoy a photo of the poster in its environment. (The inclusion graph is the fruit of a serious computer-assisted project.)

  41. David Moles Says:

    Well, once we have complex numbers, we can force a number n to be an integer, by saying that we want e^(2πin) to equal 1.

    I’m sure this is just a humanities major making a dumb arithmetic mistake, but could you point me to a fuller explanation of this? When I tried to solve for n I got -1/2.

  42. Scott Says:

David: If n = -1/2, then e^(2πin) = e^(-πi) = 1/e^(πi) = 1/(-1) = -1.

    e^(2πin) will be -1 if n = 1/2, 3/2, … (or -1/2, -3/2, …), while e^(2πin) will be 1 if n = 0, 1, 2, … (or -1, -2, …).
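Scott's arithmetic is easy to check numerically. A quick sketch using Python's cmath (`e2pin` is a made-up helper name; floating point, so compare up to rounding error):

```python
import cmath
import math

def e2pin(n):
    """Evaluate e^(2*pi*i*n) numerically."""
    return cmath.exp(2j * math.pi * n)

# Integers land on 1, which is how the formula "forces" n to be an integer...
assert abs(e2pin(3) - 1) < 1e-9
# ...while half-integers like David's n = -1/2 land on -1 instead.
assert abs(e2pin(-0.5) + 1) < 1e-9
```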