Why complexity is better than cannabis

Whether there exist subexponential-size locally decodable codes, and sub-n^ε-communication private information retrieval (PIR) protocols, have been major open problems for a decade. A new preprint by Sergey Yekhanin reveals that both of these questions hinge on — wait for this — whether or not there are infinitely many Mersenne primes. By using the fact (discovered a month ago) that 2^{32,582,657}−1 is prime, Yekhanin can already give a 3-server PIR protocol with communication complexity O(n^{1/32,582,658}), improving the previous bound of O(n^{1/5.25}). Duuuuuude. If you’ve ever wondered what it is that motivates complexity theorists, roll this one up and smoke it.

35 Responses to “Why complexity is better than cannabis”

  1. Anonymous Says:

    Lance had a post about Mersenne primes in January. People might find it interesting.

  2. Osias Says:

    I could say I read complexity texts to avoid insomnia, but it could offend the authors, so…

  3. Anonymous Says:

    I still say cannabis wins.

  4. Anonymous Says:

    I heard this result is illegal in some countries :)

  5. Joseph Says:

    Speaking as a member of the control group of the drug-experimentation generation, maybe I should stop reading this blog.

  6. Anonymous Says:

    I am not sure it is relevant to the comparison with cannabis, but can you tell us something about this truly amazing result? (Is it correct?)

  7. Scott Says:

    I haven’t verified the result, but it seems likely to be correct.

    The main things you need to know are the following:

    A k-query locally decodable code (LDC) is an error-correcting code with the property that, if the receiver only wants to know 1 bit of the decoded message (say, the 473rd bit), then she can learn it by examining only k bits of the encoded message that was sent. Here k is a constant independent of the length of the message.

    We know that 2-query locally decodable codes can exist, provided the encoded message is exponentially longer than the original one. We also know that you can’t do any better with 2 queries. But whether you can do better with 3 queries has been an open problem for a decade. This is the question that Yekhanin gives an affirmative answer to, assuming there are arbitrarily large Mersenne primes.
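    To make the 2-query case concrete, here is a toy sketch (my own illustration, not from Yekhanin's paper) of the Hadamard code, the standard example of a 2-query LDC with exponentially long encoding: a message x of n bits is encoded as all 2^n inner products ⟨x, a⟩ mod 2, and any single bit x_i can be recovered from just two correlated positions of the codeword.

```python
import random

def hadamard_encode(x_bits):
    """Encode n bits as 2^n bits: position a holds the inner
    product <x, a> mod 2 over GF(2)."""
    n = len(x_bits)
    code = []
    for a in range(2 ** n):
        bit = 0
        for j in range(n):
            bit ^= x_bits[j] & ((a >> j) & 1)
        code.append(bit)
    return code

def decode_bit(codeword, n, i):
    """Recover x_i from only 2 positions of the codeword.

    Since <x, a> XOR <x, a XOR e_i> = x_i for every a, two random
    correlated queries suffice; against a corrupted codeword one
    would repeat this with fresh randomness and take a majority vote.
    """
    a = random.randrange(2 ** n)
    return codeword[a] ^ codeword[a ^ (1 << i)]

x = [1, 0, 1, 1]
cw = hadamard_encode(x)  # 16 bits encoding a 4-bit message
print([decode_bit(cw, len(x), i) for i in range(len(x))])  # → [1, 0, 1, 1]
```

    Note the blowup: 4 message bits already cost 16 codeword bits, which is exactly the exponential price the 2-query lower bound says you must pay.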

    Closely related to locally-decodable codes are private information retrieval (PIR) protocols. Imagine a movie download service with n movies. We’d like this service to send you a movie i of your choice, without the service learning anything about which of the n movies you’re interested in! (That’s the privacy requirement.) Information-theoretically (that is, with no computational assumptions), it’s easy to see that this is only possible if the download service sends you all n movies, so that you can then watch whichever one you want.

    But what if there are 2 or more identical download servers, which are guaranteed not to communicate with each other? In that case, it’s known that you can do better by having the servers send you information in a subtle correlated way. For example, there exists a 2-server protocol in which you get the movie you want, neither server learns which movie you want, and the amount of information sent to you is the equivalent of only n^{1/3} movies.
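    Here is a minimal sketch (again my own illustration, not the O(n^{1/3}) protocol above) of the classic 2-server XOR trick: the client sends each server a random subset of the database, the two subsets differing only in item i. Each mask alone is uniformly random, so neither server learns anything about i; the XOR of the two answers is exactly the bit the client wanted.

```python
import secrets

def client_queries(n, i):
    """Send server A a uniformly random n-bit mask, and server B the
    same mask with bit i flipped. Each mask, viewed alone, is
    uniformly distributed, so neither server learns i."""
    mask_a = secrets.randbits(n)
    return mask_a, mask_a ^ (1 << i)

def server_answer(db_bits, mask):
    """A server XORs together the database bits selected by its mask."""
    ans = 0
    for j, b in enumerate(db_bits):
        if (mask >> j) & 1:
            ans ^= b
    return ans

db = [1, 0, 0, 1, 1, 0]   # toy 6-bit "movie" database
i = 3
qa, qb = client_queries(len(db), i)
# The two answers differ exactly in db[i]'s contribution.
print(server_answer(db, qa) ^ server_answer(db, qb))  # → 1 (= db[3])
```

    Of course this baseline sends an n-bit query to each server, so it doesn't beat the trivial download bound; the point is only the privacy mechanism. The O(n^{1/3}) two-server protocol achieves the same privacy with far less communication via a cleverer correlation.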

    Throw in a third or fourth server, and the amount of information goes down to n^c movies for smaller values of c. But can the amount of information be less than n raised to any fixed power — for example, n to the 1/loglog(n) power? This is another question that Yekhanin gives an affirmative answer to, again assuming that there are infinitely many Mersenne primes.

  8. Scott Says:

    “Speaking as a member of the control group of the drug-experimentation generation, maybe I should stop reading this blog.”

    Chill out, dude. I said complexity is better than cannabis…

  9. Osias Says:

    I don’t get why the LB went from O(n^{1/5.25}) to O(n^{1/32,582,658}). I thought it should come from the previously discovered M-prime, a lot bigger than 5.25.

  10. Scott Says:

    I don’t understand your question. Firstly, it’s an upper bound, not a lower bound…

  11. Anonymous Says:

    “I don’t get why the LB went from O(n^{1/5.25}) to O(n^{1/32,582,658}). I thought it should come from the previously discovered M-prime, a lot bigger than 5.25.”

    No-one knew the problem was connected with Mersenne primes before.

  12. Osias Says:

    >> it’s an upper bound,
    >> not a lower bound…

    OOppps, my mistake…

    > No-one knew the problem was
    > connected with Mersenne
    > primes before.

    Thanks.

  13. ano Says:

    There’s no good weed in Waterloo… probably easier to find a good complexity theorist than a good drug dealer!

    :)

  14. B. Prendergast Says:

    Drugs are funny. It is amusing to see people who are addicted. The irreversible effects of drugs on the neurological system are especially hilarious.

  15. A little night musing Says:

    //No-one knew the problem was connected with Mersenne primes before.//

    So is proving that there is an infinite number of Mersenne primes now a more interesting thing to do?

    Hmmm…

  16. Anonymous Says:

    “Drugs are funny. It is amusing to see people who are addicted.”

    I think we can make jokes about marijuana because it is not “addictive” in the sense that cocaine, nicotine, alcohol, caffeine, etc. are.

    Of course, I may have totally misinterpreted your post as sarcastic. More egregiously though, I may have started a drug debate on a complexity blog. Or maybe that was started by the author of “Why complexity is better than cannabis.”

    But I’m probably just attempting to stealthily solicit opinions from complexity theorists on anti-marijuana laws.

  17. Bram Cohen Says:

    Not to be a party pooper or anything, but n^(1/31) is so phenomenally small that any larger Mersenne primes are of no practical significance.

    I was even able to find 31 by hand.

  18. Anonymous Says:

    Complexity theory is funny. It is amusing to see people who are addicted. The irreversible effects of complexity theory on the neurological system are especially hilarious.

  19. scott Says:

    Braithwaite: Are jokes about alcohol (probably the #1 killer drug) also verboten?

  20. Anonymous Says:

    #1 killer is tobacco i think

  21. Anonymous Says:

    Tobacco is the #1 killer by a mile. It isn’t terribly clear that cannabis or caffeine cause any deaths, ever.

  22. scott Says:

    Once we factor in drunk driving accidents, does alcohol or tobacco account for more years of life lost? (I realize such things are hard to quantify.)

  23. Anonymous Says:

    Tobacco knocks several years off the lives of people who smoke regularly. To get an equivalent amount of damage from drunk driving deaths you’d need about one in twenty people to die in a car accident.

  24. A little night musing Says:

    It may or may not be true that complexity theory is better than cannabis, as Scott’s title claims, but these comments seem to demonstrate that thinking about complexity theory is definitely much more fun than thinking about cannabis (and other drugs).

    I’m still getting a contact high off this article.

  25. Douglas Knight Says:

    Tobacco kills an order of magnitude more people than drunk driving, but drunk drivers are young. If their victims are the same age (which is probably reasonable–passengers and people travelling at night), then it takes off an order of magnitude more years. The race is pretty close in the US, based on extrapolations from incomparable numbers I got from wikipedia. I would guess that tobacco easily wins in Europe.

  26. Douglas Knight Says:

    and the car fatality rate in the US was a lot higher a few decades ago. I’d guess that alcohol used to win.

  27. scott Says:

    Douglas: Yeah, that’s exactly what I was thinking.

  28. John Sidles Says:

    Isn’t addiction to ignorance more common, more enjoyable–and in the final analysis, more destructive–than substance addiction?

    Yet ignorance-dependence is remarkably well-tolerated in almost all human societies … especially academia!

  29. Anonymous Says:

    Ignorance is a broad, general, abstract word. Everyone is ignorant of a great many things. I am ignorant of what is occurring now at the north pole of the planet Neptune. I am also ignorant of what is occurring in the leaves near the shed in my neighbor’s backyard. To say “addiction to ignorance” is to say a phrase that conveys very little meaning.

  30. Greg Kuperberg Says:

    If complexity is better than weed, let me throw out this question (maybe better for the “annoying questions” list):

    I get the sense that if there were a classical algorithm for Simon’s problem (HSP in (Z/2)^n) or Shor’s problem (HSP in Z), then it would be able to “learn too much” in some sense. Can we conclude that if one of these problems is in BPP, which is implied by BQP = BPP, then some other dire thing happens, such as: One-way functions do not exist, or SZK = BPP, etc.?

  31. scott Says:

    Greg: Even if you’re more generous and assume BPP=BQP itself, we don’t know how to prove any drastic consequences like BPP=SZK or PH collapses.

    The problem is this: BPP=BQP doesn’t imply that classical computers can solve Simon’s problem; it implies that they can solve any explicit instantiation of Simon’s problem. And we don’t know that many explicit instantiations of Simon or Shor-like problems — basically, we have the ones derived from explicit abelian groups (like factoring and discrete log, as well as elliptic curve problems). And these problems are too specific to derive general complexity conclusions from their tractability. They’re not complete for any reasonable class, and they don’t even seem to be needed for public-key cryptography (as witnessed e.g. by Oded’s system).

  32. John Sidles Says:

    Anonymous sez: Everyone is ignorant of a great many things … to say “addiction to ignorance” is to say a phrase that conveys very little meaning.

    Hmmm … the vocabulary for discussing agnotology is still evolving. So maybe “ignorance-dependent” is a better phrase?

    The point being that information theory and complexity theory are confirming that there is much more knowledge out there than individual minds can assimilate (a point insufficiently emphasized in the undergraduate curriculum).

    This includes not only an exponentially large body of contingent knowledge (the kind of knowledge that Rutherford famously dismissed in saying “All science is either physics or stamp collecting.”), but uniquely to the 21st Century, an exponentially large body of a priori knowledge (mathematical theorems), and finally, an exponentially large body of fundamental physical laws (the “landscape”).

    Needless to say, this 21st Century informatic challenge to the traditional reductionist picture of science and mathematics raises important new issues and research opportunities in … complexity and information theory!

    As a practical example, how much complexity can an open market tolerate before that market no longer provides an enabling platform for individual freedom, social equity, and efficient enterprise?

    Most generally, how can the foundations of the Enlightenment survive the onslaught of 21st Century complexity? These foundations being (historically) that “The ideals of equality (sexual and racial), democracy, individual liberty, and a comprehensive toleration [] are concretely superior in terms of reason and moral equity” to the “nightmare world apt to result from enshrining as a new set of privileged and prevailing values ‘difference,’ a thoroughgoing relativism, and the doctrine that all sets of values, even the most questionable kinds of theology, are ultimately equivalent.”

    So an emerging central issue of our times (which is really an old issue), namely “who do you trust with knowledge?” is acquiring important new aspects and depths from complexity theory and information theory. It is becoming practically equivalent to, how will 21st Century knowledge be held, shared, and applied?

    And to loop back to the start, perhaps the only human pleasure more comforting, and more universally accessible, than a drink, a smoke, or a cup of coffee in the morning, is a wholly unjustified confidence in one’s own choice of subjects of which to be ignorant. A pleasure that I embrace as fervently as anyone!

    But ignorance-dependence, like alcohol-dependence and substance-dependence, is in the final analysis nothing to joke about. A planet with 10 billion people on it cannot afford to “hit bottom” before acknowledging that it has an ignorance-dependence problem.

  33. Anonymous Says:

    Off-topic:

    Maybe I haven’t been paying attention, but what happened to Lev’s prizewinning question?

  34. Scott Says:

    It’s-a-comin’…

  35. Anonymous Says:

    “Isn’t addiction to ignorance more common, more enjoyable–and in the final analysis, more destructive–than substance addiction?”

    I agree with the other anon. This is a vacuous statement. We are all ignorant of a great many things and in fact there is no way to not be ignorant of something.