Chicken Soup for the Complexity Soul

From the comments on my last post:

scott would you be so kind, if you have some spare time, to post a list of textbooks that you’ve read in math and cs?

Now this is the kind of blog topic I like: zero expenditure of emotional energy required; lends itself to snarky one-liners. So here’s a list of math and CS books that I’ve read and you should too. Pitifully incomplete, but enough to get you started.

  • Computational Complexity, by Christos Papadimitriou
    The Iliad of complexity. An epic poem to read and reread until you can quote it by section number, until the pages fall out of the spine. Christos is not a textbook writer but a bard — and the only problem with his tale is that it ends around the late 1980s. Christos, if you’re reading this: we want an Odyssey!
  • Gems of Theoretical Computer Science, by Uwe Schöning and Randall Pruim
    The proofs are terse, but I love the division into short, digestible “gems.” Keep this one on your night table or by the toilet. (But not until you’ve mastered Papadimitriou and are ready to wield BP operators like a Fortnow.)
  • Quantum Computation and Quantum Information, by “Mike & Ike” (Michael Nielsen and Isaac Chuang)
    The best quantum computing book. I open it to the section on fidelity and trace distance pretty much every time I write a paper. (I’ve heard the other sections are excellent too.)
  • Randomized Algorithms, by Rajeev Motwani and Prabhakar Raghavan
    Chernoff bounds, random walks, second eigenvalues, PCP’s … a 1-1/e fraction of what you need to know about randomness.
  • Artificial Intelligence: A Modern Approach, by Stuart Russell and Peter Norvig
    Almost (but not quite) made me go into AI. My favorite chapter is the last one, which carefully demolishes the arguments of John Searle and the other confuseniks.
  • Complexity and Real Computation, by Lenore Blum, Felix Cucker, Michael Shub, and Steve Smale
    Decidability of the Mandelbrot set? P versus NP over the complex numbers? I may be a Boolean chauvinist, but I knows an elegant theory when I sees one.
  • The Book of Numbers, by John Conway and Richard Guy
    Since this is a popular book, obviously I couldn’t have learned anything new from it, but it was nice to “refresh my memory” about octonions, Heegner numbers, and why e^(π√163) is within 0.00000000000075 of an integer.
  • The Road to Reality: A Complete Guide to the Laws of the Universe, by Roger Penrose
    Preface: “Even if you hated fractions in elementary school, have no fear! I’ve tried to make this book accessible to you as well.”
    Chapter 2: “Consider a Lie algebra of sheaves over the holomorphic fibre bundle PZL(Zn,5)…” (Not really, but close.)
    I struggled through Penrose’s masterpiece, but by the end, I felt like I’d come as close as I ever had (and possibly ever will) to understanding “post-1920s” particle physics and the math underlying it. If you’re willing to invest the effort, you’ll find The Road to Reality so excellent that it “cancels out” Shadows of the Mind, like an electron annihilating its positronic companion.
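The near-integer claim in the Conway–Guy entry is easy to check for yourself. Ordinary double-precision floats can’t resolve it (the integer part alone has 18 digits), but Python’s standard decimal module can. A quick sketch, with π hard-coded to 50 digits:

```python
from decimal import Decimal, getcontext

# e^(pi*sqrt(163)) to 50 significant digits, standard library only.
# A float64 carries ~16 digits, too few to even represent the
# integer part of this number exactly.
getcontext().prec = 50
pi = Decimal("3.14159265358979323846264338327950288419716939937511")
x = (pi * Decimal(163).sqrt()).exp()

nearest = int(x.to_integral_value())
print(x)            # 262537412640768743.99999999999925...
print(nearest - x)  # about 7.5E-13, i.e. within 0.00000000000075
```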

32 Responses to “Chicken Soup for the Complexity Soul”

  1. Anonymous Says:

    Interesting that you are very interested in AI and Quantum Computers and Complexity Theory, but you don’t see any connection between ALL 3. Or do you? I do: I see a very strong nexus in Bayesian Networks (probabilistic graphs).

  2. Scott Says:

    When I applied to grad school six years ago, my application was all about how I wanted to forge a connection between quantum complexity and AI. :-)

    But I quickly realized there wasn’t that much to say, beyond trivial applications of Grover’s algorithm, graphical models with amplitudes instead of probabilities, etc.

    Having said that, I’m writing a paper right now about the learnability of quantum states, which uses sample complexity bounds from classical learning theory.

    I’ve also been tossing around an open problem for years, of whether there exists an efficient quantum algorithm to learn small-depth neural networks. See here for more.

  3. Anonymous Says:

    Why is the book on Bounded Arithmetic by Krajicek not here? See also the new book (draft) by Prof. Cook.

  4. Scott Says:

    They’re not here because I haven’t read them (maybe I should have, but I don’t work on proof complexity). They might be excellent. In general, feel free to suggest your own favorite math/CS books.

  5. Anonymous Says:

    “But I quickly realized there wasn’t that much to say, beyond trivial applications of Grover’s algorithm, graphical models with amplitudes instead of probabilities, etc.”

    Trivial is what trivial does. “graphical models with amplitudes instead of probabilities” is a fair definition of a quantum circuit (and therefore a quantum computer). That clause implies that Complexity Theory has only trivial things to say about quantum computing. I think your original instinct was correct. Luke, follow the force

  6. Scott Says:

    “graphical models with amplitudes instead of probabilities” is a fair definition of a quantum circuit (and therefore a quantum computer).

    That’s wrong if the graphical model can have undirected edges, but let’s ignore that.

    Suppose we accept that quantum complexity theorists have “really” been studying graphical models all along — i.e. that every result about a quantum circuit can be restated as a result about a quantum graphical model. Who cares? The question is not “Can we use this formalism?”, but rather “Does this formalism lead to nontrivial or surprising new results?” I’m not prejudging the answer to that question; I’m just saying it’s what I want an answer to.

    In other words: show me the meat, and I’ll start paying attention.

  7. Anonymous Says:

    Well, asking me to defend the importance of Bayesian Nets in quantum computation is like asking you to defend the importance of proving NP!=P in complexity theory. I could go on for hours. Here is just the beginning:
    (1) Why graphs? Basically, in quantum computation, one multiplies a vast number of very large, sparse matrices. Graphs encode visually, and very efficiently, which terms of the sparse matrices vanish.
    (2) Why Bayesian Nets? We want graphs that refer to probabilities and enforce causality. It’s also very convenient if you can use the SAME graph to discuss both the quantum and classical-probability versions of a problem, which is possible with B nets (e.g., look at a discussion of Bell’s inequality from the point of view of Bayesian nets) but not possible with the standard quantum circuit diagrams. (Don’t get me wrong: I am not advocating replacing the standard quantum circuit diagrams with B nets. I see the two types of diagrams as complementary.) Why is it convenient to discuss both the classical-probability and quantum versions from the same graph? I find that understanding both versions of a problem at the same time is very enlightening; in particular, it highlights those aspects of the problem that are exclusively quantum.
    (3) How are Bayesian Nets connected to AI? I defer to the very rich literature on the subject.
    (4) How are Bayesian Nets connected to Complexity Theory? Again, I defer to the rich literature on the use of directed acyclic graphs in programming algorithms, linear programming, combinatorial optimization, and complexity theory.
    (5) Is there an example where B nets resulted in the discovery of something in Quantum Information Theory that wasn’t already known (“Does this formalism lead to nontrivial or surprising new results?”)? Here is one example. The measure of quantum entanglement called squashed entanglement was discovered by looking at B nets: the discoverer noticed that the CMI (conditional mutual information) vanished for a particular B net in the classical-probability case, but not in the quantum case. He then proved that squashed entanglement (which is defined for any density matrix) reduces, for pure-state density matrices, to the Bennett et al. “entropy of entanglement”. He also showed that the non-negativity of squashed entanglement is guaranteed by the strong subadditivity entropy inequality. (By the way, the discoverer does not call it squashed entanglement, a name that he finds kind of dumb; he calls it CMI entanglement. The name “squashed entanglement” was given to the measure by others who used it much later, after the discoverer told them about it and his papers on it.)
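    Point (1) is easy to make concrete: a sparse matrix is exactly a weighted directed graph on its index set (one edge i→j per nonzero entry), and multiplying two such matrices only walks edges that actually exist. A minimal sketch in Python, using a hypothetical dict-of-dicts encoding (illustration only, not anyone’s library API):

```python
# A sparse matrix stored as a weighted digraph: row -> {col: value}.
# Multiplication follows only the edges that exist, which is the
# visual/efficiency point about graphs made above.

def sparse_mul(A, B):
    """Multiply two sparse matrices in dict-of-dicts form."""
    C = {}
    for i, row in A.items():
        out = {}
        for k, a_ik in row.items():               # edge i -> k in A
            for j, b_kj in B.get(k, {}).items():  # edge k -> j in B
                out[j] = out.get(j, 0) + a_ik * b_kj
        if out:
            C[i] = out
    return C

A = {0: {1: 2.0}, 1: {0: 3.0}}   # swaps coordinates, with weights
B = {0: {0: 5.0}, 1: {1: 7.0}}   # scales each coordinate
print(sparse_mul(A, B))          # {0: {1: 14.0}, 1: {0: 15.0}}
```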

  8. Wim van Dam Says:

    I’m surprised that you did not list Sipser’s Introduction to the Theory of Computation. Was that on purpose, or did you never read it? This must be one of the clearest books on theoretical computer science. When teaching a course on its topics, you could practically say to the students: “Read this book. Let me know if you have any questions.”

  9. Scott Says:

    Wim: I’ve only skimmed Sipser’s book, but it did look extremely good — I especially liked the “proof sketch” preceding each proof. (This is the peril of listing the books that I myself learned from, rather than attempting an “objective” list.) Thanks for bringing it up.

  10. Shengyu Says:

    I guess Scott also often refers to some other books like “The probabilistic method” (Alon and Spencer), “Communication complexity” (Kushilevitz and Nisan), and some tool books for information theory, matrix analysis, graph theory, combinatorics, cryptography…

    Scott, this looks like a very good topic. Maybe you and your readers can break it into topics and, within each topic, list books and give reviews. Like your Zoo, this may lead to a very useful resource for many researchers, from beginners to experts.

    For complexity, Oded Goldreich has also posted his new textbook on the web, with new material like undirected connectivity in L and a review of the new proof of the PCP theorem. Sanjeev Arora’s draft may cover more topics, though the current version seems far from complete (in terms of typos, references, …).

  11. Anonymous Says:

    tao, mossel, elkies.

  12. Greg Kuperberg Says:

    I can’t comment about completeness or mistakes in Arora’s book, but I think that the style of the book is great! I hope that he finds time to finish this project.

    The one exception is the end-notes on the quantum computation section. Those are not so good. But that’s forgivable in a more general text.

  13. Scott Says:

    Scott, this looks like a very good topic. Maybe you and your readers can break it into topics and, within each topic, list books and give reviews.

    Shengyu: You should start a website for that purpose. On this blog, I want posts that are actually useful to anyone to be the exception, not the rule. :)

  14. Nagesh Adluru Says:

    Scott, does my earlier request for your dating tips demand more emotional-energy expenditure than you’d like (or can afford)? :)

  15. Marc Says:

    This post has been removed by the author.

  16. Marc Says:

    Here are some books which I just love and consider myself much better off for having studied.

    Kreyszig: Functional Analysis.
    This book is extremely thorough and very readable. It covers the main FA theorems and focuses on Spectral Theory of operators… Plus chapter 11 is “Unbounded Linear Operators in Quantum Mechanics”.

    Horn and Johnson: Matrix Analysis
    All the linear algebra an applied mathematician or computer scientist will need.

    Hardy and Wright: An Introduction to the Theory of Numbers
    Some of the best mathematical writing there is. I always keep this book around for when I can’t sleep at night or am bored. It covers the mathematics of the game of Nim!

    Nocedal and Wright: Numerical Optimization
    Covers everything you could need in basic optimization theory: linear programming, nonlinear programming, CG, etc. And it is very readable.

    Of course, there is my favorite paper by a fellow poster to Scott’s blog:
    An Introduction to the Conjugate Gradient Method Without the Agonizing Pain

    There are more which I love from analysis, linear algebra, optimization, and complex variables; however, the books above were indispensable to my education. If you are looking for other recommendations, let me know and I will see if I can help.

  17. L Says:

    If there’s a linear algebra book you love, it’s your duty to share it with the world.
    Also, I find it hard to reconcile that statement with the statement that it wasn’t indispensable to your education. My best theory is that you loved the book, but learned linear algebra elsewhere.

  18. Marc Says:


    L: There are a lot of books I enjoy reading but did not actually learn the material from. An example of this is “Linear Algebra Done Right” by Axler.

    Here spectral theory is taught as it should be, without that pesky determinant. Proofs are more straightforward and more theory-based than computation-based.

    I also refer to Lax’s Linear Algebra for slick proofs of more advanced linear concepts such as matrix calculus, Duality, etc.

    On a side note, has anyone else observed the interesting evolution in mathematical education of the determinant?

    In high school, the determinant is easy: given a 2×2 or 3×3 matrix, we follow the formula. In undergrad, it becomes slightly more confusing, with different algorithms for n×n matrices. Also, it is mentioned that it can be used to find volumes of parallelepipeds. Finally, in grad school it becomes the “unique skew-symmetric multilinear form”, which we can figure out gives us the above and corresponds to volume…
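    The high-school formula and the grad-school “volume” story can be checked against each other in a few lines. A small sketch with hand-rolled helpers (names made up for illustration, no library assumed):

```python
# The high-school 3x3 determinant formula agrees with the signed
# volume of the parallelepiped spanned by the rows, i.e. the
# scalar triple product a . (b x c).

def det3(m):
    a, b, c = m
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
          - a[1] * (b[0] * c[2] - b[2] * c[0])
          + a[2] * (b[0] * c[1] - b[1] * c[0]))

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

rows = [(2, 0, 0), (0, 3, 0), (1, 1, 4)]      # a sheared 2x3x4 box
print(det3(rows))                             # 24
print(dot(rows[0], cross(rows[1], rows[2])))  # 24, the same volume
```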

  19. Anonymous Says:

    A question about your algebra of books: Does Mathematica cancel out A New Kind of Science?

  20. Anonymous Says:

    I love Sanjeev’s use of “lowerbound” as one word.

    There was some amusement at MIT recently about this topic. I was writing a paper entitled “Higher Lower Bounds…”, and it took a non-complexity theorist to notice. After laughing about it for a while (and considering I hadn’t slept in 36 hours), the paper ended up with the title “Higher Lower Bounds for Near-Neighbors and Further Problems”.


  21. Scott Says:

    A question about your algebra of books: Does Mathematica cancel out A New Kind of Science?

    Now that I think about it, my algebra is non-commutative. First you sin, then you repent. It doesn’t work the other way around. The “virtuous” work might be what provides an audience for the “indulgent” one in the first place, but this is not the medieval Catholic Church: no buying of intellectual indulgences in advance.

  22. L Says:


    I’d been hearing the name of Axler’s book for years, but I hadn’t met anyone who’d read it. And I hadn’t heard of Lax’s book.

  23. Anonymous Says:

    Am I the only one who just doesn’t like Papadimitriou’s book? Yes, for a long time it was the only book out there, but I have never found it useful for learning the material, nor for re-learning it.

    As for randomized algorithms, the Motwani-Raghavan book is ok, but the new book by Mitzenmacher-Upfal is fabulous!

  24. David Molnar Says:

    So what kind of fiction do you read?

  25. aram harrow Says:

    To echo marc, I took linear algebra from Axler using his book (Linear algebra done right). It ruled.

  26. Scott Says:

    So what kind of fiction do you read?

    Most recently Portnoy’s Complaint. Over the past year I’ve also enjoyed several of Rebecca Goldstein’s academic novels.

    But the sad truth is I’ve been reading very little fiction lately. When I do read for pleasure, I tend toward histories, biographies, and Dave Barry.

    I went through a month-long phase back at Berkeley when I couldn’t do any research; all I could do was read one Shakespeare play after the next. Thankfully I recovered from that.

    My all-time favorite is Mark Twain.

  27. Cheshire Cat Says:

    Shame on you, Scott. Reading math is not reading for pleasure ?!

  28. Anonymous Says:

    My favorite math book of all time is Elements of Information Theory by Cover and Thomas. It is so clearly written that even a non-information theorist could learn a lot from it in the way of mathematical exposition.

    Another favorite is Gilbert Strang’s Introduction to Applied Mathematics. Great bedtime reading.

  29. Scott Says:

    Shame on you, Scott. Reading math is not reading for pleasure ?!

    Shame on you, cheshire cat. It is?!

  30. rdv Says:

    Scott, did you really get out of Berkeley without reading Hennessy & Patterson, or did you deliberately exclude non-theory books?

    How ’bout Feynman Lectures on Computation (I had the privilege of taking the course that the book is based on), and the follow-on Feynman and Computation? Both are just pleasurable reads. Material from Mead, Hillis, Sussman, Bennett, Benioff…

  31. Scott Says:

    Scott, did you really get out of Berkeley without reading Hennessy & Patterson


    (Sorry, PTSD)

  32. Helger Says:


    Experiences likely to induce the condition include:

    * Bad drug experiences (LSD, Magic Mushrooms)

    Explains some of the stories on this blog :-)