Five announcements

  1. For the next two weeks, I’m in Berkeley for the Simons program “Challenges in Quantum Computation” (awesome program, by the way).  If you’re in the Bay Area and wanted to meet, feel free to shoot me an email (easiest for me if you come to Berkeley, though I do have a couple planned trips to SF).  If enough people wanted, we could even do a first-ever dedicated Shtetl-Optimized meetup.
  2. More broadly: I’ve finally finished my yearlong sabbatical in Israel.  At some point I’ll do a post with my reflections on the experience.  I’ll now be traveling around North America all summer, then returning to UT Austin in the fall.
  3. Longtime friend-of-the-blog Boaz Barak, from a university in Cambridge, MA known as Harvard, asks me to invite readers to check out his new free draft textbook Introduction to Theoretical Computer Science, and to post comments about “typos, bugs, confusing explanations and such” in the book’s GitHub repository.  It looks great!
  4. This is already almost a month old, but if you enjoy the quantum computing content on this blog and wish to see related content from our carefully selected partners, check out John Preskill’s Y Combinator interview.
  5. Here’s the text of Senator Kamala Harris’s bill, currently working its way through the Senate, to create a US Quantum Computing Research Consortium.  Apparently there’s now also a second, competing quantum computing bill (!)—has anyone seen the text of that one?

Update (June 16): Even though I said there wouldn’t be a meetup, enough people eventually emailed wanting to have coffee that we did do the first-ever dedicated Shtetl-Optimized meetup after all—appropriately, given the title of the blog, at Saul’s Delicatessen in Berkeley. It was awesome. I met people working on fascinating and important things, from cheap nuclear energy to data analytics for downballot Democrats, and who I felt very proud to count as readers. Thanks so much to everyone who came; we’ll have to do another one sometime!

25 Responses to “Five announcements”

  1. Daniel Bilar Says:

    Hi Prof. Aaronson and fellow readers

    I hope this note finds you all well. Re point 3): that HC/HU CS 121 course was mentioned in Prof. Barak’s blog post “Theory Life Hacks II”: https://windowsontheory.org/2018/06/06/theory-life-hacks-ii/

    If you liked the course / slides, the post is a useful rundown of the technological tools he used. Excerpt:

    “These days I use markdown for almost everything. For me it is basically plaintext where I can also write math between dollar signs. I use markdown to write all my lecture notes including my 500+ pages intro to TCS book. I use it to write short technical notes and emails: in particular the Markdown Here extension lets me write math in gmail, and Hackmd is a good way to share short notes. Finally, I am also using it to produce presentations for my courses. (For one-off presentations such as talks, I still prefer PowerPoint.)

    For much of these I use pandoc to transform markdown into various formats, including latex/pdf, html, and html slides. I still use Atom as my main editor, and it’s only become better over time. I use the sync-setting package to synchronize settings between machines and autosave to avoid losing my work. It took me forever to figure out how to enable spell checking using the spell-check package on Windows. The trick is to enable the SPELLCHECKER_PREFER_HUNSPELL environment variable.”
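    To make the pandoc part of that excerpt concrete, here is a minimal sketch in Python (the file names and the use of subprocess are my own illustration, not from Prof. Barak’s post; it assumes pandoc is on the PATH, plus a LaTeX installation for the PDF target):

        # Convert one markdown source into PDF, standalone HTML, and HTML slides,
        # roughly as described in the excerpt above.
        import subprocess

        def render(source_md: str) -> None:
            base = source_md.rsplit(".", 1)[0]
            jobs = [
                ["pandoc", source_md, "-o", base + ".pdf"],         # PDF via LaTeX
                ["pandoc", "-s", source_md, "-o", base + ".html"],  # standalone HTML
                ["pandoc", "-s", "-t", "revealjs",
                 source_md, "-o", base + "-slides.html"],           # HTML slides
            ]
            for cmd in jobs:
                subprocess.run(cmd, check=True)

        render("lecture01.md")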

    There’s more and also a part 1.

    Thanks and have a great day
    Daniel Bilar

  2. birdbrain Says:

    I would come to a meet up if that makes any difference 🙂

  3. Eel Gardener Says:

    If the meetup were after the 24th, I would come!

  4. Edan Maor Says:

    Looking forward to your thoughts on the Israel trip!

    Point number 3 brought to mind something I meant to ask you: what is your go-to recommended textbook these days for introducing people to CS (specifically computability/complexity theory, not necessarily quantum computing yet), for someone with some math/CS background but not necessarily a formal one?

    I ask because I want to (re)learn computability. I did my degree with Sipser, which was great, and my default is to go back to that, but so many new textbooks have come out that I’m wondering if I should read something more updated.

  5. Daniel Says:

    Hi Scott,
    Just curious, are you interested in blockchain? What do you think of it?

  6. fred Says:

    Will be interesting to hear your thoughts on living in Israel…
    I was just reading an article yesterday on the widening divide between the US and Israeli Jewish communities.

  7. Scott Says:

    Daniel #5: Yes, of course I’m interested; cryptocurrency now provides one of the main application areas for a lot of what we think about in theoretical computer science.

    On the other hand, every time I think about it, I just get depressed by the fact that I didn’t buy Bitcoin a decade ago, so then I think about something else instead. 🙂

    I was aware of Bitcoin early on, but I knew it mostly as something I had to discuss in talks about quantum money, to explain why quantum money could (eventually, once it became technologically feasible) be better than a blockchain.

    I also tried to set up a Bitcoin wallet years ago, when someone for whom I’d done some consulting work wanted to pay me in Bitcoin, but I found it complicated and onerous beyond belief, until I finally gave up and insisted on being paid by check (Bitcoin has since then increased more than 100x in value). It was hard to imagine that anyone would want to use this as a regular medium of exchange. Which—indeed—they don’t! The delays and transaction fees make it wildly impractical for everyday use. But, I suppose like almost everyone else, I’d failed to factor in human psychology, to foresee how excited (rightly or wrongly) everyone would become, and how much that would drive up the value.

    Incidentally, I have a couple friends who did have the foresight to invest in Bitcoin early. But alas, their investment was housed in … Mt. Gox. I know others who invested in Bitcoin early, but also probably in a hundred other crazy-sounding things that lost money … they’re just that kind of person! The whole thing has a Wild West character that would probably require changing my personality to fully get into.

  8. Scott Says:

    fred #6: Do you have a link to that article?

  9. fred Says:

    It was in a printed newspaper here in NYC, about a recent poll – http://jewishweek.timesofisrael.com/poll-shows-deep-divide-between-israeli-and-american-jews-on-trump/

  10. Scott Says:

    Eel Gardener #3 and others: Sorry for the delay! A few readers wanted to meet, but probably not enough to justify a full “meetup.” So, tell you what: please just shoot me an email if you’d like to meet for breakfast or lunch or coffee or something in Berkeley this weekend.

  11. Scott Says:

    Edan #4: For introductory computability theory, Sipser’s textbook is still my favorite among those I’ve seen. For more advanced computability theory (e.g. intermediate degrees, computable measure theory and probability theory, etc.), I’m not sure—to be honest, I haven’t yet found any textbooks that are as clear and accessible as I’d like. (Maybe someone else has a suggestion?)

    If, on the other hand, you meant “computability theory” in a broader sense, also encompassing complexity theory, then there are lots of great books to choose from. Papadimitriou’s Computational Complexity is a classic. Arora-Barak and Mertens-Moore are more modern, up-to-date treatments that I strongly recommend. And a personal, idiosyncratic favorite is Uwe Schöning’s “Gems of Theoretical Computer Science.”

  12. Jalex Stark Says:

    > For more advanced computability theory (e.g. intermediate degrees, computable measure theory and probability theory, etc.), I’m not sure—to be honest, I haven’t yet found any textbooks that are as clear and accessible as I’d like. (Maybe someone else has a suggestion?)

    When I took the computability course at Caltech, the professor’s official recommendation was to not attempt to read any textbook. The relevant wikipedia pages aren’t too horrible, though. There are also some fairly active logicians on math.stackexchange who can answer basic questions / point people to the literature on very specific subjects.

    You can also get some good computability content by reading descriptive set theory books, of which there’s at least one decent one (by Kechris). This also gives one the background needed to read some more modern logic papers, which may contain the computability results one is looking for.

  13. Jalex Stark Says:

    Also #4: If what you mean by computability is really _complexity_, then I recommend Papadimitriou’s book. Arora and Barak is a more standard suggestion (in part because it has a lot of important results that weren’t around when Papadimitriou wrote his book), but I found both the prose and the problem statements in Papadimitriou significantly more compelling. Also, I think the difficulty throughout is pretty well-tuned for someone who’s already worked through the complexity material in Sipser.

  14. mjgeddes Says:

    Yes,

    Cryptocurrency is fascinating indeed, Scott. A mixture of three fields there, I think: the blockchain itself is a distributed database technology; the crypto is in the domain of computational complexity theory (coding theory); and the financial side of things is economics. A heady mix.

    I really don’t think it was rational to invest in Bitcoin though. Some people got rich, but basically, in my view, they got lucky. You know I won the NZ national lotto twice? That doesn’t mean it was rational for someone to spend up on buying tickets with my winning numbers prior to the draw. It was just luck. Same with bitcoin I think. No rational reason, it’s just a craze where some people got lucky.

    As regards intros to the theory of computation and complexity theory, one could try my wiki-books of collected Wikipedia links 😉 Here:

    https://en.wikipedia.org/wiki/User:Zarzuelazen/Books/Reality_Theory:_Computation%26Complexity

    Essentially, 3 fields rolled into one there:

    Information&Coding Theory (inc. Cryptography)
    Computational Complexity Theory
    Automata Theory

    Closely linked, yes? Information & Coding Theory is the practical level; Computational Complexity and Automata Theory are the theoretical levels (fine-grained and coarse-grained, respectively).

    Note the pattern! I always split a knowledge domain into 3 sub-levels: 1 practical level, 2 theoretical levels (a fine-grained one, and a more abstract one).

    The strongly suggestive links between cellular automata (Conway’s Game of Life, for instance) and evolution and fitness landscapes make me think that cellular automata should be the key computational model for AI research (specifically, the study of optimization)!

  15. James B Says:

    Unrelated to anything here, but has there been any further progress on bounding the smallest n for which the value of BB(n) is independent of ZFC? (Or, a better question: “what should I type into Google to find the answer myself?”)

  16. Edan Maor Says:

    Thanks everyone for the suggestions! I did indeed mean both computability theory as well as complexity theory, so many relevant suggestions.

    Actually, that brings to mind a follow-up question – how much computability should I learn before starting with complexity theory? Or an even more pointed question – I am under the impression that computability theory basically hasn’t advanced much in the last while, and most people are working more on results in complexity, which is the “natural progression” after computability. Is this impression right? Or are there still fundamental *computability* results that are being worked on?

  17. Scott Says:

    Edan #16: I think it’s a fair statement that most of the remaining open problems in computability theory, the ones worked on today, are rather abstruse and hard for outsiders to appreciate. Certainly they have nothing analogous to P vs. NP. Furthermore, modern work in computability theory is mostly done in math departments; within CS departments, almost all work in theoretical computer science since the 1970s has involved considerations of complexity and efficiency.

    Having said that, there is great work being done today at the intersection of computability theory with probability theory, measure theory, and (so I’m told…) programming language semantics. And many of the classic results of computability theory (not just Church and Turing, but the later work on recursively enumerable sets, Busy Beaver, Kolmogorov complexity, Chaitin’s Ω…) were triumphs of human reason. So it all depends on what you’re interested in and where you want to go.

    Yes, you could study computability first, but you could also jump straight to complexity after only the tiniest smattering of computability. There are certain topics in complexity, like the polynomial hierarchy and relativization and intermediate degrees, that will make more sense if you’ve first seen the analogous topics in computability, but even there you could, anachronistically, learn the complexity versions first and then the computability versions (as Lance Fortnow once put it, “the Friedberg-Muchnik Theorem is just Ladner’s Theorem for computability!”).

  18. Scott Says:

    James B #15: I believe that Stefan O’Rear, building off my and Adam Yedidia’s paper, eventually managed to get things down to a 748-state Turing machine that halts iff ZFC is inconsistent (improving over my and Adam’s bound by an order of magnitude), and that that still stands as the world record. For more, see Stefan’s GitHub page (he never wrote up a research paper, but you can find all his code and documentation there).

  19. Daniel Says:

    Scott #7,
    Thanks for sharing your interesting experience with Bitcoin 🙂
    Coming from a Physics/Maths background, I can’t help but notice that there might be something we are able to learn from the blockchain. Firstly, solving double-spending is mainly about getting all the nodes of a distributed system to agree on the order of transactions. This sounds quite analogous to how clocks are synchronized in relativity, since there is no such thing as absolute time (no trusted third party in the blockchain to determine the universal time order).
    Secondly, the older transactions are more certain than the later ones, as they are hashed many more times. (A good analogy is amber.) So the entropy is lowest for the earliest transactions and gradually increases with time. This sounds like an arrow of time, arising naturally from some kind of repeated operation (a minimal sketch of this chaining is below).
    Thirdly, it has been shown recently that chaotic stirring of a viscous fluid satisfies the properties of a hash function.
    Then, of course, there is the problem of double-spending without a trusted third party, which was first solved by Bitcoin in the classical domain. People then recognised that the quantum domain naturally allows a solution to the problem, thanks to the no-cloning theorem, which leads to quantum money, a topic you have worked on. But a significant difference between Bitcoin and quantum money is that in the former there are only operations on ledgers, without any coin literally moving around; the ledgers themselves are the coins. In quantum money we need quantum channels, and perhaps things are happening through these channels.
    Yeah, in short, I think physics could be inspired by some of the cool ideas in blockchain.
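    Here is that sketch, in Python (illustrative only; a real blockchain adds signatures, proof-of-work, and a consensus rule on top of this structure). Each block’s hash depends on everything before it, so an old transaction gets folded into every later hash, and altering it changes the entire rest of the chain:

        # Minimal hash chain: each entry's hash depends on the previous hash,
        # so tampering with an early entry invalidates everything after it.
        import hashlib

        def block_hash(prev_hash: str, payload: str) -> str:
            return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

        def build_chain(payloads):
            chain, prev = [], "0" * 64  # conventional all-zero "genesis" hash
            for p in payloads:
                prev = block_hash(prev, p)
                chain.append(prev)
            return chain

        original = build_chain(["tx1", "tx2", "tx3"])
        tampered = build_chain(["tx1-altered", "tx2", "tx3"])
        # Changing the oldest payload changes every subsequent hash:
        assert all(a != b for a, b in zip(original, tampered))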

  20. mjgeddes Says:

    Close analogies between computational logic, probability theory and complexity theory become apparent when you split each into 3 levels of abstraction! Look:

    Computational logic_______Probability&Stats_________Complexity&Computation:

    Functional Programming____Statistics_______________Information Theory/Crypto
    Many-valued logic_________Probability______________Complexity
    Grammar frameworks______Stats Models____________Computability/Automata

    See the 3 levels? First line is practical (procedural knowledge). Second line is low-level theoretical. Third line is high-level theoretical.

    Conjecture: Equivalence between grammar frameworks, stats models (graphical, Markov) and abstract machines (cellular automata)?

    More interesting still, the third-line domains seem to blur into artificial intelligence/cog-sci fields, thusly:

    Inference_____________AI

    Grammar frameworks___Reflection/Conceptual Trees
    Stats models__________Prediction/Neural Networks
    Automata_____________Optimization/Fitness Landscapes

  21. Roy Abrams Says:

    Rather impressed that one of our legislators has the foresight to encourage investment in a program that will pay dividends over the long term, and whose tangible benefits are still hypothetical at this point. It’s not the typical pork appropriation you usually see.

  22. Stella Says:

    When I took my first advanced computability theory course (based on Scott’s topical characterization), we used a manuscript that Richard Shore had put together. We used it more as a reference than as a textbook, but the parts of it I read were helpful and enlightening. I looked around Prof. Shore’s website and there’s no mention of the manuscript, and he hasn’t released a book recently, so I don’t want to share the document myself. The course was in early 2016, so hopefully it’ll be published soon and y’all’ll be able to access it. Prof. Shore had provided it to my professor to beta-test on the course.

    The table of contents is:
    0 Introduction
    1 An Overview
    2 The Basics
    3 The Turing Degrees
    4 R.E. Sets and the Turing Jump
    5 Embeddings into the Turing Degrees
    6 Forcing in Arithmetic and Recursion Theory
    7 Theories of D and D(\leq 0′)
    8 Domination Properties
    9 Minimal Degrees and Their Jumps
    10 Lattice Initial Segments of D
    11 Pi^0_1 Classes
    12 Pseudo-Jump Operators: Defining A
    13 Global Results
    14 Defining the Turing Jump

    Hopefully this gives some insight into the contents of the book. If anyone is, or knows, Prof. Shore and has insight into where the book stands and whether it would be okay for me to share it in this venue, I would appreciate that knowledge.

    Scott said “Furthermore, modern work in computability theory is mostly done in math departments; within CS departments, almost all work in theoretical computer science since the 1970s has involved considerations of complexity and efficiency.”

    This has been precisely my experience as a theoretical computer scientist. I would go a bit further and say that logic as a whole seems to be increasingly the domain of mathematicians rather than computer scientists. Bias disclaimer: I have a degree in each of math and cs but consider myself a mathematician.

    I studied complexity before computability (in fact, my first computer science course ever was on complexity and algorithms), and made exactly the same remark that Lance Fortnow apparently did, about the Friedberg-Muchnik Theorem and Ladner’s Theorem. I think that a lot of people see complexity theory as more accessible than computability theory, in part because complexity theory research tends to be easier to understand than computability theory research. Most undergraduates are not learning measure theory or category theory in their first few years, which makes talking to professors whose work revolves around those tools much harder. On the other hand, even advanced problems in combinatorics and complexity theory are methodologically accessible to undergraduates because Fourier analysis and group theory are things that they tend to be more familiar with. Yes, some of complexity theory does lean heavily on category theory, but I see that as more the exception than the rule. I also think that it’s a lot easier to communicate the underlying ideas and motivations of those problems without using category theory.

  23. Daniel Seita Says:

    Wow, welcome back to Berkeley! 🙂 Hopefully you have good memories of this place. I’m amazed that you’re open to meeting with blog readers (not to mention debating the random [insert political group that’s far away from your politics] reader who comes to this blog). Too bad I missed this meet-up, but I had a full day, so it would have been hard for me to come.

  24. Paul Crowley Says:

    I hope your visit includes time spent with some of the fine folk in Berkeley like rival Caliphs Scott Alexander and Eliezer Yudkowsky!

  25. Scott Says:

    Paul #24: I saw or am seeing many friends from the rationalist world, like Julia Galef, Sarah Constantin, Paul Christiano, Stuart Armstrong, and others. I would’ve loved to see the other Scott but (as he said on his blog) he’s away on vacation. I’d also gladly see Eliezer again but he hasn’t answered my email—probably too busy trying to save the universe or something.
