6.S899 Student Project Showcase!

As 2015 winds down, I thought I’d continue my tradition of using this blog to showcase some awesome student projects from my graduate class.  (For the previous project showcases from Quantum Complexity Theory, see here, here, and here.  Also see here for the showcase from Philosophy and Theoretical Computer Science.)

This fall, I taught 6.S899, a one-time “Seminar on Physics and Computation” that focused on BosonSampling, complexity and quantum gravity, and universality of physical systems.  There were also lots of guest lectures and student presentations.  Unfortunately, we didn’t do any notes or recordings.

Fortunately, though, the students did do projects: literature reviews, some of which ventured into original research. All nine have agreed to share their project reports here!  So enjoy, thanks so much to the students for making it a great class, and happy holidays.


Update (Dec. 23): Here are two conference announcements that I’ve been asked to make: Innovations in Theoretical Computer Science (ITCS) 2016, January 14-16 in Cambridge MA, and the Fifth Women in Theory Workshop, at the Simons Institute in Berkeley, May 22-25, 2016.

41 Responses to “6.S899 Student Project Showcase!”

  1. NG Says:

    Thanks for posting these. I am a theoretical computer scientist, and some of the stuff here (which I have read about in popular articles) looks fascinating, without my actually understanding it in any detail. Specifically, I mean tensor networks, quantum error correction, the AdS/CFT correspondence, and TQFT. I was wondering if you could suggest some TCS-friendly sources that I could look into to get a better understanding of this stuff, in the same way that one needn't know quantum mechanics to learn and think about quantum computing and quantum information. Or is it the case that one must learn GR, QFT, … to make sense of things like the AdS/CFT correspondence, the black hole information paradox, it from qubit…?

  2. Scott Says:

    NG #1: I’m working on some papers that explain my own view of the complexity theory aspects of AdS/CFT and the black hole information problem, and I’m someone who’s certainly not an expert on QFT or GR. But alas, they’re not done yet. In the meantime, the Harlow-Hayden paper is a pretty great source.

  3. Daniel Says:

    I found the second one particularly interesting. Is it coming out on the arXiv at some point? I’m finishing a paper with similar results for a different set of restricted circuits, rather than Clifford, and I would very much like to cite it 😉

  4. Marnie Dunsmore Says:

    Thanks for taking the time to post the write ups of the projects, as well as the conference announcements.

  5. Rahul Says:

    I've never understood the point of things like a “Women in Theory Workshop.” Do women feel they are not given a fair chance at all the regular TCS events?

    If someone hosted a “White Men in TCS Workshop,” wouldn’t we find that iffy? OK, it’s not exactly the same thing given the historical baggage, but in general I’m skeptical of events & groups that fragment people needlessly into exclusive groups, whether the trait they’re founded on is sex, race, ethnicity, or whatever.

    There seems to be a definite synergy in getting (say) all the scientists (male & female) studying spectral gaps or BosonSampling into one location. But is there a similar synergy in making exclusively female students listen to technical talks by exclusively female speakers?

    What’s next, an “Asians in TCS Summer School” & a “Latinos in TCS Conference”?

    If we need to foster “kinship and camaraderie” at all, isn’t it most needed at the weak interfaces between men & women?

  6. Scott Says:

    Rahul #5: While I (reluctantly) won’t censor discussion of such topics that takes place in respectful terms, I want people to know that I won’t be participating in it (nor will I participate in a meta-discussion about why I’m not participating, etc).

  7. Rahul Says:

    Scott #6:

    Fair enough. I won’t post any more on it either.

    Normally I’d have considered it off-topic for a technical blog & not posted it in the first place, but you seem in the past to have tolerated, nay encouraged, discussion of all sorts of Nerd / Female / Israel / Schooling social issues, so I considered it fair game.

    Guess I was mistaken about what’s acceptable. Sorry about that.

  8. Scott Says:

    Rahul #7: It’s not a question of acceptable vs. unacceptable, but only of which discussions I do or don’t personally want to participate in at a given time.

    Maybe one general comment would be helpful, though. Given everything that’s happened on this blog, people might think of me as someone who champs at the bit to take polarizing stances on hot-button social issues. But from where I stand, nothing could be further from the truth! I do occasionally feel “forced” into social discussions, because I see people who think like I do (nerds or whatever) getting cruelly attacked and I feel a moral obligation to say something in their defense. On the other hand, whenever I don’t feel under attack, my “happiness-maximizing default” is to spend my time thinking about CS and physics and math, and to defer on social issues to whatever seems like the consensus of the intelligent, well-meaning people around me (which, in practice, is basically always liberal—remember that I’m a professor in Cambridge, MA!).

    (One corollary: If you’re trying to shift my political sentiments rightward, make me feel under attack—for example, by sharing some clickbaity article on Facebook whose subtext is that nerds like me are the worst scum civilization has ever known. Go on, try it! If, on the other hand, you’re trying to shift my sentiments leftward, tell me how important nerds like me have been to the Enlightenment and the general progress of the human race, then ask me something about math or CS.)

  9. luca turin Says:

    Scott #8

    Question: How come the worst scum civilization has ever known has been so important to the general progress of the human race?

  10. Scott Says:

    luca #9: Maybe Arthur Chu or Amanda Marcotte would enjoy tackling such a question; I don’t know!

  11. luca turin Says:

    God forbid! I only meant it as a way to hit you from both sides at once so you’d stay in the center 🙂

    Those student projects are amazing! Reminds me of Lev Landau saying, “French can’t be that difficult if the kids speak it.”

  12. Joshua Zelinsky Says:

    Scott #8,

    “Go on, try it! If, on the other hand, you’re trying to shift my sentiments leftward, tell me how important nerds like me have been to the Enlightenment and the general progress of the human race, then ask me something about math or CS.”

    I don’t think this is really fair to the right wing. My own politics lean left, but it is a mistake to think that the heirs of the Enlightenment are solely on the left.

    And now the requisite theory question: QMA(2) is apparently very hard to bound. Can we bound QMA(2) better if we make an unreasonable assumption? In particular, can we at least prove that QMA=NP implies QMA(2)⊆PSPACE? Obviously, the antecedent of the implication is almost certainly false, but I don’t see any way to get from even that sort of severe assumption about QMA to anything reasonably bounding QMA(2).

    Also, a related question: I know that Martin’s result that QMA(2) is in EXP is still not completely air-tight, but you mentioned in that discussion the issue of whether QMA(2)^A is in EXP^A for every oracle A. Can you give any sort of guess as to whether you actually expect this to be true, or whether it should at least be true for a random oracle?

  13. Scott Says:

    Joshua #12: Those are good questions to which I lack good answers. I don’t know how to get any better upper bound on QMA(2) than NEXP, even under the assumption QMA=NP. Come to think of it, even under the assumption P=PSPACE, I still wouldn’t have a better upper bound on QMA(2) than EXP (!).
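
    (The chain I have in mind for that last claim, spelled out as a sketch: QMA(2)⊆NEXP unconditionally, and if P=PSPACE then EXP=EXPSPACE by padding, so QMA(2)⊆NEXP⊆EXPSPACE=EXP. I don’t see how to squeeze anything stronger out of the assumption.)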

    I think QMA(2)⊆EXP has to be regarded as open again, unless and until Martin manages to fix his proof. Before Martin’s paper, I’d never even considered as a serious possibility that we could have QMA(2)⊆EXP in a non-relativizing manner. But now, even if his proof ends up not being fixable, the very fact that the strategy looked plausible means that this has to be considered a serious possibility. Still, I guess my money remains on QMA(2)⊆EXP relative to all oracles (or, more-or-less equivalently, on there existing a quasipolynomial-time approximation algorithm for the Best Separable State problem).

  14. Rahul Says:

    “Leftward” and “Rightward” are such confusing & constraining things in US politics.

    e.g. I count myself “leftwing” on things like Gay Rights, Abortion, Evolution, Religion, Immigration etc. yet “rightwing” on Fiscal and Monetary policy, Corporate law, Labor Law, AGW legislation etc.

    Somehow, most discussion doesn’t seem to allow room for such mixed positions.

  15. Scott Says:

    Rahul #14: I would simply call that “libertarian” (some people call it “the gray tribe,” as opposed to the red and blue tribes). In academic environments, though, “libertarian” often gets rounded off to “right-wing,” since academics hardly ever encounter anyone who’s openly right-wing about gay rights, abortion, or religion—libertarians are the closest thing to right-wingers they normally see.

    I’m someone who’s also a bit hard to categorize, with leftwing sympathies on some issues and nerd/libertarian sympathies on others, but my highest sympathy is to a sentiment that Scott Alexander once expressed. To paraphrase slightly: “Arthur Chu argues that we should support anyone who has the right politics, regardless of whether they’re truthful or kind. I say we should support anyone who’s truthful and kind, regardless of whether they have the right politics.”

  16. Joshua Zelinsky Says:

    Scott, I don’t think that when people use the term “gray tribe” they mean libertarian. For example, Scott Alexander uses the term, and he’s not a libertarian by most notions of the term.

  17. Scott Says:

    Joshua #16: I guess there are Fifty Shades of Gray Tribe. 😉

    You’re right, though: “libertarian” and “gray tribe” are not at all the same concept; they’re just two points in the same broad cluster (socially close to the blue tribe, but generally more pro-free-market, pro-nerd, and pro-STEM?).

    Scott Alexander, in particular, wrote the best refutation of pure libertarianism I’ve ever read, but I suspect he was only able to do that because he’s close enough to the libertarians actually to understand their arguments, and not replace them with strawmen!

  18. Dax Says:

    Daniel #3: Thanks for your interest in my paper – in fact, I just uploaded it to the arXiv (arXiv:1512.07892). I’d be happy to hear more about the restricted circuits that you consider. Feel free to contact me.

  19. Joshua Zelinsky Says:

    “You’re right, though: “libertarian” and “gray tribe” are not at all the same concept; they’re just two points in the same broad cluster (socially close to the blue tribe, but generally more pro-free-market, pro-nerd, and pro-STEM?).”

    I’m not sure how accurate that is either. There’s a substantial portion of libertarians who end up sounding very anti-nerd and anti-STEM, because they are vocally against government funding of scientific research and more or less blame the nerds for it. What I think you are seeing is that there’s a segment of the libertarians that is more or less aligned with the grey tribe, just as there are some people who are greyish-blue and some who are greyish-red. I think? Maybe this whole grey thing isn’t a useful identifier.

    Also, as long as I am bugging you, and since you brought up P ?= PSPACE, a question related to that: P != PSPACE is sometimes described as the largest gap that we can’t prove, or something close to that. Can we actually prove that NC != PSPACE? NC is (probably) a little smaller than P. ACC_0 is contained in NC, but Ryan Williams’s result that ACC_0 != NEXP doesn’t seem to help here.

    Is there some way of making a diagonalization argument of some sort or some other easy approach or is this open?

  20. Scott Says:

    Joshua #19: While this result seems to be missing from the Complexity Zoo (!), Uniform NC is contained in polylog space. So, that immediately implies NC≠PSPACE by the Space Hierarchy Theorem.
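
    To spell out the chain, as a sketch (with the standard citations left implicit):

        NC = ∪_k NC^k, and uniform NC^k ⊆ DSPACE(log^k n), so NC ⊆ polyL = ∪_k DSPACE(log^k n).
        If we had PSPACE ⊆ polyL, then a PSPACE-complete problem would live in DSPACE(log^k n) for some fixed k; composing with logspace reductions would then give PSPACE ⊆ DSPACE(O(log^k n)) ⊊ DSPACE(n) ⊆ PSPACE, contradicting the Space Hierarchy Theorem.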

    Yeah, I suppose you could say that the Ayn-Rand-style libertarian is pro-nerd as long as the nerd is building something useful in private industry (preferably involving heavy machinery and slag…), but anti-nerd if the nerd is doing anything theoretical and/or funded by the government. But there’s how people classify themselves, and then there’s how an outsider would classify them. I imagine that Amanda Marcotte would regard Scott Alexander, Eliezer Yudkowsky, typical Bay Area gray-tribe coder types, Objectivists, effective altruists, and capital-L Libertarians as all just different species of whiny man-children (or hated collaborators, if female), who pretend to be enlightened and rational simply in order to hide their raging entitlement. Which, in turn, makes me want to run to the defense of all those people, despite their many differences with one another and my differences with them. 🙂

  21. Joshua Zelinsky Says:

    Scott @20,

    I don’t think that lumping everyone together based on how Amanda Marcotte classifies them is generally useful. That’s about as useful as Andrew Schlafly classifying everyone who uses complex numbers as an evil liberal.

  22. Joshua Zelinsky Says:

    And unfortunately I’m out of complexity questions for now.

  23. Rahul Says:

    I’ve always found libertarianism’s arguments extremely elegant & enticing: e.g., Milton Friedman’s exposition of it, or Tyler Cowen’s blog posts. (In fact, I’ve found almost no one on the left with a good set of arguments.)

    My problem is that, when I read those libertarian arguments being taken to their logical conclusions, I’m somehow uncomfortable with (some of) the conclusions. It’s a discomfort originating in gut feeling rather than rationality, but it still makes me uneasy about going along with them.

    E.g., although I don’t have a good logical response to Friedman’s arguments, I still want my national parks, US state universities, and doctor licensing.

  24. Scott Says:

    Rahul #23: In that case, Scott Alexander’s FAQ sounds like exactly the thing for you!

  25. Job Says:

    What’s the minimum information necessary to determine the number of ancillary bits required for reversible computation?

    For example, assuming that we don’t know the circuit that will be executed, is it sufficient to know the number of input/output bits and the gate set? Or would we also need additional information, such as the maximum depth?

    Essentially, given n+k bits, for some k corresponding to the number of ancillary bits, is it always possible to construct an n-bit circuit that cannot be made reversible without increasing k?

  26. asdf Says:

    Scott, I just read that Subset Sum (SS) and 3SAT are in GenericP, which I think means the set of hard instances is sparse. Does that mean that everything in NP is in GenericP since there’s a P-time reduction to SS and 3SAT? In the case of 3SAT it’s in GenExpP which means the vanishing of hard instances happens fast as the problem size grows. Does that tell us where we are in the Five Worlds?

    https://en.wikipedia.org/wiki/Generic-case_complexity

  27. Scott Says:

    asdf #26: No, there are no implications of that kind, because typical NP-completeness reductions don’t preserve denseness. So there’s no barrier whatsoever to some NP-complete problems being “hard in the generic case” and others being “easy in the generic case,” and that’s exactly what we find (assuming, say, the existence of one-way functions).
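
    (A quick heuristic count of why density isn’t preserved: a polynomial-time reduction maps the at most 2^n instances of your favorite NP problem of length n to 3SAT instances of length at most m(n)=poly(n), of which there are roughly 2^Θ(m). So the image is typically an exponentially sparse, highly structured subset of 3SAT, and a generic-case algorithm is perfectly free to fail on every instance in it.)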

  28. Scott Says:

    Job #25: See the paper by me, Daniel Grier, and Luke Schaeffer on the classification of reversible gates, and the references therein. The short answer is that, if you don’t care about the number of gates, then you can do reversible computation with only a tiny number of ancilla bits (1 ancilla bit for many natural reversible gates like Toffoli, and in any case a constant number of ancilla bits depending only on your reversible gate set).

    On the other hand, if you do also care about circuit size, then there can be tradeoffs between the number of gates and the number of ancillas. There are constructions by Bennett, Lange-McKenzie-Tapp, and others that give different points on the tradeoff curve between gates and ancillas (for taking an arbitrary computation and simulating it reversibly), but I believe it’s still not known whether these tradeoffs are optimal. (See here, here, here for example.) In any case, you can certainly improve over these bounds for certain specific functions or classes of functions.
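
    If a toy example helps, here’s the standard trick for the Toffoli case in a few lines of Python (an illustrative sketch only, not code from any of the papers above): the Toffoli gate maps (a,b,c) to (a,b,c⊕ab), so with one ancilla bit initialized to 0 it computes AND, and it’s its own inverse.

        # Computing AND reversibly with a Toffoli gate and a single ancilla bit.
        # Toy illustration only; the papers cited above handle the general theory.
        from itertools import product

        def toffoli(a, b, c):
            """Reversible map (a, b, c) -> (a, b, c XOR (a AND b)) on bits."""
            return a, b, c ^ (a & b)

        # With the ancilla c initialized to 0, the third output bit is AND(a, b):
        for a, b in product([0, 1], repeat=2):
            assert toffoli(a, b, 0)[2] == (a & b)

        # Reversibility: the gate permutes the 8 bit-triples (and is its own inverse).
        triples = list(product([0, 1], repeat=3))
        assert sorted(toffoli(*t) for t in triples) == sorted(triples)
        assert all(toffoli(*toffoli(*t)) == t for t in triples)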

  29. Rahul Says:

    Is there a list somewhere of problems that are believed to be hard even in their generic / average case?

  30. Scott Says:

    Rahul #29: Good question. I don’t know of a Garey&Johnson-style list, but this 2005 survey by Bogdanov and Trevisan is probably a good place to start looking. Unfortunately, compared to the theory of worst-case NP-hardness, the theory of average-case NP-hardness is a lot messier, with more isolated conjectures and fewer connections between the conjectures. (Or rather: to the extent that there IS nice theory, it tends to be for distributions over problem instances unlike those that would arise in practice.) Average-case hardness becomes cleaner if you go up to #P or PSPACE, or down to certain specific NP problems like discrete log.

  31. Rahul Says:

    I understand worst case complexity (I think).

    But is there an intuitive way to understand generic complexity & average case complexity?

    The explanations I found on Wikipedia were a tad too confusing.

    e.g.

    “An algorithm is in GenP (generically polynomial time) if it never gives incorrect answers and if it gives correct answers in polynomial time on a generic set of inputs.”

    So what exactly do they mean by a “generic set of inputs”?

  32. Scott Says:

    Rahul #31: I’ve never heard of GenP myself. The ones I know about are AvgP and HeurP. Yes, as I said, they’re sort of complicated and non-intuitive, since the “intuitive” definitions (e.g., uniform distribution over all inputs) don’t give you nice reducibilities. But please try the Bogdanov-Trevisan survey.
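
    Very roughly, though (and deferring to the survey for the real definitions): in the “errorless” (AvgP-style) setting, the algorithm is never allowed to output a wrong answer, but it may say “don’t know” (or run for a long time) on a small fraction of inputs drawn from the distribution; in the “heuristic” (HeurP-style) setting, the algorithm always runs in polynomial time but is allowed to be wrong on a δ fraction of inputs, i.e. Pr_{x~D_n}[A(x)≠L(x)] ≤ δ. Treat that as a caricature, not a definition.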

  33. Joshua Zelinsky Says:

    asdf#26,

    One thing to note: while there aren’t results about density in that sense (since, as Scott notes, reductions don’t preserve density), there are some very weak bounds that sort of amount to this kind of thing, if one is willing to settle for something much weaker than a claim about density. I think it is true that if there’s an NP-complete problem with an algorithm that solves it in polynomial time with at most O(log n) exceptions, and that responds “I don’t know” on the exceptions rather than giving wrong answers, then P=NP. The proof is essentially the same as the proof that NP in P/log implies P=NP.
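
    A sketch of the P/log argument I have in mind, in case it’s useful:

        1. The advice for length n is only O(log n) bits, so there are just 2^O(log n) = poly(n) candidate advice strings; enumerate them all.
        2. For each candidate, run the downward self-reduction for SAT, padding every query up to length exactly n so that a single advice string covers all of them, and extract a candidate satisfying assignment.
        3. Verify that assignment directly. If the formula is satisfiable, the run with the correct advice produces a valid assignment, and wrong advice can only produce assignments that fail verification; so SAT is in P.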

    Actually, a related question for Scott: can we get this sort of result for some function that grows faster than O(log n)? We can if we weaken the conclusion from P=NP to the polynomial hierarchy collapsing (using the fact that NP in P/poly collapses the hierarchy), and that also lets us allow the algorithm to be outright wrong on the exceptional values. But if we actually want P=NP as the conclusion (and not just a hierarchy collapse or breaking ETH), is there any faster-growing function we can do this for? It isn’t even obvious to me that we can do so for log n · log* n.

  34. Jr Says:

    I (a man) used to find special conferences for women silly as well, but I had a fellow grad student who attended one, and she indicated that she liked it. This caused me to change my mind, especially since I did not think she was the type who cared that she happened to be in a gender minority. My new view is that a lot of people enjoy seeing people like themselves working in the same area, and since gender is a fundamental part of most people’s self-image, they enjoy seeing other people of the same gender working on similar things.

    And I am libertarian enough to say that if people want to hold conferences with only women, they are harming nobody and it should be allowed, regardless of motivation. Of course, I would see it as equally legitimate to hold a male-only conference, which I suspect would be subject to a great deal of criticism.

    I am not a full libertarian, though, for somewhat the same reasons as Scott Alexander. I think it is very difficult to justify treating private property as sacrosanct, though I think there are excellent reasons to treat it with more respect than is done at the moment. As for libertarian anti-paternalism, we can refute it by noting that no one would apply it to children, and yet nothing magical happens when we turn 18 (or 20 or 17 or whatever) that suddenly makes us fully rational and independent. I would not want the government to treat us like children, but a little paternalism, sometimes, can probably be good. (I do mean a little: I would let you use drugs, sell sex, and drive without a seat belt, but not all at the same time.)

  35. Nick Read Says:

    I like this idea of generic complexity. With average-case complexity, say for the uniform distribution on all the instances of a given size, a small number of hard instances could make the average complexity exponential, even though generically it is polynomial. It seems useful to know both statements, as in some physics problems about “rare events”.
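
    (A concrete toy count, just to illustrate the gap: if a 2^(-n/2) fraction of the size-n instances take time 2^n while all the rest take poly(n) time, then the expected running time under the uniform distribution is at least 2^(-n/2)·2^n = 2^(n/2), which is exponential; yet the exceptional set has density 2^(-n/2) → 0, so the problem is still generically polynomial.)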

    I would guess that MAXCUT is generically hard, though the few examples mentioned on Wikipedia don’t get to things of that type. I’m not really surprised that 3SAT is in GenP, given what we know about phase transitions for random instances.

  36. David Says:

    Jayson Lynch’s paper made my day. I’m still chuckling. Thanks for sharing it!

  37. Ian Says:

    Hi Professor Aaronson,

    Sorry for this off-topic post, but over on the polymath blog it seems a very interesting conjecture has been proposed and found to be true. It concerns the vanishing of the infinite sum of a formula involving irreducible polynomials over F2. I’m frankly bubbling over with curiosity for a better understanding of the possible consequences. I was wondering if you might have any thoughts on any of the below questions? If not or this is just simply an inappropriate comment, please feel free to moderate it out of the queue.

    1) What are the implications of finding this structure among the irreducible polynomials over F2? Are the implications limited because of being over F2 and because of being polynomials?

    2) Is there likely to be any structure at all among the interaction terms? If so/not, what are the implications?

    3) The prime numbers are often treated as if they are distributed randomly. Is there any sense in which the terms in this problem are considered to be distributed randomly, and if so, what does the “vanishing” mean in this context?

    4) Are there any immediate or indirect consequences for sums over the integers (in particular over the prime numbers)? I’m guessing not, but is there possibly a field and a formula for which the primes vanish?

    5) Again back to the randomness question, and still not quite understanding how the distribution of irreducible polynomials would be represented, does a result like this have any implications for separating pseudo-random sequences from truly random sequences?

  38. anonymous Says:

    Happy new year, Scott! Thanks for all the great posts (and subsequent discussions) in 2015. I hope the trend continues!

  39. asdf Says:

    Yes, I think generic complexity F(n) means that, for problems of size n, all instances except for a subset of size E(n) can be solved in time F(n), where E(n)/a^n has to keep getting smaller (a being the alphabet size). The algorithm has to give correct answers or “I don’t know”, but it can take longer than the F(n) time bound on the exception set. I haven’t yet gone over the exact definitions again to see if I have to follow up, but I think ExpGen means the density of exceptions approaches zero faster than 1/P for any polynomial P. So since, for any problem X in NP, an instance of size n can be turned into a 3SAT instance of size poly(n) for some fixed (for the given X) polynomial, it sounded like there couldn’t be enough “hard” 3SAT instances available within the size bound for more than a few instances of X to be hard. I have to think about this some more though. It can’t be right.
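
    If I’m reading the definitions right, here is my paraphrase of the formal version (treat it as a sketch, not gospel):

        Density: ρ_n(S) = |S ∩ Σ^{≤n}| / |Σ^{≤n}|; a set S of instances is “generic” if ρ_n(S) → 1, and “negligible” if ρ_n(S) → 0.
        GenP: there is an algorithm that never answers incorrectly (it may say “don’t know”) and that answers correctly in polynomial time on a generic set of inputs.
        The “Exp” variants require the convergence ρ_n(S) → 1 to be exponentially fast, rather than merely eventual.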

  40. luca turin Says:

    anonymous #38

    +1!

  41. Seminários | Prof. Franklin Marquezino Says:

    […] study, redo the calculations, and present one of the projects developed by the MIT students: https://scottaaronson-production.mystagingwebsite.com/?p=2606 (see the links for the years […]