Sundry and Various

1. There’s now a popular article by Lisa Zyga at physorg.com about my paper with John Watrous on quantum computing with closed timelike curves. On the whole, I think Zyga did an excellent job of getting the facts (such as they are) correct.

2. Challenged ballots in the Coleman vs. Franken race: you be the judge!

3. One of the unfortunate things about not updating your blog often, I find, is that people assume you’re still obsessed with the last thing you blogged about, weeks after you’ve all but forgotten about it.  As it happens, I’ve now fully recovered from the joy of the election, and am back to my normal angst-ridden equilibrium.  On the other hand, I’ve not yet recovered from the STOC deadline.

4. My quest to become more Obama-like in temperament is now officially a failure. I should try it again sometime.

26 Responses to “Sundry and Various”

  1. Cody Says:

    If it makes you feel any better, I began to feel ashamed of the flamewar I was participating in, and eventually I just dropped the matter entirely, wishing I had exhibited the same self-restraint you strove for. So next time I am tempted to jump into what is bound to be a fruitless debate, I will at least consider my decision twice.

  2. Pat Cahalan Says:

    > people assume you’re still obsessed with the last
    > thing you blogged about

    Include a smattering of nonsensical or personal posts in your blog content, and this assumption goes away 🙂

  3. Sophie Yip Says:

    Scott: see if any of my favorite Beethoven pieces make you feel better. Feel free to ask me questions before trying any of them out. It won’t mean that the value of your last answer to my question will fade if someday (hopefully soon) it turns out to work well. 🙂

    Symphony No. 5 in C minor
    Symphony No. 9 in D minor
    Symphony No. 7 in A major
    Piano Concerto No. 5 in E-flat major
    Symphony No. 3 in E-flat major, “Eroica”
    Piano Sonata No. 23 in F minor, “Appassionata”
    Violin Sonata in A major, “Kreutzer”
    Violin Sonata in F major, “Spring”
    Piano Sonata No. 14 in C-sharp minor, “Moonlight”
    Piano Sonata No. 8 in C minor, “Pathétique”
    Symphony No. 6 in F major, “Pastorale”
    Symphony No. 8 in F major

  4. Scott Says:

    Cody and Sophie: My normal angst-ridden equilibrium isn’t that bad. 🙂 Thanks for the concern though.

  5. Job Says:

    I have a question. Every problem in P has a polynomial-time reduction to an NP-complete problem – do you think that there is an overall upper bound for these reductions? For example, do you think that there is a k such that every problem in P can be reduced to an NP-complete problem in O(n^k) time? Or do you think that we can build problems in P such that their fastest reduction to an NP-complete problem requires as much [polynomial] time as we want?

    Showing that every problem in P can be reduced to an NP-complete problem in O(n^k) time might show that P != NP. This is because if it were later shown that NP-complete problems can be solved in O(n^c) time after all, then every problem in P would be guaranteed to have an O(n^(k+c)) solution – but, I imagine, we could then just show that we can build or find a problem in P that requires O(n^(k+c+1)) time.

    So of course it would be too good if it were that easy, but it doesn’t seem unrealistic that every problem in P might be reducible to an NP-complete problem in O(n^k) time – especially if NP-complete problems require exponential time. I mean, are there problems in P that can only be reduced to NP-complete problems in O(n^1000) time? That would really surprise me!
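
    To spell that argument out in symbols, taking SAT as the target problem for concreteness (a sketch only, with one correction: composing an O(n^k)-time reduction with an O(n^c)-time solver gives an O(n^(kc)) bound rather than O(n^(k+c)), since the reduction’s output can itself have length O(n^k)):

        Assume: every L ∈ P reduces to SAT in O(n^k) time, and SAT ∈ DTIME(n^c).
        Then:   P ⊆ DTIME(n^(kc))   [reduce, then solve an instance of length O(n^k)].
        But:    the deterministic Time Hierarchy Theorem gives an L' ∈ DTIME(n^(kc+1)) \ DTIME(n^(kc)), and L' ∈ P – contradiction.
        So under that assumption, SAT ∉ DTIME(n^c) for any constant c, i.e. P != NP.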

  6. Job Says:

    WordPress ate the superscript tags – O(nk) means O(n^k) and so forth.

  7. asdf Says:

    Job, that sounds like a homework problem, so as a hint I’ll suggest you study the proof that there is such a thing as an NP-complete problem. It’s in Garey and Johnson’s old book and probably whatever the current books are too.

  8. Job Says:

    asdf, it’s not a homework question; I’m just curious.

  9. asdf Says:

    Job, OK. As I remember it, Cook’s proof (in the book I mentioned) that SAT is NP-complete gives an explicit reduction (i.e., you can calculate the degree of the polynomial) from any problem in NP to SAT. Of course, any problem in P is also in NP.

  10. Scott Says:

    Job, for every k, you can find an NP-complete problem that all TIME(n^k) problems (and indeed NTIME(n^k) problems) can be reduced to in linear time. This problem is the following:

    “Given a description of an NTIME(n^k) Turing machine M and an input x, does M(x) accept?”

    On the other hand, there can’t be a single NP-complete problem that every NP-complete problem (Karp-)reduces to in linear time. This follows from the Nondeterministic Time Hierarchy Theorem.

    Now, whether there’s a single NP-complete problem that every P problem reduces to in linear time is unknown. This can be seen to be equivalent to the question of whether P⊆NTIME(n^k) for some fixed k. In one direction, if there is such an NP-complete problem, then it must be in NTIME(n^k) for some k, whence every P problem is also in NTIME(n^k). In the other direction, if P⊆NTIME(n^k), then for every polynomial-time Turing machine M, there exists an NTIME(n^k) machine M' such that M(x) accepts iff M'(0x) accepts. We can easily make the problem of whether M' accepts NP-complete, by stipulating that if the input to M' starts with a 1, then M' solves an NP-complete problem like SAT. Then our linear-time reduction from the P problem of whether M accepts to the NP-complete problem of whether M' accepts is just to map x to 0x. 🙂
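
    In symbols, the construction in that last direction (a sketch only; A_M is a name introduced just here, M' is the assumed NTIME(n^k) machine equivalent to M, and SAT can be any fixed NP-complete language):

        A_M = { 0x : M'(x) accepts } ∪ { 1φ : φ ∈ SAT }.
        A_M ∈ NP (both halves are in NP), and SAT reduces to A_M in linear time via φ ↦ 1φ, so A_M is NP-complete.
        The P problem decided by M reduces to A_M in linear time via x ↦ 0x.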

    It’s clear that if NP=EXP—or indeed, if there’s any time-constructible, superpolynomial f such that DTIME(f(n))⊆NP—then P⊆NTIME(n^k) for some fixed k. But I don’t know the other direction; that is, I don’t know whether P⊆NTIME(n^k) for some fixed k implies DTIME(f(n))⊆NP for some superpolynomial f.

  11. Sumwan Says:

    asdf, I don’t think this is what Job meant. I think he asked whether the time complexity of all polynomial-time reductions from a P language to SAT was bounded by a single function (say, n^1000).
    Cook built an instance of SAT whose size was heavily dependent on the length of the computation history of the verifier. So his reduction does not answer Job’s question, as far as I understand.
    I think that the answer to Job’s question is no; intuitively, I feel that the proof of the time hierarchy theorem could be adapted to prove that there are problems in P whose reduction to SAT would require an arbitrarily high exponent.

  12. Sumwan Says:

    Oops, sorry, I hadn’t seen that Scott had already answered.

  13. Sumwan Says:

    Can someone explain to me why it’s clear that if there’s any time-constructible, superpolynomial f such that DTIME(f(n))⊆NP, then P⊆NTIME(n^k) for some fixed k?

  14. Scott Says:

    Sumwan: Pick some time-constructible, superpolynomial f such that DTIME(f(n))⊆NP. Then the problem “given a Turing machine, does it halt in f(n) steps?” must be solvable in NTIME(n^k) for some k. But since f eventually dominates every polynomial, it follows that P⊆NTIME(n^k).
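
    Spelled out a little more (writing <M> for a machine description; U is a name introduced just here, and the universal machine’s simulation overhead is glossed over, since it can be absorbed by running the same argument with a slightly smaller superpolynomial bound):

        U = { <M>#x : M accepts x within f(|x|) steps } is in DTIME(f) ⊆ NP, so U ∈ NTIME(n^k) for some fixed k.
        For any L ∈ P, decided by a machine M_L in time n^c: x ∈ L iff <M_L>#x ∈ U for all sufficiently long x, since n^c ≤ f(n) eventually (the finitely many exceptions can be hard-wired in).
        The map x ↦ <M_L>#x is linear-time, so L ∈ NTIME(n^k) with the same k; hence P ⊆ NTIME(n^k).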

  15. Sumwan Says:

    Scott: Thanks!

  16. Job Says:

    But I don’t know the other direction; that is, I don’t know whether P⊆NTIME(n^k) for some fixed k implies DTIME(f(n))⊆NP for some superpolynomial f.

    Naive question, but if P⊆NTIME(n^k) for some fixed k, then what would be outside NTIME(n^k) and inside NTIME(n^(k+c)) for some c if not some element of DTIME(f(n)) where f is superpolynomial?

  17. Scott Says:

    Job: It’s plausible that DTIME(superpoly) would be contained in NTIME(n^(k+c)) in such a case; the whole trouble is proving it. Note that by the Nondeterministic Time Hierarchy Theorem, we can always construct artificial problems that are in NTIME(n^(k+c)) but not in NTIME(n^k). (For example: “Given as input an NTIME(n^(k+c)) machine M and an input x, does M(x) accept?”)
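
    For reference, one standard form of that theorem (essentially Seiferas–Fischer–Meyer; g time-constructible):

        If f(n+1) = o(g(n)), then NTIME(f(n)) ⊊ NTIME(g(n)).
        In particular, NTIME(n^k) ⊊ NTIME(n^(k+c)) for every k ≥ 1 and c > 0.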

  18. A. Shiekh Says:

    Are people really seriously expecting to use closed timelike curves?

  19. Job Says:

    Does P⊆NTIME(n^k) imply P != NP automatically or would it have to be shown that, in addition, DTIME(superpoly)⊆NTIME(n^(k+c))?

    I thought it would automatically imply that P != NP because if it’s shown that P⊆NTIME(n^k) then either:
    1. All NP-complete problems are also in NTIME(n^k). In this case any proof that P=NP says that P⊆DTIME(n^k), which isn’t true.
    2. Some or all NP-complete problems are outside NTIME(n^k) and so P != NP.

  20. Sumwan Says:

    Job: Yes, I think that by the Nondeterministic Time Hierarchy Theorem that Scott mentioned, for every k there are problems in NP that require more than n^k time on a nondeterministic Turing machine, so P⊆NTIME(n^k) implies P != NP.

  21. Job Says:

    Sumwan, of course, thanks. I guess what I don’t understand is why, if P != NP, it wouldn’t follow automatically that DTIME(superpoly)⊆NTIME(n^(k+c)). The problems in NP but outside NTIME(n^k) would have to be somewhere in DTIME(f), and f can’t be polynomial – what’s the other possibility?

  22. Sumwan Says:

    “The problems in NP but outside NTIME(n^k) would have to be somewhere in DTIME(f), and f can’t be polynomial – what’s the other possibility?”

    There is no other possibility, but this is not what DTIME(superpoly f)⊆NTIME(n^(k+c)) means.
    What the above inclusion means is that there is a single superpolynomial function f such that ALL languages that can be decided by a deterministic Turing machine in time asymptotically at most f can be decided by a nondeterministic Turing machine in time n^(k+c).

    To give an example, suppose k=100, c=5, and f(n) = n^(lg n). The above inclusion would mean that for every language that can be decided deterministically in time n^(lg n), there would be a nondeterministic Turing machine that could decide it in time n^105.
    By contrast, what you are saying is that if there is a problem that lies in NTIME(n^(k+c)) but not in NTIME(n^k), and assuming P⊆NTIME(n^k), then any deterministic procedure would take superpolynomial time to decide this problem. That is true, but it is not equivalent to
    DTIME(superpoly f)⊆NTIME(n^(k+c)).
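
    Put in quantifier form (f_L below is just a name for a per-language bound), the difference is:

        DTIME(superpoly f)⊆NTIME(n^(k+c)) says: there EXISTS one superpolynomial f such that for ALL languages L, L ∈ DTIME(f) implies L ∈ NTIME(n^(k+c)).
        The observation above only says: assuming P⊆NTIME(n^k), for ALL L in NTIME(n^(k+c)) \ NTIME(n^k) we have L ∉ P; so each such L lies in DTIME(f_L) only for superpolynomial f_L, and that f_L may be different for every L.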

  23. Scott Says:

    Are people really seriously expecting to use closed timelike curves?

    A. Shiekh: I’m certainly not! My best guess would be that CTCs don’t exist, but that to understand why we’ll need to wait for a quantum theory of gravity. Using our best current theories—GR and QM—one can’t really exclude CTCs, and indeed if spacetime geometry is fluctuating quantum-mechanically, then it’s somewhat hard to understand why CTCs shouldn’t form. For that reason alone, I think it’s interesting to ask what the computational implications would be if CTCs existed. (The fact that the question turns out to have a clear nontrivial answer is just an added bonus.) After all, theoretical computer scientists already profitably study hundreds of computational models that are even less rooted in known physics! 🙂

  24. Anwar Shiekh Says:

    Sometimes, maybe we bury ourselves in terminology; would it not be more honest to ask straight out: ‘if we had a time machine that could send things back in time, would we have a lot of computational power?’ At a time before sending the machine back, we would have the machine and its ‘image’, and we could send the pair back, etc.; exponential computational power would then follow.
    I could try to argue that quantum gravity, since it does not yet really exist, might permit such things… but closed time-like curves arguably lead to inconsistencies (grandfather paradox), and any complete theory should be consistent (even if incomplete).

    Not that I have any real objections to looking at fun ‘what if’ scenarios.

  25. Scott Says:

    Anwar, it sounds like you haven’t read the paper. The two issues you raise (the grandfather paradox and the trivial solution “compute the answer and send it back in time”) are both prominently dealt with in the “Background” section—respectively, by Deutsch’s causal consistency criterion and the requirement that the CTC be polynomially bounded in size.

  26. Anwar Shiekh Says:

    I confess, guilty as charged; but now I think I should find the time to see how the grandfather paradox might be side-stepped.