Quantum Computing Since Democritus Lecture 19: Time Travel

A visitor from the year 2006, this time travel lecture appears before you now due to a blip in the space/time/procrastination continuum.  No grandfathers were harmed in the writing of it.  I’m looking backward to your comments.

(Alas, these forehead-bangingly obvious lines can now never be unwritten … or can they?)

32 Responses to “Quantum Computing Since Democritus Lecture 19: Time Travel”

  1. Ian Durham Says:

    Good lord, this is odd timing. I’m rewriting my paper on CTC qubit communication (since it was incomprehensible) as I read this (well, I just closed TeXShop since it’s 11:30). Maybe this is all part of some giant CTC. Maybe I should become a Hindu.

  2. Ashwinkumar Says:

    hi

    How can we read the output of a quantum computer acting inside the CTC? Doesn’t reading it destroy the output? Is this the reason for a quantum CTC and a classical CTC having the same power?

  3. Scott Says:

    Ashwin: Let me apologize; I didn’t explain that in the lecture and definitely should have.

    The key point in Deutsch’s model is that you have both “CTC qubits” and “chronology-respecting qubits”, which can interact with each other as the computation progresses. What you get to see in the end is just the chronology-respecting qubits (the CTC qubits being stuck inside the CTC). But the state of the chronology-respecting qubits can depend on their having interacted with the CTC qubits. So for example, you can read out (classical) information from the CTC qubits by just CNOT-ing it into the chronology-respecting qubits, and indeed that’s what’s done in the algorithm for simulating PSPACE.

    The reason for quantum and classical CTCs having the same power has nothing to do with this. Rather, it’s that finding fixed points of n-qubit superoperators can be done in PSPACE (being an exponential-size linear-algebra problem). At a more intuitive level, the fact that PSPACE = BQPSPACE (i.e., classical and quantum PSPACE are equivalent) might already lead you to guess that P_CTC = BQP_CTC. And indeed that turns out to be the case, though the latter is harder to prove.
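    A minimal numerical sketch of the fixed-point idea above (my own illustration, not from the lecture; assumes numpy): for a classical CTC, Deutsch’s consistency condition says the state on the loop must be a stationary distribution of the induced evolution, which is exactly a linear-algebra problem. For an n-bit loop the matrix has size 2^n by 2^n, which is why the general problem lands in PSPACE.

```python
import numpy as np

def fixed_point(S):
    """Return a distribution p with S @ p = p (Deutsch's consistency
    condition for a classical CTC whose dynamics are the stochastic
    matrix S)."""
    n = S.shape[0]
    # Solve (S - I) p = 0 together with the normalization sum(p) = 1.
    A = np.vstack([S - np.eye(n), np.ones(n)])
    b = np.concatenate([np.zeros(n), [1.0]])
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# "Grandfather paradox" dynamics: the bit on the loop gets flipped.
# No deterministic assignment is consistent, but the uniform mixture is.
S_not = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
print(fixed_point(S_not))  # the unique consistent state: [0.5 0.5]
```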

  4. DavidM Says:

    For another paradox how about this:

    Start your computer running then wait an hour.

    1. Go back 5 minutes and pick up the computer.
    2. Repeat #1 until you have 12 computers.
    3. Leave the 12 computers and wait another hour….

    Continue until you have Google level computational ability.

  5. Scott Says:

    Nope, doesn’t work. For one thing, it’s not a causally consistent evolution.

    (It’s also unnecessary, if you merely want Google level computational ability…)

  6. DavidM Says:

    I don’t understand. Do you mean that the computer you have with you will disappear when you go back in time the second time?

    The computer *has* to exist in the past otherwise you couldn’t have picked it up in the first place.

  7. Moshe Says:

    Nice lecture as always, but one nitpick: when I see the sentence “General relativity is not just a theory of some fields in spacetime, but of spacetime itself”, I wonder whether there is any operational difference between the two alternatives, or whether we are talking about religious affiliations, similar to one’s interpretation of quantum mechanics. The premature adoption of this viewpoint as the only option leads directly to sentences like “a quantum theory of gravity is going to have to be talking about superpositions over spacetime”, which I think is simply false. There are perfectly good theories of quantum gravity around, none of which involves superpositions of spacetimes. The converse is also true.

  8. Scott Says:

    No, I don’t mean that. I mean that that whole way of thinking about it is inconsistent. For example, if you just keep going back in time over and over, increasing the number of computers each time, then does the total number of computers increase without bound? But in that case, why don’t you encounter an infinite number of computers at the very beginning of the process? Why are you ever in a situation where there’s only one computer? To take another example, what would happen if, on finding the computer working, you go back in time five minutes and smash the computer with a hammer, but otherwise you do nothing? Any account of CTCs needs a logically consistent way of answering these questions.

    The whole source of the difficulty with the naïve view is that it refuses to take seriously that you’re going back to your own past. It imagines that going back in time simply means “buying some extra time,” and that when you go into the past, you’re free to do something other than what you actually did.

    Fortunately, these difficulties can be resolved and have been resolved, for example by Deutsch’s causal consistency proposal. This proposal is not an obvious one, but is pretty compelling once you understand it. Please read the lecture if you’re interested.

  9. Scott Says:

    Moshe: It’s an interesting question, and I’m sure you understand the issues better than me. For whatever it’s worth, here’s how I think about it: if you start from some initial conditions and run forward the Standard Model without gravity, you may not know what’s going to happen, but you at least know that the causal structure is going to be isomorphic to a collection of points in R^4, which are partially ordered by the usual light-cone relations. By contrast, if you start from some initial hypersurface and run GR forward, you don’t know in advance what the causal structure is going to be, even up to isomorphism. That depends on the dynamics (e.g., how many black holes form). And if the causal structure is a dynamical part of the theory, that suggests that when we quantize the theory, we’ll ultimately need to be talking about superpositions over different causal structures. (Feynman seems to have been making this last point in 1957: see this interesting article by Zeh.) That’s not to say one can’t make terrific progress on quantum gravity while sweeping the superpositions-over-causal-structures issue under the rug. Indeed, one could argue that the issue is premature, and we should sweep it under the rug. However, what I don’t understand is how one can avoid talking about it eventually.

  10. Moshe Says:

    Here is a possible loophole; the jury is still out on this, but it is at least a possibility, which I personally find plausible. Classical spacetime should be viewed as a long-distance manifestation of some more fundamental structure, the same way that ocean waves are made out of an enormous number of microscopic ingredients (metaphor due to Ted Jacobson, in a slightly different context). Now, we don’t ask which kinds of waves are involved in a quantum theory of ocean waves – we quantize the microscopic ingredients, so the question doesn’t make any sense. Similarly, I don’t think that a quantum theory which includes gravity at long distances is necessarily a theory of a fluctuating metric; all evidence is to the contrary. So asking what kinds of virtual metrics (and their causal structures) are involved in quantum gravity is just not a well-defined question; it makes implicit assumptions that are likely false.

  11. Scott Says:

    Moshe, I completely agree that there might be more fundamental ingredients underlying spacetime. But then we should still be able to talk about superpositions over those ingredients, and those superpositions would presumably give rise to superpositions over causal structures. By analogy, there’s not a quantum theory of ocean waves, but (at least if QM is universally valid) there are superpositions of ocean waves. And the existence of superpositions over causal structures, regardless of its source, already seems incredibly hard to make sense of.

    Incidentally, when I used the word “quantize,” I meant “make quantum-mechanical in whatever way,” not “apply a quantization procedure directly to these variables.” I don’t know if the former usage is standard; I apologize if it isn’t.

  12. Moshe Says:

    OK, that is a good question; I’m not sure what the answer is. To make it a little sharper, we can ask whether metrics with different causal structures (meaning the many-body quantum states they approximate) are members of the same Hilbert space, and I don’t know the answer to that. My gut feeling is no – every time you try to change the causal structure semi-classically you encounter divergences of some kind (infinite energies, collapse to black holes, etc.), which I read to mean that these two configurations are separated by an infinite barrier in the full configuration space. But maybe that’s an artifact of not having a full treatment of the problem.

    My prejudice is also that even if a superposition of classical metrics with different causal structures is a good state in the quantum theory, that state likely has no good classical limit, is not a spacetime of any kind, and its causal structure is just not well defined. All of these are just gut feelings, no more.

    As for the usage of the word quantize, I find that too many people refer to quantum gravity automatically as the theory of quantized metrics. Lots of the conceptual and technical problems disappear if you use the term “quantize” more precisely as you do above.

  13. John Sidles Says:

    Speaking from the point of view of practical engineering—if the word “practical” has any meaning with respect to quantum gravity—the Penrose point of view (as I understand it) is particularly attractive.

    This quantum-geometric-dynamical point of view asserts: (1) The metric of classical space-time is dynamical and curved (“matter tells space-time how to curve, space tells matter how to move” in Wheeler’s famous phrase), and (2) ditto for Hilbert space. 🙂

    It turns out that it is perfectly feasible to do quantum mechanics on curved Hilbert spaces, in the sense that the resulting quantum physics is entirely orthodox—Dave Bacon and I had a conversation about this just yesterday.

    In fact, from the viewpoint of computational complexity, it turns out to be remarkably easy to do some classes of quantum calculations on curved Hilbert state-spaces (compared to classical calculations on curved space-time) essentially because Kählerian geometries are geometrically “nicer” than Riemannian geometries.

    This is one of several senses in which quantum mechanics makes promises about dynamic processes that are much stronger (mathematically) than the promises of classical mechanics. These strong promises are good news for mathematicians, scientists, and engineers.

    That is one of the most general heuristic lessons of twentieth century mathematics, physics, biology, and engineering—generally speaking, state-spaces are themselves dynamical objects.

  14. Todd Brun Says:

    Still waiting for the paper, though!

  15. Scott Says:

    itsacomin’

  16. John Sidles Says:

    Hi Todd! I hope you don’t mind some fan mail expressing appreciation for your fine articles on topics relating to quantum decoherence and simulation. In particular, a 1997 article that you wrote with Rüdiger Schack: A C++ library using quantum trajectories to solve quantum master equations is really excellent.

    Your and Schack’s article was IMHO well ahead of its time (this was one of the things I was discussing with Dave Bacon). In particular, many of the techniques that your computational library implements on flat Hilbert state-spaces translate well to curved Kählerian state-spaces.

    There is of course some extra work in keeping track of vectors versus one-forms, etc., but (as mathematicians have long known) this work is immensely easier on Kählerian manifolds than on Riemannian manifolds.

    The physical point is that by embedding quantum trajectories on a suitably-curved Kählerian manifold, the state-space geometry itself implicitly handles much of the computational complexity of calculating the trajectory. So the resulting codes are tremendously fast!

    Our UW QSE Group is now documenting our Kählerian codes, not as a standalone library, but rather as a module for the well-known SIAM compendium Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods. We would be very interested to hear from anybody else who has written (or is writing) quantum trajectory-tracing code along similar lines, particularly codes that are (or could be) naturally embedded and/or projected onto curved quantum state-spaces.

    As for the relevance to fundamental physics, well, if Nature is a quantum simulation, and quantum simulations are more efficient on (curved) Kählerian state-spaces, then doesn’t it follow (heuristically) that the quantum state-space of Nature is itself likely to be dynamically Kählerian?

    That would be fun! 🙂

  17. ScentOfViolets Says:

    I’ve never gotten a good nontechnical reply (for suitable values of nontechnical), but aren’t the Einstein block-universe model and any model relying on QM basically incompatible? By QM, I mean in this case quantum indeterminism. Or is this a case where there are many histories, but only one future?

  18. harrison Says:

    Dammit, Scott, I finished the part about P_CTC, got up to get a Coke, and started thinking about BQP_CTC, only to come back and find you’d gone and answered it. 🙂

    But, okay, looking at it from a different perspective: If we only allow our quantum computer to do a computation on a sample from a classical distribution, instead of sampling from a mixed state, then isn’t it way easier to prove that that class is equal to PSPACE? (AFAICT, it’s essentially the same proof as for P_CTC!) Or am I totally off the mark here?

  19. Job Says:

    I just thought of the following. If you have x bits in your machine and want to upload them to a server, you have to send x bits over (if you want to be able to upload any possible x bits).

    However, if the machine and the server are synchronized, evolving according to some function, then you can just wait for the server to reach a state corresponding to the bits you want to upload, and send a single bit, arriving exactly when the server enters the target state, which the server converts into the uploaded bits. So just by bringing time into the process, we are able to compress any bit string to one bit.

    Of course the upload wouldn’t be very fast, but think of the bandwidth we’d save. I think these tradeoffs between time and space are really interesting.

  20. Scott Says:

    Harrison: Yes, it becomes much easier.

    Job: Good point!

  21. Job Says:

    Here’s some more food for thought. If the server state were evolving according to some increment function, then in order to be able to upload any x bits, we’d have to wait up to 2^x increments, or 2^x operations.

    Suppose the server is always incrementing. The first bit signals the start of an upload. Then, for every new bit that arrives, the number of increments (or number of operations, really) performed since the last bit arrived is taken to be the uploaded value.

    Not that great, but we can compromise in the following way. When we have to upload x bits, we split them into x/log x groups of log x bits each. To upload each group we have to send 1 bit and wait for 2^(log x) = x operations to elapse at the server. That means we send 1 + x/log x bits over and wait for a total of x^2/log x operations to elapse at the server. That’s not that bad. How’s my math?

    (I guess if we think of an operation as a “time bit”, then we can store x space bits in 1 + x/log x space bits and x^2/log x time bits.)
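    Job’s counting can be checked with a few lines of code (hypothetical helper name; this only verifies the worst-case arithmetic of the scheme sketched above, and is not a real protocol):

```python
import math

def upload_cost(x):
    """Worst-case (bits sent, server operations) to convey an x-bit
    message using groups of log2(x) bits, as in the scheme above."""
    g = max(1, int(math.log2(x)))   # bits per group
    groups = math.ceil(x / g)       # number of groups, ~ x / log x
    bits_sent = 1 + groups          # start signal + one bit per group
    server_ops = groups * (2 ** g)  # wait up to 2^g = x ops per group
    return bits_sent, server_ops

bits, ops = upload_cost(1024)       # x = 1024, log2(x) = 10
print(bits, ops)                    # 104 bits sent, 105472 ops ~ x^2/log x
```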

  22. asdf Says:

    What’s this thing on slashdot about quantum Viterbi decoding? Is it relevant to quantum computation?

  23. ScentOfViolets Says:

    Does ‘elapsed time encoding’ really work that way? This sounds like a demultiplexer scheme, where n bits determine which of 2^n output lines to select. In other words, a gain in transmission is matched by a (massive) loss of hardware resources.

    Also: Can someone explain how an Einstein block universe – completely deterministic, insofar as I know – is compatible with quantum indeterminism? The one implies completely static world lines; the other something that is ‘free’ in some sort of supertime. But if the latter case is true, doesn’t that add an extra ingredient, a ‘preferred’ present? Maybe my model is not accurate. I think of the universe in this sense as a circuit, with time’s arrow given by the direction of inputs and outputs. In the block universe, the circuits are immutable, as are the initial inputs. The QM model, and specifically the causal consistency model proposed by Deutsch, seems to say that the circuitry as well as the inputs are mutable. Have I got something wrong?

  24. Job Says:

    Here’s a great form of non-volatile data storage and compression, get a really long thin string, the length of which (in some measurement units) gives the data.

    Then compress it as much as you can and carry it around. If you can compress the string without breaking it, then you can always get the data back – that’s valid data storage.

    Or better, if you don’t like carrying around string, carry a small device that throws a small projectile some distance – the distance representing the encoded value. 🙂

  25. Job Says:

    What am I saying – you can break the string if you want, as long as you’re able to piece it back together without adding/removing length. As a matter of fact, if you want to encrypt the string, then do cut it up into pieces, and throw a few extra ones in there which are invalidated by some function only you know.

  26. Jonathan Vos Post Says:

    Re: #23 “one implies completely static world lines; the other something that is ‘free’ in some sort of supertime.”

    Isn’t that a metaphysical conundrum attacked by Conway et al in the (revised) “Free Will” theorem, recently revisited in arXiv?

  27. anonymous Says:

    http://www.livescience.com/strangenews/080813-spooky-limit.html

    any comments?

  28. Jonathan Vos Post Says:

    The Gisin experiment of #27? Better described at:

    Entanglement remains a mystery with the New Scientist-y hype and scifi stripped away.

  29. Alter-Anon Says:

    anonymous,

    It was the expected result, and it’s supposedly further evidence against a transmission mechanism being behind quantum entanglement (it would have to be THAT much faster than the speed of light). However, I personally don’t understand why >10,000c is so much more significant than >1c – i.e., how this ‘rules out’ physical theories with faster-than-light carriers any more than earlier results did. While I’m very much a non-believer in such theories, it seems pretty subjective.

  30. tover Says:

    anonymous: to travel 18 km, light needs 18 (km) / 300000 (km/s) = 0.00006 s = 60 µs. The article says the detection time is 1/(3*10^14) s, but I wouldn’t be so sure about that, because detection time is hard to estimate and could still be around 0.0001 s… I could be wrong, and then the speed of light really could be exceeded, or there could be some faster communication…
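    A quick sanity check of the arithmetic in this comment (my own numbers; the timing window dt below is purely illustrative, not a figure from the article):

```python
c = 299_792.458   # speed of light, km/s
d = 18.0          # separation between detection sites, km

t_light = d / c   # time light needs to cross 18 km
print(t_light)    # ~6.0e-5 s, i.e. about 60 microseconds

# If the two detection events are simultaneous to within dt seconds,
# a hidden signal between them would need speed at least d / dt.
dt = 3.0e-7       # hypothetical timing window (illustrative only)
v_min = d / dt    # km/s
print(v_min / c)  # roughly 200 times the speed of light for this dt
```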

  31. tover Says:

    By the way, anonymous: usually the entanglement measurement happens after one of the entangled photons has traveled only a few meters, so the other photon doesn’t have much time for traveling a long distance… Loosely speaking, the first photon is measured thousands of times earlier than the other entangled photon, because usually the source is at point A, together with the place of the first photon’s measurement, and only the other photon must travel to point B, some 18 km away…

  32. tover Says:

    Sorry – in this experiment, points A and B are separated by 18 km, and each point is 17.5 km from the source of entangled photons; see Fig. 4 here: http://arxiv.org/pdf/0803.2425v1