Will have to look at the paper again to see what I’m missing but it is a little above my pay grade. This is just a hobby.

I looked at the wiki page on this.

How is the relational QM interpretation different from decoherence? I think it has long been known that QM has a subjective element, in that different observers can have different views of the world (the cat in the box, for example). I think that is one of the motivations behind many-worlds. Decoherence allows us to analyze this in terms of the thermodynamic contact, or lack of it, between observers. This creates a kind of informational frame of reference, where people in different informational frames can disagree on elements of reality. That seems just like what the relational interpretation is saying, but with different words.

I’m not sure why he makes a point of rejecting the universal wave function. There may be practical difficulties in isolating yourself from the rest of the universe, but if you could, you would have to deal with the universal wave function. There is simply no mileage in worrying about it at all.

Personally, while you can deny that a universal wavefunction is *useful* or *needed*, I don’t understand on what grounds you could prevent someone else from considering it as a consistent mathematical construct, as long as you accept the universal validity of QM.

But that’s beside the point. For what I was saying above, it would suffice to retreat to the weaker claim: *if* there’s any sense in which the evolution of the universe is still deterministic, even in QM, *then* it’s that of unitary evolution of the universe’s wavefunction.

That assumes there is such a thing as a universal wavefunction. Rovelli’s relational QM interpretation says no such universal wavefunction exists. That is simply to say there is no such thing as the universal “state of the universe” at any time T. Another way of saying this: there is no universal privileged observer.

Yeah. I read your paper last night and noticed that you used gzip to estimate the complexity of the coffee. The first thing that occurred to me is that you don’t need to estimate the complexity, since you have the program that produced the output: it acts as an upper bound on the complexity. All you have to do is specify the number of generations… yeah, log(t) bits. So simple. I really should have clicked to this. I just wasn’t thinking about how small log(t) is.
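To make the point concrete, here is a minimal sketch. The toy one-dimensional XOR automaton below is a hypothetical stand-in for the paper’s coffee simulation (not its actual program), and the 400-byte program size is an assumed figure; the point is only that a gzip measurement is an empirical estimate, while the generating program plus the bits needed to write down t is a hard upper bound that grows like log(t).

```python
import math
import zlib

def run_automaton(t, width=256):
    """Toy 1-D XOR automaton: a hypothetical stand-in for the coffee simulation."""
    row = [0] * width
    row[width // 2] = 1
    for _ in range(t):
        row = [row[i - 1] ^ row[(i + 1) % width] for i in range(width)]
    return bytes(row)

t = 4096
state = run_automaton(t)

# gzip gives only an empirical *estimate* of the state's complexity...
gzip_bits = 8 * len(zlib.compress(state))

# ...but the generating program gives a hard upper bound:
#   K(state) <= |program| + (bits to write down t) + O(1),
# and the log2(t) term stays tiny even for astronomical t.
program_bits = 8 * 400 + math.ceil(math.log2(t + 1))  # 400 bytes: assumed program size
```

Even for t around 10^18, the log2(t) term is only about 60 bits, which is why specifying the generation count costs almost nothing.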

Reducing a person to their DNA is an imperfect analogy for reducing everything that happens in a given universe to its laws and initial state, since as you rightly point out, a person is not a closed system. But it fits the theme of trying to reduce a large unfolding process to a single originating seed. Sometimes that makes sense, but not when aspects of the process are explicitly what interests you (as is the case with evolution).

I forgot to comment on one thing and I’m back because I think it is important.

…or only in the sense that your whole identity was “encoded” in your DNA at the moment of conception.

No, absolutely not. I am not a closed system. For example, a computer program connected to the internet can produce output with arbitrarily high Kolmogorov complexity simply by printing out random web pages. I can also decrease my entropy for the same reason.

There are different notions of “information content” relevant here. If you want to talk about Kolmogorov complexity, then it actually CAN increase with time—albeit, only logarithmically.

Really? I’ll take your word for it, as you no doubt know this stuff far better than I do. But I can’t for the life of me figure out where my intuition is failing me. A trillion digits of pi, for example, cannot have a higher Kolmogorov complexity than the relatively small program that produced them. This follows directly from the definition of Kolmogorov complexity, and it should be true of any program at all that produces any output at all.

In a deeper sense, Kolmogorov complexity should be strongly related to Shannon information. After all, taking a large string and deriving a shorter program that produces it is just data compression, right? So where am I going wrong?
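The pi example can be made literal. A short, self-contained program (Gibbons’s unbounded spigot algorithm, sketched here in Python) emits the decimal digits of pi one at a time; to describe the first n digits you only need this program plus roughly log2(n) bits to write down n, so their Kolmogorov complexity stays small no matter how large n gets, even though the digit stream itself looks patternless.

```python
def pi_digits(n):
    """First n decimal digits of pi via Gibbons's unbounded spigot algorithm."""
    digits = []
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    while len(digits) < n:
        if 4 * q + r - t < m * t:
            digits.append(m)
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return digits

# The digits look random, but K(first n digits) <= |this program| + O(log n).
print(pi_digits(10))  # → [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```

Notably, gzip would barely compress a long stretch of these digits at all: a generic compressor cannot find the structure that the tiny spigot program pins down exactly.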

But the easiest way to dispel confusions—easier by far than arguing about abstract definitions of information content—is to consider a concrete example, like Conway’s Game of Life. If you started a life board in some simple initial state (like only 50 or 100 alive cells) and let it evolve for long enough, on a large enough board, it’s plausible that you’d eventually see complex structures evolve that were subject to Darwinian evolution. But were these creatures “encoded” in the initial state?

Well yes, obviously. (Actually, I think it is inconceivable that such a small amount of information could produce that apparent complexity. But if it did…) For example, if I start with the large state and derive a smaller state that creates it, then that is just data compression: I have encoded the large state into a smaller state. I can only do this if the large state is not random, and again we have the connection between Kolmogorov complexity and Shannon information. Both the Kolmogorov complexity and the Shannon information of the large state cannot exceed those of the smaller state. Again, what am I doing wrong?
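For what it’s worth, the Life side of the thought experiment is easy to play with directly. Here is a minimal sketch of one generation step, using a standard five-cell glider as the “tiny seed”: after four generations the glider reappears, identical but shifted diagonally, so its entire unbounded future really is pinned down by those five cells plus the rules.

```python
from collections import Counter

def life_step(alive):
    """One Game of Life generation; `alive` is a set of (x, y) live cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in alive
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A glider: five cells whose entire future is "encoded" in this tiny seed.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = life_step(state)

# After 4 generations the glider reappears, translated by (1, 1).
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

The sparse set-of-cells representation keeps the board unbounded, which matches the “large enough board” framing in the quoted passage.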

Only in the sense, perhaps, that the proof of Fermat’s Last Theorem is “encoded” in the axioms of set theory,…

Again I would say yes, obviously. The choice of axioms is the seed that produces the vast, chaotic, computationally intractable set of all that is true in that system: a fractal of truth. As an analogy, think of how a tiny program can produce the endless intricacy of the Mandelbrot set.
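The Mandelbrot analogy can be made literal: the whole set is pinned down by a few lines of code, so its Kolmogorov complexity is tiny however intricate its boundary looks. A minimal membership test (the escape-radius criterion, with an arbitrary iteration cutoff standing in for the exact definition):

```python
def in_mandelbrot(c, max_iter=200):
    """Return True if c appears to lie in the Mandelbrot set,
    i.e. iterating z -> z*z + c from z = 0 stays bounded (|z| <= 2)."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

print(in_mandelbrot(-1 + 0j))  # → True  (orbit cycles 0, -1, 0, -1, ...)
print(in_mandelbrot(1 + 0j))   # → False (orbit 0, 1, 2, 5, ... escapes)
```

Everything one ever plots of the set comes out of that handful of lines, which is exactly the “simple seed, vast output” point.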

Now, the Mandelbrot set has been described as the most complex object in math. Yeah, well… anyway, I think it was Chaitin who pointed out that the study of chaos is really the study of simplicity rather than complexity: the study of how objects like the Mandelbrot set that look very complex can have very simple descriptions. Kolmogorov complexity, he points out, is the study of complex-looking objects that are incompressible and so are genuinely complex. That is the real study of complexity.

But my intuition seems to be leading me wrong somewhere.

But the easiest way to dispel confusions—easier by far than arguing about abstract definitions of information content—is to consider a concrete example, like Conway’s Game of Life. If you started a life board in some simple initial state (like only 50 or 100 alive cells) and let it evolve for long enough, on a large enough board, it’s plausible that you’d eventually see complex structures evolve that were subject to Darwinian evolution. But were these creatures “encoded” in the initial state? Only in the sense, perhaps, that the proof of Fermat’s Last Theorem is “encoded” in the axioms of set theory, or only in the sense that your whole identity was “encoded” in your DNA at the moment of conception. I.e., only in a—dare I use the word?—reductive sense, one that short-circuits the entire story by which the thing we’re trying to explain came into being, after that story’s first few lines.

And yes, JimV #27, of course it’s true that in our quantum-mechanical universe, the evolution of what we see around us is fundamentally indeterministic—it’s only the evolution of the entire wavefunction that’s deterministic. By talking about what would still be true *even if* the world had been classical, I was trying to make the conclusion only stronger.