## Is “information is physical” contentful?

“Information is physical.”

This slogan seems to have originated around 1991 with Rolf Landauer. It’s ricocheted around quantum information for the entire time I’ve been in the field, incanted in funding agency reports and popular articles and at the beginnings and ends of talks.

But what the hell does it mean?

There are many things it’s *taken* to mean, in my experience, that don’t make a lot of sense when you think about them—or else they’re vacuously true, or purely a matter of perspective, or not faithful readings of the slogan’s words.

For example, some people seem to use the slogan to mean something more like its converse: “physics is informational.” That is, the laws of physics are ultimately not about mass or energy or pressure, but about bits and computations on them. As I’ve often said, my problem with that view is less its audacity than its timidity! It’s like, what would the universe have to do in order *not* to be informational in this sense? “Information” is just a name we give to whatever picks out one element from a set of possibilities, with the “amount” of information given by the log of the set’s cardinality (and with suitable generalizations to infinite sets, nonuniform probability distributions, yadda yadda). So, as long as the laws of physics take the form of telling us that some observations or configurations of the world are possible and others are not, or of giving us probabilities for each configuration, *no duh* they’re about information!
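Concretely, that bookkeeping fits in a few lines of Python (a trivial sketch, nothing beyond the definitions just given):

```python
import math

def info_bits(num_possibilities: int) -> float:
    """Information, in bits: log of the cardinality of the set of possibilities."""
    return math.log2(num_possibilities)

def shannon_entropy(probs) -> float:
    """The generalization to a nonuniform probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(info_bits(8))                        # 3.0 bits to pick one of 8 outcomes
print(shannon_entropy([0.25] * 4))         # 2.0 bits: the uniform case recovers log2(4)
print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5 bits: a skewed distribution needs fewer
```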

Other people use “information is physical” to pour scorn on the idea that “information” could mean anything without some actual physical instantiation of the abstract 0’s and 1’s, such as voltage differences in a loop of wire. Here I certainly agree with the tautology that in order to *exist physically*—that is, be embodied in the physical world—a piece of information (like a song, video, or computer program) does need to be embodied in the physical world. But my inner Platonist slumps in his armchair when people go on to assert that, for example, it’s meaningless to discuss the first prime number larger than 10^{10^125}, because according to post-1998 cosmology, one couldn’t fit its digits inside the observable universe.

If the cosmologists revise their models next week, will this prime suddenly burst into existence, with all the mathematical properties that one could’ve predicted for it on general grounds—only to fade back into the netherworld if the cosmologists revise their models again? Why would *anyone* want to use language in such a tortured way?

Yes, brains, computers, yellow books, and so on that encode mathematical knowledge comprise only a tiny sliver of the physical world. But it’s equally true that the physical world we observe comprises only a tiny sliver of mathematical possibility-space.

Still other people use “information is physical” simply to express their enthusiasm for the modern merger of physical and information sciences, as exemplified by quantum computing. Far be it from me to temper that enthusiasm: rock on, dudes!

Yet others use “information is physical” to mean that the *rules* governing information processing and transmission in the physical world aren’t knowable *a priori*, but can only be learned from physics. This is clearest in the case of quantum information, which has its own internal logic that generalizes the logic of classical information. But in some sense, we didn’t need quantum mechanics to tell us this! *Of course* the laws of physics have ultimate jurisdiction over whatever occurs in the physical world, information processing included.

My biggest beef, with *all* these unpackings of the “information is physical” slogan, is that none of them really engage with any of the deep truths that we’ve learned about physics. That is, we could’ve had more-or-less the same debates about any of them, even in a hypothetical world where the laws of physics were completely different.

So then what *should* we mean by “information is physical”? In the rest of this post, I’d like to propose an answer to that question.

We get closer to the meat of the slogan if we consider some actual physical phenomena, say in quantum mechanics. The double-slit experiment will do fine.

Recall: you shoot photons, one by one, at a screen with two slits, then examine the probability distribution over where the photons end up on a second screen. You ask: does that distribution contain alternating “light” and “dark” regions, the signature of interference between positive and negative amplitudes? And the answer, predicted by the math and confirmed by experiment, is: *yes, but only if the information about which slit the photon went through failed to get recorded anywhere else in the universe, other than the photon location itself.*

Here a skeptic interjects: but that *has* to be wrong! The criterion for where a physical particle lands on a physical screen can’t possibly depend on anything as airy as whether “information” got “recorded” or not. For what counts as “information,” anyway? As an extreme example: what if God, unbeknownst to us mortals, took divine note of which slit the photon went through? Would *that* destroy the interference pattern? If so, then every time we do the experiment, are we collecting data about the existence or nonexistence of an all-knowing God?

It seems to me that the answer is: *insofar as the mind of God can be modeled as a tensor factor in Hilbert space, yes, we are.* And crucially, if quantum mechanics is universally true, then the mind of God would *have* to be such a tensor factor, in order for its state to play any role in the prediction of observed phenomena.

To say this another way: it’s obvious and unexceptionable that, by observing a physical system, you can often learn something about what information must be in it. For example, you need never have heard of DNA to deduce that chickens must somehow contain information about making more chickens. What’s much more surprising is that, in quantum mechanics, you can often deduce things about what information *can’t* be present, anywhere in the physical world—because if such information existed, even a billion light-years away, it would necessarily have a physical effect that you don’t see.
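In code, the distinction comes down to a single line. Here's a toy numerical sketch (the slit-to-screen amplitudes are made up for illustration; the "record" could live in any tensor factor whatsoever, whether a lab device, a stray photon, or the mind of God):

```python
import numpy as np

x = np.linspace(-5, 5, 11)   # positions on the second screen
# toy amplitudes for reaching x via slit 0 and slit 1 (two interfering plane waves)
a0 = np.exp(1j * 2.0 * x) / np.sqrt(2)
a1 = np.exp(-1j * 2.0 * x) / np.sqrt(2)

# No which-path record anywhere in the universe: amplitudes add, then we square.
p_coherent = np.abs(a0 + a1) ** 2

# Which-path info recorded in *some* other tensor factor: the records are
# orthogonal, the cross terms vanish, and probabilities add instead.
p_recorded = np.abs(a0) ** 2 + np.abs(a1) ** 2

print(p_coherent.round(3))   # oscillates between ~0 and ~2: fringes
print(p_recorded.round(3))   # flat at 1: no fringes
```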

Another famous example here concerns identical particles. You may have heard the slogan that “if you’ve seen one electron, you’ve seen them all”: that is, apart from position, momentum, and spin, every two electrons have *exactly* the same mass, same charge, same every other property, including even any properties yet to be discovered. Again the skeptic interjects: but that *has* to be wrong. Logically, you could only ever confirm that two electrons were *different*, by observing a difference in their behavior. Even if the electrons had behaved identically for a billion years, you couldn’t rule out the possibility that they were actually different, for example because of tiny nametags (“Hi, I’m Emily the Electron!” “Hi, I’m Ernie!”) that had no effect on any experiment you’d thought to perform, but were visible to God.

You can probably guess where this is going. Quantum mechanics says that, no, you *can* verify that two particles are perfectly identical by doing an experiment where you swap them and see what happens. If the particles are identical in all respects, then you’ll see quantum interference between the swapped and un-swapped states. If they aren’t, you won’t. The *kind* of interference you’ll see is different for fermions (like electrons) than for bosons (like photons), but the basic principle is the same in both cases. Once again, quantum mechanics lets you verify that a specific type of information—in this case, information that distinguishes one particle from another—was *not* present anywhere in the physical world, because if it were, it would’ve destroyed an interference effect that you in fact saw.
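As a toy version of that swap test, consider two sources firing single particles at two detectors through a balanced beamsplitter (the single-particle amplitudes below are the standard 50/50 beamsplitter ones, but the setup is illustrative, not any specific experiment from the post):

```python
import numpy as np

# single-particle amplitudes for ending up at detectors D1 and D2
a = {"D1": 1 / np.sqrt(2), "D2": 1j / np.sqrt(2)}   # particle from source A
b = {"D1": 1j / np.sqrt(2), "D2": 1 / np.sqrt(2)}   # particle from source B

# two indistinguishable ways for D1 and D2 to each fire once
direct  = a["D1"] * b["D2"]     # A -> D1, B -> D2
swapped = a["D2"] * b["D1"]     # A -> D2, B -> D1 (the particles exchanged)

p_bosons = abs(direct + swapped) ** 2            # swapped path interferes, + sign
p_fermis = abs(direct - swapped) ** 2            # swapped path interferes, - sign
p_tagged = abs(direct) ** 2 + abs(swapped) ** 2  # secret "nametags": no interference

print(p_bosons, p_fermis, p_tagged)   # 0.0, 1.0, 0.5
```

The `p_bosons = 0` case is the Hong-Ou-Mandel effect: perfectly identical photons never trigger both detectors at once, and any hidden nametag, visible to God or not, would push the coincidence rate back up to 1/2. That 1/2 is exactly the "information that wasn't there" being detected.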

This, I think, already provides a meatier sense in which “information is physical” than any of the senses discussed previously.

But we haven’t gotten to the filet mignon yet. The late, great Jacob Bekenstein will forever be associated with the discovery that information, wherever and whenever it occurs in the physical world, *takes up a minimum amount of space*. The most precise form of this statement, called the covariant entropy bound, was worked out in detail by Raphael Bousso. Here I’ll be discussing a looser version of the bound, which holds in “non-pathological” cases, and which states that a bounded physical system can store at most A/(4 ln 2) bits of information, where A is the area in Planck units of any surface that encloses the system—so, about 10^{69} bits per square meter. (Actually it’s 10^{69} *qubits* per square meter, but because of Holevo’s theorem, an upper bound on the number of qubits is also an upper bound on the number of classical bits that can be reliably stored in a system and then retrieved later.)

You might have heard of the famous way Nature enforces this bound. Namely, if you tried to create a hard drive that stored more than 10^{69} bits per square meter of surface area, the hard drive would necessarily collapse to a black hole. And from that point on, the information storage capacity would scale “only” with the area of the black hole’s event horizon—a black hole itself being the densest possible hard drive allowed by physics.

Let’s hear once more from our skeptic. “Nonsense! *Matter* can take up space. *Energy* can take up space. But information? Bah! That’s just a category mistake. For a proof, suppose God took one of your black holes, with a 1-square-meter event horizon, which already had its supposed maximum of ~10^{69} bits of information. And suppose She then created a bunch of new fundamental fields, which didn’t interact with gravity, electromagnetism, or any of the other fields that we know from observation, but which had the effect of encoding 10^{300} new bits in the region of the black hole. Presto! An unlimited amount of additional information, exactly where Bekenstein said it couldn’t exist.”

We’d like to pinpoint what’s wrong with the skeptic’s argument—and do so in a self-contained, non-question-begging way, a way that doesn’t pull any rabbits out of hats, other than the general principles of relativity and quantum mechanics. I was confused myself about how to do this, until a month ago, when Daniel Harlow helped set me straight (any remaining howlers in my exposition are 100% mine, not his).

I believe the logic goes like this:

1. Relativity—even just Galilean relativity—demands that, in flat space, the laws of physics must have the same form for all inertial observers (i.e., all observers who move through space at constant speed).
2. Anything in the physical world that varies in space—say, a field that encodes different bits of information at different locations—also varies in *time*, from the perspective of an observer who moves through the field at a constant speed.
3. Combining 1 and 2, we conclude that *anything that can vary in space can also vary in time*. Or to say it better, there’s only one kind of varying: varying in spacetime.
4. More strongly, special relativity tells us that there’s a specific numerical conversion factor between units of space and units of time: namely the speed of light, c. Loosely speaking, this means that if we know the *rate* at which a field varies across space, we can also calculate the rate at which it varies across time, and vice versa.
5. Anything that varies across time carries energy. Why? Because this is essentially the *definition* of energy in quantum mechanics! Up to a constant multiple (namely, Planck’s constant), energy is the expected speed of rotation of the global phase of the wavefunction, when you apply your Hamiltonian. If the global phase rotates at the slowest possible speed, then we take the energy to be zero, and say you’re in a vacuum state. If it rotates at the next highest speed, we say you’re in a first excited state, and so on. Indeed, assuming a time-independent Hamiltonian, the evolution of any quantum system can be fully described by simply decomposing the wavefunction into a superposition of energy eigenstates, then tracking the phase of each eigenstate’s amplitude as it loops around and around the unit circle. No energy means no looping around means nothing ever changes.
6. Combining 3 and 5, any field that varies across space carries energy.
7. More strongly, combining 4 and 5, if we know how *quickly* a field varies across space, we can lower-bound how much energy it has to contain.
8. In general relativity, anything that carries energy couples to the gravitational field. This means that anything that carries energy necessarily has an observable effect: if nothing else, its effect on the warping of spacetime. (This is dramatically illustrated by dark matter, which is currently observable via its spacetime warping effect *and nothing else*.)
9. Combining 6 and 8, any field that varies across space couples to the gravitational field.
10. More strongly, combining 7 and 8, if we know how quickly a field varies across space, then we can lower-bound by how much it has to warp spacetime. This is so because of another famous (and distinctive) feature of gravity: namely, the fact that it’s universally attractive, so all the warping contributions add up.
11. But in GR, spacetime can only be warped by so much before we create a black hole: this is the famous Schwarzschild bound.
12. Combining 10 and 11, the information contained in a physical field can only vary so quickly across space, before it causes spacetime to collapse to a black hole.

Summarizing where we’ve gotten, we could say: *any information that’s spatially localized at all, can only be localized so precisely*. In our world, the more densely you try to pack 1’s and 0’s, the more energy you need, therefore the more you warp spacetime, until all you’ve gotten for your trouble is a black hole. Furthermore, if we rewrote the above conceptual argument in math—keeping track of all the G’s, c’s, h’s, and so on—we could derive a quantitative *bound* on how much information there can be in a bounded region of space. And if we were careful enough, that bound would be precisely the holographic entropy bound, which says that the number of (qu)bits is at most A/(4 ln 2), where A is the area of a bounding surface in Planck units.

Let’s pause to point out some interesting features of this argument.

Firstly, we pretty much needed the whole kitchen sink of basic physical principles: special relativity (both the equivalence of inertial frames and the finiteness of the speed of light), quantum mechanics (in the form of the universal relation between energy and frequency), and finally general relativity and gravity. All three of the fundamental constants G, c, and h made appearances, which is why all three show up in the detailed statement of the holographic bound.

But secondly, gravity only appeared from step 8 onwards. Up till then, everything could be said solely in the language of *quantum field theory*: that is, quantum mechanics plus special relativity. The result would be the so-called Bekenstein bound, which upper-bounds the number of bits in any spatial region by the *product* of the region’s radius and its energy content. I learned that there’s an interesting history here: Bekenstein originally deduced this bound using ingenious thought experiments involving black holes. Only later did people realize that the Bekenstein bound can be derived purely within QFT (see here and here for example)—in contrast to the holographic bound, which really *is* a statement about quantum gravity. (An early hint of this was that, while the holographic bound involves Newton’s gravitational constant G, the Bekenstein bound doesn’t.)
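As a sanity check on the "no G" claim, here's the Bekenstein bound worked out numerically for a made-up example (a 1 kg mass enclosed in a 1 m sphere); notice that only ħ and c appear:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
c    = 2.99792458e8      # speed of light, m/s

def bekenstein_bits(radius_m: float, energy_j: float) -> float:
    """Bekenstein bound S <= 2*pi*k*R*E/(hbar*c), converted from nats to bits."""
    return 2 * math.pi * radius_m * energy_j / (hbar * c * math.log(2))

energy = 1.0 * c**2                            # E = m c^2 for a 1 kg mass
print(f"{bekenstein_bits(1.0, energy):.2e}")   # ~2.6e43 bits
```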

Thirdly, speaking of QFT, some readers might be struck by the fact that at no point in our 12-step program did we ever seem to need QFT machinery. Which is fortunate, because if we *had* needed it, I wouldn’t have been able to explain any of this! But here I have to confess that I cheated slightly. Recall step 4, which said that “if you know the rate at which a field varies across space, you can calculate the rate at which it varies across time.” It turns out that, in order to give that sentence a definite meaning, one uses the fact that in QFT, space and time derivatives in the Hamiltonian need to be related by a factor of c, since otherwise the Hamiltonian wouldn’t be Lorentz-invariant.

Fourthly, eagle-eyed readers might notice a loophole in the argument. Namely, *we never upper-bounded how much information God could add to the world, via fields that are constant across all of spacetime*. For example, there’s nothing to stop Her from creating a new scalar field that takes the same value everywhere in the universe—with that value, in suitable units, encoding 10^{50000} separate divine thoughts in its binary expansion. But OK, being constant, such a field would interact with nothing and affect no observations—so Occam’s Razor itches to slice it off, by rewriting the laws of physics in a simpler form where that field is absent. If you like, such a field would at most be a comment in the source code of the universe: it could be as long as the Great Programmer wanted it to be, but would have no observable effect on those of us living inside the program’s execution.

Of course, even before relativity and quantum mechanics, information had already been playing a surprisingly fleshy role in physics, through its appearance as *entropy* in 19^{th}-century thermodynamics. Which leads to another puzzle. To a computer scientist, the concept of entropy, as the log of the number of microstates compatible with a given macrostate, seems clear enough, as does the intuition for why it should increase monotonically with time. Or at least, to whatever extent we’re confused about these matters, we’re no *more* confused than the physicists are!

But then why should this information-theoretic concept be so closely connected to tangible quantities like temperature, and pressure, and energy? From the mere assumption that a black hole has a nonzero entropy—that is, that it takes many bits to describe—how could Bekenstein and Hawking have possibly deduced that it also has a nonzero temperature? Or: if you put your finger into a tub of hot water, does the heat that you feel somehow reflect *how many bits are needed to describe the water’s microstate*?

Once again our skeptic pipes up: “but surely God could stuff as many additional bits as She wanted into the microstate of the hot water—for example, in degrees of freedom that are still unknown to physics—without the new bits having any effect on the water’s temperature.”

But we should’ve learned by now to doubt this sort of argument. There’s no general principle, in our universe, saying that you can hide as many bits as you want in a physical object, without those bits influencing the object’s observable properties. On the contrary, in case after case, our laws of physics seem to be intolerant of “wallflower bits,” which hide in a corner without talking to anyone. If a bit is there, the laws of physics want it to affect other nearby bits and be affected by them in turn.

In the case of thermodynamics, the assumption that does all the real work here is that of *equidistribution*. That is, *whatever* degrees of freedom might be available to your thermal system, your gas in a box or whatever, we assume that they’re all already “as randomized as they could possibly be,” subject to a few observed properties like temperature and volume and pressure. (At least, we assume that in classical thermodynamics. Non-equilibrium thermodynamics is a whole different can of worms, worms that don’t stay in equilibrium.) Crucially, we assume this despite the fact that we might not even *know* all the relevant degrees of freedom.

Why is this assumption justified? “Because experiment bears it out,” the physics teacher explains—but we can do better. The assumption is justified because, as long as the degrees of freedom that we’re talking about all interact with each other, they’ve already had plenty of time to equilibrate. And conversely, if a degree of freedom *doesn’t* interact with the stuff we’re observing—or with anything that interacts with the stuff we’re observing, etc.—well then, who cares about it anyway?

But now, because the microscopic laws of physics have the fundamental property of *reversibility*—that is, they never destroy information—a new bit has to go *somewhere*, and it can’t overwrite degrees of freedom that are already fully randomized. This is why, if you pump more bits of information into a tub of hot water, while keeping it at the same volume, the new bits have nowhere to go except into pushing up the energy. Now, there are often ways to push up the energy other than by raising the temperature—the concept of specific heat, in chemistry, is precisely about this—but if you need to stuff more bits into a substance, at the cost of raising its energy, certainly one of the obvious ways to do it is to describe a greater range of possible speeds for the water molecules. So since that *can* happen, by equidistribution it typically *does* happen, which means that the molecules move faster on average, and your finger feels the water get hotter.
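To attach one rough number to this: at fixed volume, dS = dU/T, so treating the water as an equilibrium system at temperature T, storing one extra bit (dS = k ln 2) costs at least kT ln 2 of energy. A sketch:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

def energy_per_bit(temp_k: float) -> float:
    """Minimum energy cost of one extra bit at fixed volume: dU = T dS = k T ln 2."""
    return k * temp_k * math.log(2)

print(f"{energy_per_bit(330.0):.2e} J per bit in hot (~330 K) water")
# equivalently, one joule of added energy buys room for ~3e20 new bits:
print(f"{1.0 / energy_per_bit(330.0):.1e}")
```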

In summary, our laws of physics are structured in such a way that *even pure information often has “nowhere to hide”*: if the bits are there at all in the abstract machinery of the world, then they’re forced to pipe up and have a measurable effect. And this is not a tautology, but comes about only because of nontrivial facts about special and general relativity, quantum mechanics, quantum field theory, and thermodynamics. And this is what I think people should mean when they say “information is physical.”

Anyway, if this was all obvious to you, I apologize for having wasted your time! But in my defense, it was never explained to me quite this way, nor was it sorted out in my head until recently—even though it seems like one of the most basic and general things one can possibly say about physics.

**Endnotes.** Thanks again to Daniel Harlow, not only for explaining the logic of the holographic bound to me but for several suggestions that improved this post.

Some readers might suspect circularity in the arguments we’ve made: are we merely saying that “any information that has observable physical consequences, has observable physical consequences”? No, it’s more than that. In all the examples I discussed, the magic was that we inserted certain information into our *abstract mathematical description* of the world, taking no care to ensure that the information’s presence would have any observable consequences whatsoever. But then the principles of quantum mechanics, quantum gravity, or thermodynamics *forced* the information to be detectable in very specific ways (namely, via the destruction of quantum interference, the warping of spacetime, or the generation of heat respectively).

Comment #1 July 20th, 2017 at 8:54 pm

This is of course wonderful, and a great way of thinking about how the physicality of information manifests in the real world.

I have a slightly simpler (simplistic?) way of answering the basic question of why we should think information is physical at all. Namely, ask yourself: is *energy* physical? Most people would probably say yes. But energy is a quality, not a substance– it’s not that something is energy, it’s that it has a certain amount of energy. We don’t need to ever mention energy when solving the basic equations of physics, if we don’t have to. But it’s enormously useful to do so– it gives us insight we wouldn’t otherwise have. Information is exactly the same. Casting processes in terms of information makes certain physical processes much more clear.

I suspect people might be reluctant to think that way because energy seems like an objective quantity (though it’s actually not– the energy of a particle depends on the frame in which you observe it), while information seems subjective, in the sense that we may or may not have information about something, regardless of what it’s doing. That’s why the idea of mutual information is really useful.

Comment #2 July 20th, 2017 at 9:21 pm

Sean–

You write that “energy is a quality, not a substance,” saying that “We don’t need to ever mention energy when solving the basic equations of physics, if we don’t have to.”

This may be true in Newtonian mechanics, where energy is a convenient mathematical tool that one does not, in principle, ever have to use.

But how does one do GR without mentioning energy? Gravity really seems to care about energy-momentum as a physical thing. That’s what gravity couples to, after all! The energy-momentum tensor that appears on the right-hand side of the Einstein field equation seems important and not something we could choose not to invoke. And, yes, energy alone isn’t Lorentz invariant, but energy-momentum is certainly a meaningful quantity in both SR and GR.

Comment #3 July 20th, 2017 at 9:24 pm

Wonderfully thorough! But I suspect hard for many to reproduce. Can we tweet it as information is physical because it produces measurable change in a physical system?

Comment #4 July 20th, 2017 at 9:27 pm

Sean #1: Right, what I’m trying to do here is answer the skeptic who says that information can’t possibly be physical in the same way energy is, because unlike energy, the “information” we ascribe to a system just increases without bound as new facets of the system are discovered, with no way to “weigh” the system to find out whether we’re even close to accounting for all of it. I was trying to explain how, just like mass and energy can’t hide because they always gravitate, so localized information can’t hide either, because it always carries energy!

(I know you know all this; I’m just explaining for others.)

Comment #5 July 20th, 2017 at 9:29 pm

Chris #3: I give you permission. 😉

Comment #6 July 20th, 2017 at 10:19 pm

What is the definition of “information” in this context?

Comment #7 July 20th, 2017 at 10:25 pm

Very interesting! I’d basically always interpreted it in the way Sean Carroll mentions.

I must nitpick here though:

You mean, to a computer scientist who is OK with the idea of having some implicitly preferred coarse-graining. 🙂

(I know, I know, you have said before that one can probably infer one from locality, but still…)

Comment #8 July 20th, 2017 at 10:49 pm

So when I have heard and said “Information is physical”, I was definitely thinking about the latter examples, not the former, but definitely in a fuzzy, lazy sort of way. Cheers on the explanation that is both thorough and clear.

Comment #9 July 20th, 2017 at 10:50 pm

An excellent essay. Wojciech long ago impressed on me the deepness of the (strictly quantum) fact that things decohere when *anything* finds out about them, but I don’t think I’d noticed the parallels to stat mech. One nitpick though:

> Why is this assumption justified? … because, as long as the degrees of freedom that we’re talking about all interact with each other, they’ve already had plenty of time to equilibrate. And conversely, if a degree of freedom doesn’t interact with the stuff we’re observing…well then, who cares about it anyway?

By eliminating the middle group, you’ve basically assumed away the entire field of non-equilibrium statistical mechanics. This is better than retreating to “because experiment says so”, but only a bit better.

Comment #10 July 20th, 2017 at 11:10 pm

epistememe #6:

What is the definition of “information” in this context?

For most of what I was talking about, “log of the number of distinguishable states” works perfectly fine. Or in quantum contexts, one could say log of the Hilbert space dimension. Except that, when I discussed the double-slit experiment, I meant mutual information with the degree of freedom in question (which slit the photon went through).

Comment #11 July 20th, 2017 at 11:20 pm

I tend to agree with the skeptic.

In the double slit experiment, Bohmian mechanics describes particles following an actual trajectory, which is information that could, in principle, be known without destroying the interference. I’m not a fan of any particular interpretation, but I just don’t see why knowledge must *necessarily* cause collapse.

If we insisted that anything that knows which slit the photon went through must obey quantum mechanics, and must physically make a measurement, then sure, in any interpretation that will cause collapse. But then haven’t we just made the whole argument circular? Haven’t we just *assumed* that the thing doing the measuring must be physical?

Comment #12 July 20th, 2017 at 11:23 pm

Jess #9: Alright, I’ll add something tomorrow morning about how I was only talking about equilibrium stat mech—i.e. just trying to understand how Boltzmann and Gibbs could possibly have passed from the study of heat, to what today we would call “the number of bits.” And that the answer has to do with the fact that, in the types of systems they cared about, ones in thermal equilibrium, added bits have only a limited number of places to go other than into more heat.

Comment #13 July 20th, 2017 at 11:46 pm

Scott #12: For what it’s worth, I’m sure there’s a rough story to be told about how, as the number of particles goes to infinity, each degree of freedom in the universe is “very likely” (or something) to be either fully thermalized with or effectively decoupled from the system of interest. After all, it’s not an accident that thermodynamics is so widely applicable. But you should ask someone more informed than me.

Comment #14 July 21st, 2017 at 1:27 am

I am not a physicist, so I am probably way out of my league here. But how can you bound the information in a region by its surface area? I would think that a region contains at least as much information as a subset of the region, but a subset may have larger surface area. If I take a big ball of space and crumple it a lot so that it fits in a smaller ball, shouldn’t the small ball contain all of the information in the crumpled region?

Comment #15 July 21st, 2017 at 1:41 am

Having thought deeply about thermodynamics in graduate school, I have a different interpretation of Landauer’s “information is physical.”

My interpretation is: “The more information you have about a system, the more work you can extract from it.”

Consider the following three scenarios:

(1) If you have complete knowledge of a box of gas particles (a la Maxwell’s demon), you can, in principle, design an engine that extracts 100% of the energy as useful work.

(2) If you have incomplete knowledge of a box of gas particles (e.g., one side is hotter than the other), you can design an engine to extract some of its energy as work, while some energy remains trapped as inextractable energy. No matter how clever your engine, on average some energy will be impossible to extract.

(3) If you have zero knowledge of a box of gas particles, you cannot, on average, extract any energy from it. (Assuming volume is fixed.)

All this is a long way of saying that the Gibbs free energy depends on the information known about the system. The more you know about a system, the more you can, in principle, extract its energy. In this sense, information turns energy into free energy. And, in the sense that it can generate free energy, information is physical.
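To put numbers on scenario (1): a Szilard engine extracts at most kT ln 2 of work per bit you know about the gas (the Landauer conversion factor, run in reverse). A rough sketch:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

def extractable_work(bits_known: float, temp_k: float) -> float:
    """Maximum work extractable from a bath at temperature T, per Szilard-engine
    accounting: k*T*ln(2) joules for each known bit about the system."""
    return bits_known * k * temp_k * math.log(2)

# knowing which half of the box a single gas molecule is in, at room temperature:
print(f"{extractable_work(1, 300.0):.2e} J")   # ~2.9e-21 J from one bit
```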

That is how I interpret Landauer’s “information is physical.”

Comment #16 July 21st, 2017 at 2:25 am

Ted #15: Thanks; I hadn’t heard that interpretation! (Of course, it’s also true that the more information I have about a system, the more I might know about where to hit it with a hammer to break it… 😉 )

Comment #17 July 21st, 2017 at 4:15 am

I agree with your criticism of the weird and vague ways the slogan “information is physical” is sometimes used.

> quantum information, which has its own internal logic that generalizes the logic of classical information

I’d like to learn more about that. Could you elaborate on that further, or provide me with a link, please?

Concerning the second part:

OK, within our models you can arrive at a maximum density of information. But our finest models are always necessarily inaccurate to some degree. So I boldly predict that our next model after quantum field theory will allow for a higher information density, and the next one a higher one still. I know about Bell’s theorem, and some ways around it, by the way. Gerard ’t Hooft makes these points well IMHO.

Even if we can’t measure some “wallflower bits” right now, their interactive contribution might be so small that it only shows up in the next model. We can often neglect wallflower bits, like, for example, the friction coefficient in classical mechanics when throwing a stone. The whole reason why entropy, or equivalently information, is relevant in physics is that more likely configurations are more likely to be realized (a tautology, I know, but true nonetheless). In the second law of thermodynamics we simply say that something is so unlikely that we can safely neglect it for our purposes (namely, a shift from an extremely likely state to an extremely unlikely one, i.e. a decrease of entropy).

What bits we associate with a physical system depends on our model. As you say: “log of the number of distinguishable states.” *Distinguishable* depends on the accuracy of our measurement devices. By the way, saying *indistinguishable in principle* is just saying *indistinguishable in our model*, and therefore adds nothing new.

Comment #12:

> added bits have nowhere to go except into more heat.

Please keep in mind that when you increase the number of microstates of a system without doing work on it, the temperature would actually *de*crease (since dQ = T dS).

@Ted Comment #15:

> The more information you have about a system, the more work you can extract from it.

*Very* interesting! Thank you! I think I agree. Actually, I just found that Wikipedia says pretty much the same: “Another way of phrasing Landauer’s principle is that if an observer loses information about a physical system, the observer loses the ability to extract work from that system.” But maybe I wouldn’t have understood that without your explanation. So, thanks again!

Comment #18 July 21st, 2017 at 4:16 am

Let me link to my Comment #62 on “THE TALK: My quantum computing cartoon with Zach Weinersmith,” about Lienhard Pagel’s book “Information ist Energie”. Short quote for context: “By information, the author means his definition of dynamic information, which is something like information(flow) per time. Higher energy is normally associated with shorter times.”

He doesn’t talk about space or general relativity, but he does try to spell out concrete meanings of his stated claim.

Comment #19 July 21st, 2017 at 4:24 am

Jair #14: Yep, that’s precisely the surprising thing about this holographic bound! You’d expect the maximum information content to scale with volume, but instead it scales only with surface area, because if you try to pack information too densely into a given volume, it collapses to form a black hole.
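To put rough numbers on this (a sketch assuming the standard form of the bound, N ≤ A / (4 ln 2 · l_p²) bits for a bounding surface of area A; the one-meter example is purely illustrative):

```python
# Illustrative numbers for the holographic bound, assuming the standard
# form N <= A / (4 * ln(2) * l_p^2) bits for a bounding surface of area A.
import math

PLANCK_LENGTH = 1.616255e-35  # meters (CODATA value)

def holographic_bits(radius_m: float) -> float:
    """Upper bound on the number of bits inside a sphere of the given radius."""
    area = 4.0 * math.pi * radius_m ** 2
    return area / (4.0 * math.log(2) * PLANCK_LENGTH ** 2)

# The bound scales with surface area (r^2), not volume (r^3):
print(f"1 m sphere: {holographic_bits(1.0):.2e} bits")
print(f"2 m sphere: {holographic_bits(2.0):.2e} bits")  # 4x, not 8x
```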

But crucially, keep in mind that the holographic bound is only an *upper* bound. Thus, if we took the surface area of your crumpled ball, that would indeed upper-bound the number of bits inside the crumple. It’s just that the surface area of a sphere surrounding the crumple would yield a tighter upper bound.

(An interesting side note: for systems other than black holes, the holographic bound isn’t even close to saturated! For things like hard disks built out of massive particles, the maximum number of bits grows only like the *radius* in Planck units, while for a soup of radiation, the maximum number of bits grows like the radius^{3/2}. See Bousso for details.)

Comment #20 July 21st, 2017 at 4:49 am

Charles #11: I don’t agree with the claim that in Bohmian mechanics, “which slit the photon went through” can in principle be known by an outside observer.

It’s true that, in Bohmian mechanics, there’s a hidden variable that encodes which slit the photon “really” goes through. And it’s true that we could imagine God hard-coding the value of that hidden variable into a piece of paper that’s in your hand.

But now suppose you tried to measure the photon, to check whether God had told you the truth. In that case, you’d again find the photon going through the left slit or the right slit with equal probabilities! For if you didn’t—if you always found the photon going through the slit that God wrote on the paper—then the quantum state you should’ve ascribed to the photon would not have been an equal superposition over the two slits, contrary to our original assumption.

The upshot is that the “trajectories” in Bohmian mechanics can *only* have metaphysical significance. As soon as you try to operationalize them, by imagining the trajectories are known to an outside observer, either you break the whole structure of quantum mechanics, or else you find that actual measurement outcomes are inconsistent with the supposed “knowledge.”

As for the charge of circularity, it would indeed be justified if we assumed that the only way to learn which slit the photon went through was to perform a Copenhagen-style collapsing measurement. But we don’t assume that. Instead, we can simply start from the assumption that there’s any other degree of freedom, anywhere else in the universe, that’s entangled with the photon’s position—how it became entangled, we don’t know or care—and then calculate the photon’s reduced density matrix, and we’ll see that the interference pattern is gone.
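The reduced-density-matrix calculation can be sketched numerically (a minimal two-qubit toy model of my own construction, not anything from the comment itself):

```python
# Minimal numerical check: as soon as ANY other degree of freedom is
# entangled with the photon's which-slit qubit, the photon's reduced
# density matrix loses its off-diagonal terms, and with them the
# interference pattern.  (Toy model, my own construction.)
import numpy as np

ket_L = np.array([1.0, 0.0])  # photon went through left slit
ket_R = np.array([0.0, 1.0])  # photon went through right slit

# Unentangled: photon in (|L> + |R>)/sqrt(2), environment untouched in |0>
psi_free = np.kron((ket_L + ket_R) / np.sqrt(2), ket_L)
# Entangled: (|L>|0> + |R>|1>)/sqrt(2) -- which-path info "recorded"
psi_rec = (np.kron(ket_L, ket_L) + np.kron(ket_R, ket_R)) / np.sqrt(2)

def reduced_density_matrix(psi):
    """Trace out the second (environment) qubit of a 2-qubit pure state."""
    m = psi.reshape(2, 2)
    return m @ m.conj().T

rho_free = reduced_density_matrix(psi_free)
rho_rec = reduced_density_matrix(psi_rec)
print("no record:", rho_free)  # off-diagonals 0.5 -> interference
print("recorded: ", rho_rec)   # off-diagonals 0.0 -> no interference
```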

Comment #21 July 21st, 2017 at 5:52 am

beautifulmathuncensored #17: See here for John Preskill’s intro to quantum information theory. Many other good surveys are just a Google search away.

The entire point of the argument I spelled out is that, if there *are* “wallflower bits” that violate the limits on information density, then either the finiteness of the speed of light, or special or general relativity, or E = hν—so, at any rate, *some* fundamental principle of physics that’s remained unchanged for more than a hundred years—is wrong. That’s not impossible, but it makes the stakes exceedingly high.

’t Hooft’s “way around” Bell’s Theorem is utter nonsense: it relies on a cosmic conspiracy in the initial state that’s a billion times worse (and more nonlocal!) than the disease it’s trying to cure. For more, see for example my American Scientist article or many past discussions on this blog.
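For concreteness, here is the standard CHSH arithmetic behind Bell’s Theorem (a sketch; the angles are the usual textbook choices, and the correlation function is the quantum prediction for the singlet state):

```python
# CHSH sketch: the singlet-state prediction E(a, b) = -cos(a - b) gives
# a CHSH value of 2*sqrt(2), above the bound of 2 that any local
# deterministic pattern behind the outcomes would have to satisfy.
import math

def correlation(a: float, b: float) -> float:
    """Quantum prediction for the singlet state, measurement angles a and b."""
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2           # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

chsh = abs(correlation(a1, b1) - correlation(a1, b2)
           + correlation(a2, b1) + correlation(a2, b2))
print(f"CHSH value: {chsh:.4f}  (local-deterministic bound: 2)")
```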

Comment #22 July 21st, 2017 at 6:18 am

Scott, you wrote

>> The criterion for where a physical particle lands on a physical screen can’t possibly depend on anything as airy as whether “information” got “recorded” or not.

and then did a great job on the “information” part, but you skipped over the “recorded” piece.

I think the reason is that this leads us straight into the measurement problem aka the interpretation problem.

What exactly does “recorded” (with its sense of irreversibility) mean in quantum theory?

Comment #23 July 21st, 2017 at 6:24 am

wolfgang #22: Formally, all I meant was, whether there’s any other degree of freedom that’s entangled with the photon position. As I said in #20, the “recording” doesn’t need to be irreversible, and there’s no restriction on how it can happen—so all the difficulties around the measurement problem simply aren’t relevant for this particular discussion.

Comment #24 July 21st, 2017 at 6:39 am

First of all thank you for taking the time to respond to my comment.

> quantum information, which has its own internal logic that generalizes the logic of classical information

Sorry, I noticed later that I misread that statement. I had heard that one already. After all, you can measure in a basis where you get deterministic results. I thought you were saying it was also possible to look at classical information as a generalization of quantum information. But all you can do is approximately simulate qubits, I guess.

On something you wrote in the American Scientist:

> Wolfram knew that such a model would imply that quantum mechanics was only approximate, so that (for example) quantum computers could never work.

IMHO quantum computers would still work as long as quantum mechanics is sufficiently accurate (as we know it is, by measurements). But we can’t know for sure if qubit measurements after a Hadamard transform will still look “truly random” from the point of view of a future theory.

It wouldn’t be the first time in history scholars declared they “have found it all”:

https://www.quora.com/Which-19th-century-physicist-famously-said-that-all-that-remained-to-be-done-in-physics-was-compute-effects-to-another-decimal-place

Comment #25 July 21st, 2017 at 7:25 am

beautifulmathuncensored #24: There’s not even a question about whether physicists have “found it all”; obviously they haven’t. Rather, the question is in which direction to look for the new things.

Regarding the randomness of quantum measurement outcomes, the great implication of the Bell inequality is that, *if* there’s any deterministic pattern behind it—known, unknown, computable, uncomputable, it doesn’t matter—then that pattern would have to be coordinated via faster-than-light signalling, which would mess up the whole causal structure of spacetime. So that’s the issue.

Comment #26 July 21st, 2017 at 7:31 am

Scott,

then I don’t understand your point …

>> start from the assumption that there’s any other degree of freedom, anywhere else in the universe, that’s entangled with the photon’s position—how it became entangled, we don’t know or care—and then calculate the photon’s reduced density matrix, and we’ll see that the interference pattern is gone.

But this is always the case: a photon is entangled with its source, initially the atom where it originated, and then this information is spread all over the laser as the atom collides with other atoms, etc.

Finally the photon is entangled with the whole world …

But as long as this information is not “recorded” and actually not “recordable” the photon can be used to generate an interference pattern.

Comment #27 July 21st, 2017 at 7:35 am

Reader297 #2– You can easily do GR without ever mentioning the energy-momentum tensor. Just plug in the actual expression for T_{μν} in terms of the other fields etc. of the theory. That would of course be incredibly inconvenient, which is the point.

Ted Sanders #15– Yes, this is the most straightforward argument for the physicality of information, and I was wrong not to mention it. Especially as there are now active experimental efforts that are making it happen.

http://www.nature.com/nphys/journal/v11/n2/full/nphys3230.html?foxtrotcallback=true

(Scott, will you be at David Wolpert’s Santa Fe workshop?)

Comment #28 July 21st, 2017 at 8:01 am

wolfgang #26: No, I only meant the single qubit of information corresponding to which slit the photon went through. If we observed maximal interference, then that information *can’t* have been entangled with anything else, unless the entanglement was destroyed before the photon hit the second screen.

Comment #29 July 21st, 2017 at 8:10 am

Sean #27: In that case, I’m confused as well! How would I say what I wanted to in this post—namely, that general relativity “eats everybody whose state changes over time”—without mentioning the general quality of having your state change over time under the action of the Hamiltonian of the world, which I understand from QM is called “energy”?

Alas, I’ll miss the Santa Fe workshop, because of a quantum supremacy workshop in Bristol, which I figured should have supreme priority! But I’m visiting NYU for most of the summer, so let me know if you pass through NYC.

Comment #30 July 21st, 2017 at 8:12 am

So here’s a quick argument that the argument for a bound on the amount of information you can store in a given (Riemannian) volume cannot be right. The amount of information stored on any Cauchy surface in GR is the same as the amount on any other, since the state of any Cauchy surface is entailed by the state of any other (+ the Einstein Field Equation). Fix a Cauchy surface, and mark out a circle on it (more exactly, mark out the intersection of the Cauchy surface with the future light cone of some event p in its past).

Now consider another Cauchy surface that coincides with the first everywhere outside the circle, but the part inside the circle is “pulled down” toward p, remaining spacelike but approaching the light cone. As that part approaches the light cone, its (Riemannian) volume approaches arbitrarily close to zero. But that part has to contain exactly the same information as the part inside the circle of the original Cauchy surface (since the information inside the circle + the information outside the circle = the same total information in both cases). So I can put a fixed amount of information in an arbitrarily small volume. QED

Comment #31 July 21st, 2017 at 8:16 am

“Circle” really means “sphere”, of course. I just repressed a spatial dimension.

Comment #32 July 21st, 2017 at 8:23 am

@Scott,

>> I only meant the single qubit

ok, I got it.

Btw I found this stackexchange about double slit with entangled photons quite helpful:

physics.stackexchange.com/questions/103359/double-double-slit-experiment

Comment #33 July 21st, 2017 at 8:31 am

It’s not too hard to distinguish between “time-evolution operator” and “the energy.” (Maybe especially in GR, where the total Hamiltonian including gravity is just a constraint.) But doing this would be silly, which is the point!

This is poking at a deeper ontological issue of when a concept becomes sufficiently important to a way of talking about the world that it becomes appropriate to say that it “exists.” But the basic idea isn’t too tricky: information deserves to be called physical because it can be used to influence stuff that we all agree is physical.

Comment #34 July 21st, 2017 at 8:33 am

Jair #14:

> I would think that a region contains at least as much information as a subset of the region, but a subset may have larger surface area.

Your crumpled region means curved spacetime, and therefore it has some amount of energy. Now you attempt to place it inside the larger region. But you can’t do this without changing the spacetime curvature of the larger region, since you just added the energy of the crumpled region.

If the subset region is already inside the larger region when you attempt to crumple it, then anything you do to the subset will necessarily affect the larger region too.

Comment #35 July 21st, 2017 at 8:36 am

This is a very nice argument, but along the lines of Jess’s comment, it’s important to note that it only gives an upper bound on the number of “wallflower bits”, and this upper bound is usually not tight (except maybe for black holes). There could be many species of particle we haven’t seen, new internal degrees of freedom of electrons, etc., that are just out of reach because they require too much energy to access or because of some symmetry breaking.

Comment #36 July 21st, 2017 at 8:40 am

Scott #29, the energy in QFT is not an observable, insofar as it’s not locally defined (if there’s an extra something happening 10¹⁰⁰ meters away, your local measurement of what the energy is is screwed). As an overlay of empiricist and platonist, I take QFT to be better done without mentioning energy at all, by focusing on local measurement and dynamics instead of taking a globally defined Lagrangian (inter)action to deform a free QFT.

I haven’t read all your post yet, but your inner Platonist seems pretty forcefully present in your comment that «the answer, predicted by the math and confirmed by experiment»; if one channels one’s empiricist heart, that seems not so much chicken-and-egg as cart before the horse.

Comment #37 July 21st, 2017 at 8:42 am

I never understood how arguments about information always seem to leave out the fact that information is actually subjective. This is why randomness is so difficult to define (cf. Kolmogorov complexity, etc.).

Information is a chain of mappings/isomorphisms between properties of different physical systems, eventually *always* reaching back to some pattern in our brain.

As an example, the body of a computer scientist and his computer are two physical systems whose interactions can be totally understood with the fundamental laws of particle physics; yet, at the conscious level of the computer scientist’s brain, there are concepts such as software, which seem to be almost causal in their nature, but which can’t be identified at all as a property of the hardware level (if I hand you some arbitrary piece of machinery, you can’t measure some absolute amount of “softwareness” that’s been “put into it”).

And when I write the very sentence “there are concepts such as software”, it all originates again from me, as a human with a subjective vision of the world.

A world without humans is a world without information?!

So it seems to me that the very concept of information can never be separated from the hard problem of consciousness.

Comment #38 July 21st, 2017 at 8:54 am

Sean #1

“But energy is a quality, not a substance”

But what is substance?

Just like “energy,” it is nothing more than our “knowing” of various relative/subjective properties of patterns in the data streams of our senses, i.e. information.

The assumption that the nature of an “electron” is different from the nature of the number “786” is arbitrary. Those two things are simply collections of mathematical relations to other such things, in a big circular way: we fundamentally know nothing more about an electron or the number 786 than what we can write about them in a dictionary.

The only things that don’t fit this picture are qualia, irreducible experiences of consciousness.

Comment #39 July 21st, 2017 at 8:56 am

Tim #30: Three quick replies.

1. The upper bound is actually on the number of bits in a region bounded by a given surface area, not a region with a given volume.

2. Having said that, I wonder whether you’re poking at some of the same issues that Bousso addressed in his landmark paper, where he avoided the known counterexamples by formulating the holographic entropy bound directly in terms of covariant objects (like lightcones) rather than spatial surfaces?

3. If you still disagree with Bousso’s covariant formulation of the bound, then I’d be very curious to know which of the 12 steps in my blog post is the (first) one that you think goes off the rails.

Comment #40 July 21st, 2017 at 8:58 am

Michael Musson #34:

Your crumpled region means curved spacetime therefore it has some amount of energy.

No, one could mathematically consider an arbitrarily crumpled region even of a flat spacetime.

Comment #41 July 21st, 2017 at 9:01 am

aram #35:

This is a very nice argument, but along the lines of Jess’s comment, it’s important to note that it only gives an upper bound on the number of “wallflower bits”, and this upper bound is usually not tight (except maybe for black holes).

Well, the argument about quantum interference shows that there can be *no* wallflower bits that record the which-path information; while the thermodynamic argument says that, if there are any wallflower bits in the hot water, then they’re already maximally scrambled and therefore don’t matter.

But yes, the holographic entropy bound gives only an extremely loose upper bound, with plenty of room for new particle species to be discovered.

Comment #42 July 21st, 2017 at 9:02 am

Fascinating and provocative!

In reinterpreting Landauer’s slogan, you may have also undermined part of his program. One of his formalisms for computing without dissipating energy involved a time-varying potential. Quoting from “Information Is Inevitably Physical” (1999):

Did Landauer ever consider the energy (and information!) embodied in that time-varying potential? I have not thoroughly searched his work to answer that question, but I suspect the answer is no.

Comment #43 July 21st, 2017 at 9:09 am

> “found it all”

Well, it kinda sounds like you are saying that, at least for microphysics, when you claim that quantum mechanics is exactly accurate.

Concerning Bell:

There are a ton of papers on this topic, and discussing them would go too far afield here, I think.

https://en.wikipedia.org/wiki/Loopholes_in_Bell_test_experiments

fred #37 :

> information is actually subjective

I was going to say that, too. Please still let me add what I already wrote:

Alice has different information about a system than Bob has.

(And as Ted Sanders pointed out, they will therefore be able to extract different amounts of work from it. (To avoid the peripheral problem of the system’s energy already being gone, I suggest thinking about it hypothetically, or imagining two identical systems.))

That is also true for probabilities (at least for epistemological ones). In a way, saying something is true with probability 0.8 means splitting the bit and introducing rational numbers into information theory.
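The “splitting the bit” idea corresponds to Shannon’s binary entropy, which assigns a fractional number of bits to a biased yes/no question (a small sketch; the gloss is mine, not the commenter’s):

```python
# "Splitting the bit": a yes/no question you'd answer "yes" with
# probability p carries H(p) bits of uncertainty, a real number between
# 0 and 1 rather than a whole bit.
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(f"p = 0.5 -> {binary_entropy(0.5):.3f} bits")  # a full bit
print(f"p = 0.8 -> {binary_entropy(0.8):.3f} bits")  # a 'split' bit
```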

Sean #27:

>http://www.nature.com/nphys/journal/v11/n2/full/nphys3230.html?foxtrotcallback=true

This sounds awesome! Man, I love this blog! ^^

Comment #44 July 21st, 2017 at 9:09 am

fred #37:

So it seems to me that the very concept of information can never be separated from the hard problem of consciousness.

If that were so, then how could there be all those textbooks of information theory, and electrical engineers quantifying the amounts of information flowing through communication networks, etc. etc., without ever making reference to any conscious entities?

The concept of information can be separated from the hard problem of consciousness in exactly the same way that the concepts of energy, heat, acidity, and prime numbers can be separated from the hard problem of consciousness.

Comment #45 July 21st, 2017 at 9:12 am

Brian #42: Thanks! I don’t know Landauer’s work well enough to answer your question, but hopefully someone else here does.

Comment #46 July 21st, 2017 at 9:20 am

beautifulmathuncensored #43: It’s possible that the framework of QM is exactly right, *but also* that the problems of grand unification, strong CP, the origin of neutrino masses, the explanation of the Standard Model parameters, dark matter, dark energy, and (of course) early-universe cosmology and quantum gravity point to centuries’ worth of new microphysics to be discovered.

Or, maybe quantum mechanics is wrong! But if so, no one today has any idea what to replace it with (certainly not ’t Hooft), there’s not the slightest hint from any existing experiment, and it would have to be one of the greatest revolutions in the history of physics.

Comment #47 July 21st, 2017 at 9:50 am

Scott’s essay inspires us to imagine how our notions of “information” and “entropy” would differ in a world that was non-relativistic, yet otherwise quantum-orthodox.

We specify a quantum universe in which the Bohr radius and the Rydberg energy are unaltered, while the gravitational constant G → 0 and the velocity of light c → ∞.

In this non-relativistic quantum world everything to do with electric fields works as always: this includes chemistry, biology, neurological cognition, metallic conductors, piezoelectric sensor/actuators, and chemical batteries. Even liquid-helium superfluids survive (including Josephson-physics at fluid weak-junctions). So non-relativistic physics laboratories are well-equipped.

On the other hand, in non-relativistic quantum universes, nothing that depends upon magnetic fields works at all … hence no photons, no magnets, and no electromagnetic motors. Ouch! Hence, to our gauge-relativistic eyes, the equipment in non-relativistic physics laboratories is guaranteed to look a little odd.

Also (and by deliberate construction) what’s common to both universes is that the causal, thermodynamic, and informatic tenets of the Church of the Larger Hilbert Space are scrupulously respected.

It’s fun to see the 20th century *Feynman Lectures* deftly anticipating multiple crucial themes of the 21st century’s emerging *yoga* (to borrow Grothendieck’s term) of quantum measurement, information, simulation, and computation. 🙂

Needless to say, many different answers to Exercise II can be given; here are three provocative suggestions:

Obviously, what’s more important than mere questions of “who’s right” versus “who’s wrong” in regard to the physical demonstrability (or not) of Quantum Supremacy, is the inarguable fact that, in affirming or refuting Quantum Theses 1-3, humanity will learn a lot more about quantum physics than we know now. This is because, no matter how Theses 1-3 are eventually assessed, humanity is pretty much guaranteed to acquire transformational new computational capacities.

Hence the 21st century’s new quantum capabilities are certain to be exciting … and even scary! 🙂

Comment #48 July 21st, 2017 at 9:52 am

Scott: “Quantum mechanics says that, no, you can verify that two particles are perfectly identical by doing an experiment where you swap them and see what happens. If the particles are identical in all respects, then you’ll see quantum interference between the swapped and un-swapped states. If they aren’t, you won’t.”

I’ve heard this claim that physicists KNOW that all electrons are identical because quantum mechanics before, but I don’t believe it.

Before getting to why, I’ll note that I’ve also heard the same claim being justified by classical thermodynamics, without invoking quantum mechanics, on the grounds that you get the right expression for the entropy of a system by assuming identical particles. But I’m very sure that claim is wrong, as should be anyone who’s written a molecular dynamics program. There is no change in the output of such a program (concerning observables) if you add extra stuff to the description of a particle that differs from one particle to another but has no effect on the dynamics. And there is only a tiny change if you make these extra properties of particles have only a tiny effect on the dynamics. Classical thermodynamics definitely doesn’t let you conclude that you’ve found out all the properties of particles. And indeed, I believe that ordinary atoms do indeed have lots of state (e.g., in the nucleus) that 19th-century physicists were ignorant of, and that is ignored by most molecular dynamics programs, since it has only a tiny effect (nuclear states of neighboring atoms interact very weakly).
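The molecular-dynamics point can be illustrated with a toy simulation (entirely my construction, not the commenter’s code): attach an inert “hidden” label to each particle, and nothing observable changes:

```python
# Toy illustration: give each particle an extra "hidden" label that has
# no effect on the forces, and the trajectories -- hence all observables
# computed from them -- are bit-for-bit unchanged.
import random

def simulate(n_steps, seed, tags=None):
    """10 particles in a 1-D harmonic trap (symplectic Euler steps).
    `tags` is deliberately carried along but never read by the dynamics."""
    rng = random.Random(seed)
    xs = [rng.uniform(-1, 1) for _ in range(10)]
    vs = [rng.uniform(-1, 1) for _ in range(10)]
    dt = 0.01
    for _ in range(n_steps):
        for i in range(10):
            a = -xs[i]          # harmonic force; ignores any tag
            vs[i] += a * dt
            xs[i] += vs[i] * dt
    return xs

plain = simulate(1000, seed=42)
tagged = simulate(1000, seed=42, tags=["spin-up", "spin-down"] * 5)
print(plain == tagged)  # True: identical observables despite extra state
```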

Now, to get to the quantum mechanics version of this claim, it amounts to saying that physicists can be sure that the dimensionality of the space the wave function lives in isn’t bigger than they think. But how can they know this? The fact that they get what appear to be the right predictions (to experimental accuracy) with dimension N doesn’t guarantee that a model with dimension larger than N wouldn’t also work (and perhaps work better for different or more accurate experiments).

In experimental terms, the electrons in the double-slit experiment will be “prepared” in a certain way. If this leads to the wave function being spread out in the “extra” dimensions in the same way, I don’t see why one wouldn’t still see interference. From the fact of this interference, I see no guarantee that preparing electrons in some other way, that leads to different electrons differing in the extra dimensions, couldn’t produce different results.
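The quantum side of this worry can be sketched in the same spirit (again a toy model of my own construction): interference survives exactly when the “extra dimension” amplitudes are the same for both paths, and washes out when the two paths differ there:

```python
# Toy model: if the "extra dimension" part of the wavefunction is the
# same for both slits, the interference term survives; if the two paths
# differ there, it is suppressed by the overlap of those states.
import numpy as np

def intensity(phase, extra_left, extra_right):
    """|psi_L + e^{i*phase} * psi_R|^2, summed over the extra dimension."""
    psi = extra_left + np.exp(1j * phase) * extra_right
    return float(np.sum(np.abs(psi) ** 2))

phases = np.linspace(0, 2 * np.pi, 9)
same = np.array([1.0, 0.0]) / np.sqrt(2)    # identical extra-dim state
diff_L = np.array([1.0, 0.0]) / np.sqrt(2)  # orthogonal extra-dim
diff_R = np.array([0.0, 1.0]) / np.sqrt(2)  # states for the two slits

fringes_same = [intensity(p, same, same) for p in phases]
fringes_diff = [intensity(p, diff_L, diff_R) for p in phases]
print("identical extra state:", [round(x, 2) for x in fringes_same])
print("differing extra state:", [round(x, 2) for x in fringes_diff])
```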

Comment #49 July 21st, 2017 at 9:54 am

Scott #44

“If that were so, then how could there be all those textbooks of information theory, and electrical engineers quantifying the amounts of information flowing through communication networks, etc. etc., without ever making reference to any conscious entities?”

That’s actually the very point I was making.

You refer to “textbooks,” and the entire body of our knowledge (physics, maths, …) has the property that it is nothing more than what we can write about it in textbooks, as a somewhat common denominator between big groups of humans. I say common denominator because it is the case that some of Kasparov’s intimate knowledge of chess, or some of Ramanujan’s intimate knowledge of mathematics, or AlphaGo’s intimate knowledge of Go, simply can’t be communicated to another human being through books or speech or any other means.

Textbooks, as a system, are all acting as a super dictionary, which is nothing but a circular series of definitions, i.e. symbols referring to one another in a big graph.

The only symbols that are fundamental and don’t refer to others are actually particular to “being human”: qualia of the human conscious experience.

So, when you reduce it, this big body of textbooks only makes sense to us, as humans.

All the physical systems we’ve built, through engineering do not require our interpretation of the symbols in the textbooks to function: as a whole, the workings of the system humans+textbooks+machines can be explained by the interactions of all the fundamental particles within it.

Our bestowing special status on “textbooks” is arbitrary/subjective… it only makes sense to us as conscious humans and is not that universal – the laws of nature are assumed universal, but the isomorphisms between the “physical” world and our brain aren’t absolute.

We could therefore run into a totally alien civilization, which we would recognize as a system that seems complex and dynamic enough (e.g. an ant colony, as a simple example), but we wouldn’t *necessarily* ever be able to split it into three “neat” sets, “creatures + information records (textbooks) + machines,” that would be meaningful to us (meaning/understanding == information). That is, we wouldn’t *necessarily* be able to find isomorphisms between patterns within this system and the patterns in our brain (because we’re not an ant or an ant colony or some arbitrary sub-system deep in there).

This doesn’t say that information isn’t a thing, simply that it’s very subjective/relative to our own nature.

Comment #50 July 21st, 2017 at 10:30 am

I actually always took the meat of “information is physical” to be a kind of conjecture about the viability of a particular research program: the idea was that all the deep resonances between information theory and QM/GR/QFT/etc. are evidence that we might be able to derive most of our physical theory from axioms that start with constraints on information. If laws that concern information determined the laws of physics, then there’s a certain sense in which the former are more fundamental than the latter. I suppose that’s another variant of “physics is informational,” but it does dovetail well with your latter discussion, because it’s a sort of explanation for why these things are so bound up together.

I spent a lot of time thinking about this in grad school, but I’ve sort of lost track of how it’s developed. I remember there were a number of toy-theories (e.g. from perimeter institute folks) that could be got from a few fairly sparse axioms and exhibited the most interesting features of quantum mechanics, but none that seemed to go all the way. I was always convinced that you could get most of QM with a kind of information exchange constraint that limited how much information different partitions of a system could have about each other’s state. But I never got the model working and had to go get a job.

If anyone’s familiar with any recent developments in that area, I’d love to hear about it.

Comment #51 July 21st, 2017 at 10:36 am

Another illustration of the relativity of information/knowledge is that our current engineering textbooks are just based on our current interpretation of the world.

In principle we can’t assume that the notions of energy, point particles,… will last any longer than older ideas on ether, or the world being made of water/earth/fire before that, etc.

It seems that every generation assumes their interpretations of the world are so sound that they forget those are merely isomorphisms between structures of the world and structures in their brain.

Maybe we’ll reach soon a limit to human understanding and no longer find new interpretations, but that doesn’t mean that some super AI won’t be able to find new meanings that we won’t be able to grope – information that we can’t “understand” won’t be useful to us.

Having quasi-omniscient oracles, able to prove complex mathematical theories or come up with complex algorithms beyond what humans can “understand” will give a new perspective on all this.

It’s already happening with AlphaGo, where the human world champion, upon defeat declared: “AlphaGo is a completely different player. It is like a god of a Go player.”.

Comment #52 July 21st, 2017 at 10:44 am

Scott,

one more stupid question, this time about your statement

>> such a field would at most be a comment in the source code of the universe: it could be as long as the Great Programmer wanted it to be, but would have no observable effect on those of us living inside the program’s execution

What about physical constants like the e.m. fine structure constant or the cosmological constant etc. ?

Unless they turn out to be rational numbers, they would in some sense contain an infinite amount of information, although the visible universe is finite (as far as we know).

Would you assume that they change (at least in precision) as the universe expands?

Comment #53 July 21st, 2017 at 10:45 am

Scott says (#45): “Brian (#42): Thanks! I don’t know Landauer’s work well enough to answer your question, but hopefully someone else here does.”

Here are three references that (partially) address Brian’s question. The overall yoga of these references amounts to: “If information is physical, then computation is Hamiltonian.”

The first reference is Charles Bennett’s 2003 survey “Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon”. Bennett provides an in-depth discussion of Rolf Landauer’s much-referenced 1989 article “Dissipation and noise immunity in computation, measurement, and communication”.

In turn, Landauer’s article references John von Neumann’s enabling disclosure — literally, in von Neumann’s posthumously-issued 1957 patent “Non-linear capacitance or inductance switching, amplifying and memory organs” — of a computational dynamics that is both formally Hamiltonian and technologically realizable. To the best of my knowledge, von Neumann’s patent is where the physical ideas of Landauer’s quote (in Brian’s comment #42) were first published.

These three fine works provide ample material for juicy 21st century theses, at both the undergraduate and graduate level.

Examples follow.

———

Q1: Would von Neumann’s Hamiltonian computational technology work at optical frequencies as contrasted with microwave frequencies? Yes.

Q2: Would the resulting math, physics, and engineering overlap substantially with BosonSampling math, physics, and engineering? Yes.

Q3: Would von Neumann’s ideas work in non-relativistic quantum universes (of my earlier comment) provided that electromagnetic modes were replaced by (e.g.) non-magnetic piezoelectromechanical modes? Yes.

Q4: Might this constellation of von Neumann-inspired ideas suggest further paths to scalable quantum computation and/or feasible quantum supremacy? Well heck, why not! 🙂

———

References follow; have fun with them! 🙂

@article{Bennett:2003,
  Author  = {Bennett, C. H.},
  Title   = {Notes on Landauer's principle, reversible computation, and Maxwell's Demon},
  Journal = {Studies in History and Philosophy of Science Part B - Studies in History and Philosophy of Modern Physics},
  Volume  = {34},
  Number  = {3},
  Pages   = {501--510},
  Year    = {2003}}

@article{Landauer:1989,
  Author  = {Landauer, Rolf},
  Title   = {Dissipation and noise immunity in computation, measurement, and communication},
  Journal = {Journal of Statistical Physics},
  Volume  = {54},
  Number  = {5-6},
  Pages   = {1509--1517},
  Year    = {1989}}

@misc{Neumann:1957,
  Author       = {John von Neumann},
  Title        = {Non-linear capacitance or inductance switching, amplifying and memory organs},
  Howpublished = {US Patent 2815488},
  Year         = {1957}}

Comment #54 July 21st, 2017 at 11:06 am

In the section about the Bekenstein bound you talk about fields that don’t interact with the rest of the universe. But if we’re going to allow information like that (that can’t be accessed from our part of the universe) then I claim that every quantum system already has an infinite information capacity.

After all, a qubit has an infinite number of possible pure states, so it takes an infinite number of bits to specify. The only reason we don’t usually treat them like this is that some ensembles aren’t distinguishable: for example, a 50/50 mix of |0⟩ and |1⟩ has the same density matrix as a 50/50 mix of |+⟩ and |−⟩. But if we’re allowing “non-accessible” information then surely this already counts.
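That indistinguishability is easy to check numerically. A minimal numpy sketch (my own illustration, not from the comment) confirming that the two 50/50 ensembles have exactly the same density matrix:

```python
import numpy as np

# Computational-basis states and the |+>, |-> states.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

def mix(*states):
    """Density matrix of an equal mixture of the given pure states."""
    return sum(np.outer(s, s.conj()) for s in states) / len(states)

rho_01 = mix(ket0, ket1)   # 50/50 mixture of |0> and |1>
rho_pm = mix(plus, minus)  # 50/50 mixture of |+> and |->

# Both equal the maximally mixed state I/2, so no measurement whatsoever
# can tell the two preparation procedures apart.
assert np.allclose(rho_01, rho_pm)
assert np.allclose(rho_01, np.eye(2) / 2)
```

Since the density matrix is all that any measurement statistics can depend on, the “infinitely many bits” used to specify the pure states in the ensemble are exactly the kind of non-accessible information at issue.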

Comment #55 July 21st, 2017 at 11:13 am

fred: Do you also consider the concept of energy to be subjective and observer-relative—since (for example) different inertial observers will ascribe different kinetic energies to the same system? My point, and I guess Sean’s also, was that regardless of its relative aspects, information (in the form of, say, log Hilbert space dimension and von Neumann entropy) is just as hard to avoid as energy in discussions of fundamental physics. And just like with energy, clearly there are countless important physical questions involving information that have definite right answers: e.g., in this double-slit experiment, did the which-path information leak out into the environment or not? (Here the definition of “question with a definite right answer” is, something you could put on an undergrad physics exam. 😀 )
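Scott’s contrast with energy can be made concrete: kinetic energy differs between observers, but the von Neumann entropy of a state is invariant under any unitary change of basis. A small numpy sketch (my own illustration, not from the comment):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_i p_i log2 p_i over the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]            # discard zero eigenvalues (0 log 0 = 0)
    return float(-(p * np.log2(p)).sum())

rho = np.diag([0.9, 0.1]).astype(complex)   # a slightly mixed qubit

# A random unitary change of basis -- a different "description" of the
# same state, loosely analogous to switching observers.
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
U, _ = np.linalg.qr(A)          # Q factor of a complex matrix is unitary

S1 = von_neumann_entropy(rho)
S2 = von_neumann_entropy(U @ rho @ U.conj().T)
assert np.isclose(S1, S2)       # entropy is basis-independent
```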

Comment #56 July 21st, 2017 at 11:22 am

wolfgang #52: In quantum mechanics, dimensionless constants like the fine-structure constant show up in expressions for the amplitude (and hence the probability) with which a process occurs. They’re not explicitly stored anywhere, in such a way that you could read out their digits at will.

So for example, to oversimplify grossly, maybe one could produce a state like α|0⟩ + √(1−α²)|1⟩, α being the fine-structure constant. But if so, we’d consider that to be just one qubit, not infinitely many classical bits. Note that, if you want to learn more and more digits of α, you’ll have to do so slowly and laboriously, by preparing more and more copies of the above state (or something like it) and measuring them.
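To make the “slowly and laboriously” concrete, here is a toy simulation (my own illustration, treating α as a known amplitude) of estimating α by preparing and measuring many copies of such a state in the computational basis:

```python
import numpy as np

alpha = 0.0072973525693      # fine-structure constant, used as the amplitude
rng = np.random.default_rng(42)

def estimate_alpha(n_copies):
    """Measure n_copies of the state alpha|0> + sqrt(1-alpha^2)|1> in the
    computational basis; estimate alpha from the fraction of 0 outcomes."""
    zeros = rng.binomial(n_copies, alpha ** 2)   # P(outcome 0) = alpha^2
    return np.sqrt(zeros / n_copies)

# The statistical error shrinks only like 1/sqrt(n_copies), so each
# additional decimal digit of alpha costs ~100x more measurements --
# exponential effort in the number of digits learned.
for n in (10**4, 10**6, 10**8):
    print(n, abs(estimate_alpha(n) - alpha))
```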

Comment #57 July 21st, 2017 at 11:27 am

Oscar #54: That’s correct, there’s no limit to how much information there is that we can’t access! 😉

The tricky part is to distinguish, within each specific physical theory (GR, QFT, etc.), between what’s empirically accessible and what isn’t.

Comment #58 July 21st, 2017 at 11:30 am

Scott #16: Here is a longer argument along the lines of #15 connecting information and extractable free energy (esp. sec 5):

http://www.damtp.cam.ac.uk/user/tong/statphys/jaynes.pdf

Comment #59 July 21st, 2017 at 12:16 pm

@Scott #44

The subjectivity in information in electrical engineering is in the fact that we arbitrarily define which voltage we interpret as 1 (typically more than 5V) and which as 0.

Furthermore, the amount of information flowing through information networks depends on the measurement accuracy of the receiver. For example, you could take the (classical) polarization of a light wave as your signal. Now, depending on whether you distinguish 2 or 4 different polarization directions, the information carried in one time step is 1 or 2 bits.

But I also think we can tackle these problems without solving the mystery of consciousness.

(I wanted to stop after my last comment but I’m getting addicted ^^.)

Scott #54

Only after you have defined a correspondence between 1s, 0s and some physical quantity, you can ask “questions with definite right answers”. (Let’s take “definite right” to be defined as 5 sigma certainty.)

Comment #60 July 21st, 2017 at 12:19 pm

Scott,

>> They’re not explicitly stored anywhere, in such a way that you could read out their digits at will.

unfortunately again I don’t understand your point.

We can measure alpha and other dimensionless constants to good precision and I would think every experiment a physicist can do to determine alpha etc. is in a sense a quantum experiment.

Of course that precision is limited (perhaps ultimately by the Bekenstein bound, which a physicist cannot avoid in a laboratory of finite surface area), but my question is if those constants have well-defined values (with infinitely many digits) nevertheless or if in your point of view they had to be ill-defined e.g. at the big bang when the radius of the universe was small.

Comment #61 July 21st, 2017 at 12:34 pm

bmo #59:

Only after you have defined a correspondence between 1s, 0s and some physical quantity, you can ask “questions with definite right answers”.

Well, obviously. If your professor turns out to mean 0 by “1” and 1 by “0”, then you might not do great on the final exam. That’s exactly why we fix language conventions: the better to get at the objective reality of our shared universe.

Comment #62 July 21st, 2017 at 12:39 pm

Thanks for a great post, it illuminates quantum “weirdness” in an extremely clear fashion.

One point that you make though, that any degrees of freedom which influence physical systems are tensor factors in the Hilbert space, is only known by experiment, and might not be exactly true after all. For instance, in the Schrödinger–Newton equation, even if a particle is in a superposition of location states, its gravitational influence is not in superposition. Rather, |Ψ|² acts as a source term for the gravitational field.

Since the force of gravity is so weak, we can’t exclude this theory, even though it’s hard to make sense of, contains no gravitons, etc.

I don’t know of a similar theory that adds non-quantum information to some QFT (the equation above is non-relativistic), but if it’s possible to create one then it’s no longer a logical truth that information behaves in the way QM leads us to believe.

Comment #63 July 21st, 2017 at 12:40 pm

wolfgang #60: (sigh) Let me try one more time. Every experiment can be modeled as involving at most a finite number of qubits, and its outcome gives you at most a finite number of bits about α. So at no time is the holographic entropy bound, properly understood, ever violated.

Note that naïvely, you might hope that you could learn n bits of α using ~n experiments. But because the task is vaguely like estimating the unknown bias of a coin by flipping it over and over, you unfortunately need more like exp(n) experiments. Indeed, that’s the reason why α⁻¹ is “only” known, at this time, to 12 significant figures, rather than (say) 12 billion.

Comment #64 July 21st, 2017 at 12:49 pm

Scott #54

It often all devolves into a question of definition… which is not surprising, given the circular nature of all human concepts!

On one hand, there is the amount of symbols/bits one can send through a noisy channel.

On the other hand, there’s the interpretations of the actual bits being sent through the said channel, which is all in the eye of the beholder.

You would agree that the amount of information in any particular bit sequence isn’t some absolute, given your own posts on the difficulty of quantifying randomness (which is the flip side of information).

I still maintain that it all always comes back eventually to the problem of consciousness.

Given the belief in some reality that’s independent of our consciousness, there are many possible isomorphisms between patterns of data coming from that reality and patterns inside our finite brains. Some isomorphisms/abstractions/models are more “successful” than others – the ones that give our ape brain a bump in chances of survival.

A “chair” is a very useful concept to us humans (a sort of invariant), and we can quantify it pretty accurately, we even have entire books dedicated to it… but that doesn’t mean that physical reality needs to embed the concept of “chair” within its fundamental laws to proceed.

The paradox is – why do we feel so strongly about the concept of a “chair” if our own brain is of the same nature as the outside reality, i.e. big blobs of atoms that do not need consciousness to just “proceed”? (the existence of textbooks is merely an extension of our brain structures).

I feel it’s really important to recognize the inherent limitations of our human nature – given the finiteness of our ape brain, can it still “understand” arbitrary complex relationships that are “out there”? (and given that our experience of the world is always indirect, our brain would have to first come up with arbitrarily complex ways to extract data from the world).

Asking “Is information physical?” is really another way to ask “Is our universe mathematical in nature?” and “Can emergent (concepts) be causal”, or inquiring about “The nature of the mathematical problems that are solvable by humans”.

Comment #65 July 21st, 2017 at 12:52 pm

Radford Neal #48: No, there’s no assumption needed about the Hilbert space not having a larger dimension than you think it does. I meant what I said: that assuming the basic framework of QM, the experiment tells you whether the two electrons are identical, full stop.
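The controlled-SWAP interference experiment spelled out in the rest of this comment is straightforward to check numerically. A numpy sketch (my own illustration), modeling the two cases as an antisymmetric joint state versus a secretly-labeled product state:

```python
import numpy as np

def p_minus(joint_state, d):
    """Probability of finding the control qubit in (|0>-|1>)/sqrt(2) after
    a controlled-SWAP interference experiment on a two-particle joint state
    in a d x d Hilbert space: equals (1 - Re<Psi|SWAP|Psi>) / 2."""
    swap = np.zeros((d * d, d * d))
    for i in range(d):
        for j in range(d):
            swap[j * d + i, i * d + j] = 1.0   # |i>|j> -> |j>|i>
    overlap = np.vdot(joint_state, swap @ joint_state)
    return (1 - overlap.real) / 2

d = 4
a, b = np.eye(d)[0], np.eye(d)[1]

# Identical fermions: the joint state is antisymmetric, so SWAP|Psi> = -|Psi>.
fermions = (np.kron(a, b) - np.kron(b, a)) / np.sqrt(2)
assert np.isclose(p_minus(fermions, d), 1.0)  # |0>-|1> observed with certainty

# Secretly labeled particles: |Psi> and SWAP|Psi> are orthogonal.
labeled = np.kron(a, b)
assert np.isclose(p_minus(labeled, d), 0.5)   # both outcomes equally likely
```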

It’s like this: let |e₁⟩|e₂⟩ be our two adjacent electrons. We prepare a control qubit in the state (|0⟩+|1⟩)/√2. Then, conditioned on the control qubit being |1⟩, we swap the electrons, to produce the state |e₂⟩|e₁⟩. Finally we measure the control qubit in the {|0⟩+|1⟩, |0⟩−|1⟩} basis to see whether interference has occurred.

If e₁ and e₂ are identical in all respects, then the rules for identical fermions imply that |e₂⟩|e₁⟩ = −|e₁⟩|e₂⟩. Therefore the control qubit is left in the (|0⟩−|1⟩)/√2 state, and that outcome is observed with certainty.

If, on the other hand, e₁ and e₂ have secret labels that perfectly distinguish them—doesn’t matter in which Hilbert space dimensions, known or unknown—then |e₁⟩|e₂⟩ and |e₂⟩|e₁⟩ are orthogonal. Therefore the measurement of the control qubit yields both outcomes, |0⟩+|1⟩ and |0⟩−|1⟩, with equal probabilities.

I completely agree that the analogous claim about classical thermodynamics is wrong,

unless quantum mechanics somehow gets smuggled in the back door (e.g., if you do the counting using Bose statistics).

Comment #66 July 21st, 2017 at 12:53 pm

@Scott

>> sigh

the reason I am puzzled is because you seem to make a distinction between e.g. pi and alpha.

In your Platonic world pi exists with infinitely many digits, but not alpha.

I guess this difference disappears as soon as e.g. a clever string theorist would find a way to calculate alpha ?

Notice, that without some clever Greek mathematicians we might still be approximating pi empirically as the ratio of measured circumference and measured diameter of some circles drawn in the sand (with whatever precision the Bekenstein bound allows 8-).

Comment #67 July 21st, 2017 at 1:04 pm

wolfgang #64: I don’t know whether α even makes sense to infinitely many digits (note that it’s “merely” the low-energy limit of a running coupling constant, whose more fundamental behavior presumably occurs at extremely high energies).

If it does make sense, I don’t know whether there’s any explicit formula for α (say, an infinite series), or any such formula that will ever be known to humankind.

For π, of course, there are many such formulas, which is one obvious difference between it and α. (Note also that the first n digits of π can be computed in time that’s nearly linear in n, compared to the exp(n) experimental effort that’s needed to learn the first n digits of α by any known method.)

In any case, no violation of Bekenstein no matter how you slice it. 😀
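The computational contrast Scott mentions is easy to demonstrate: the first n digits of π really are cheap to compute. A self-contained sketch (my own illustration) using Machin’s formula with plain integer arithmetic:

```python
def arctan_inv(x, scale):
    """floor(arctan(1/x) * scale) via the alternating Taylor series
    arctan(1/x) = 1/x - 1/(3x^3) + 1/(5x^5) - ..., in integer arithmetic."""
    power, total, n, sign = scale // x, 0, 1, 1
    while power:
        total += sign * (power // n)
        power //= x * x
        n += 2
        sign = -sign
    return total

def pi_digits(d):
    """pi to d decimal digits, as a string, via Machin's formula
    pi/4 = 4*arctan(1/5) - arctan(1/239)."""
    scale = 10 ** (d + 10)   # 10 guard digits absorb truncation error
    pi = 4 * (4 * arctan_inv(5, scale) - arctan_inv(239, scale))
    s = str(pi // 10 ** 10)  # drop the guard digits
    return s[0] + "." + s[1:]

print(pi_digits(50))
```

(Machin’s formula is quadratic-time per digit count as written; the nearly linear-time methods Scott alludes to use fancier algorithms, but the point stands that each further digit of π, unlike α, costs only modest computation.)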

Comment #68 July 21st, 2017 at 1:09 pm

By putting the emphasis on humans, one could say that information is definitely physical because the physical world is spontaneously organizing itself into complex structures that argue endlessly about the nature of information 😛

(last post, I promise).

Comment #69 July 21st, 2017 at 1:10 pm

Amir #62: Yes, if quantum mechanics were only approximate, then much of what I said in this post is probably wrong, along with the whole edifice of modern physics. See my comment #46.

Comment #70 July 21st, 2017 at 1:32 pm

Scott #65: I’d agree that the two electrons in the experiment showing interference were identical (actually, I think it’s enough for them to be nearly identical, given finite experimental accuracy). But I think people want to claim something much stronger – that all electrons are identical, not just the ones in the experiments done so far. To show that, you need to show that interference would continue to be seen if the electrons were prepared in new ways not tried before. I don’t see how that is guaranteed – other than by making a circular assumption.

Comment #71 July 21st, 2017 at 1:40 pm

wolfgang #64,

Note also that if α turned out to be some easily describable number based on fundamental physics reasons, then the information content in finding additional digits would be essentially zero. And if in fact some string theory argument arose that made it really easy to empirically find the digits of α that would be a very strong reason to suspect that it has some simple representation.

Comment #72 July 21st, 2017 at 1:45 pm

Scott’s crucial idea is physically explicit in the mises en pratique of the Système international d’unités, and this same idea is mathematically explicit in the notion of an affine space.

Consider for example the thermodynamical notion of a local spatial density of a globally conserved quantity — most commonly energy, charge, and mass. Conserved densities take values in affine spaces (the physical sense is that adding a constant to a conserved density yields another, equally valid, conserved density).

One implication of this affine property is that thermodynamic potentials too are affine (for example, physically speaking, there is no natural zero of voltage). The local entropy density, regarded as a function of local densities of conserved quantities, is doubly affine, in that the entropy function is defined only up to its hessian.

Nonetheless, our physical intuitions in regard to these quantities are irresistibly strong. Every physicist just “knows” that it is wrong to add an energy density to a charge density — the reason is that these two densities have different “physical” units. But it is not so easy to explain to a mathematician why the addition of incompatible units is mathematically wrong.

Hence arises Vladimir Arnold’s celebrated complaint (in his review “Contact geometry: the geometrical method of Gibbs’s thermodynamics”, 1990):

No existing quantum textbook (that is known to me) systematically conciliates physical notions of universality and naturality with mathematical notions of universality and naturality. Yet such a conciliation definitely is feasible, and even (as it seems to me) necessary to progress in resolving the great open questions of quantum information science.
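The point about incompatible units can at least be made operational in code. A toy sketch (my own illustration, not from the comment) of unit-tagged densities whose addition is defined only within a single unit:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Density:
    """A toy quantity tagged with units; addition requires matching units."""
    value: float
    units: str  # e.g. "J/m^3" (energy density) or "C/m^3" (charge density)

    def __add__(self, other):
        if self.units != other.units:
            raise TypeError(f"cannot add {self.units} to {other.units}")
        return Density(self.value + other.value, self.units)

energy = Density(3.0, "J/m^3")
charge = Density(1.5, "C/m^3")

print((energy + Density(1.0, "J/m^3")).value)  # addition within a unit is fine
try:
    energy + charge                            # physically meaningless
except TypeError as e:
    print(e)
```

Of course this only enforces the physicist’s intuition mechanically; it doesn’t answer Arnold’s complaint about why the prohibition is mathematically natural.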

A lively account of the historical evolution of clashing notions of “universality” and “naturality” is Hasok Chang’s textbook Inventing Temperature (Oxford University Press, 2004). The jacket includes this review:

Chang’s book surveys chiefly the classical thermodynamic notion of “temperature”. How much more work, and how much further confusion, will be associated to quantum and/or relativistic and/or cosmological notions of thermodynamics in general, and temperature in particular?

Plenty of work, and plenty of confusion too, are in store for us all, that’s for sure! And great rewards too. 🙂

Comment #73 July 21st, 2017 at 1:47 pm

Radford #70: OK thanks; now I understand where you get off the train.

I guess my background assumption was that, after you’ve tried this with enough pairs of electrons, you appeal to exactly the same induction principle by which you pass from “no duck anybody has ever seen has known calculus” to “no duck knows calculus.” At any rate, your uncertainty now just comes down to the ordinary Problem of Induction, and not the possibility of some secret distinguishing property among the electrons you’ve seen.

Or, better still: supposing that a new electron-like particle were discovered, which didn’t belong to the enormous equivalence class of knowably-identical particles comprising all the electrons we had seen, we simply wouldn’t call the new particle an “electron.” We’d give it a new name—like, I dunno, “muon,” or “tau.” 😉

Comment #74 July 21st, 2017 at 2:06 pm

Scott #71: No, I think there might be secret distinguishing properties of particles that we’re currently calling electrons. People do various double-slit experiments, in which they typically prepare the electrons the same way in any one experiment, but presumably not quite the same ways in different experiments. What guarantee is there that if you took an electron prepared as in experiment A, and tried to have it interfere with an electron prepared as in experiment B, that you would get interference?

Of course, I think it’s likely that you would, since there’s no indication so far that you wouldn’t. But that’s a sort of Occam’s razor induction (not the simple induction that if you did the same experiment again, you’d likely get the same result).

Naturally, if it was found that there were actually two kinds of particles that are currently going by the name “electron”, it’s likely that one or two new names would be found, but that’s not of any physical significance.

Comment #75 July 21st, 2017 at 2:13 pm

Scott, I really enjoy this post. It does seem to illuminate some fundamental properties of physics. I also like Ted’s remark about thermodynamics #15.

Your draft proof of the entropy bound seems incomplete to me though. You conclude that if a physical object stores more information than the bound, then the object must be a black hole. So what then? What’s so wrong about black holes, and why can’t we just use a black hole to store an arbitrary amount of information in a tight space?

Comment #76 July 21st, 2017 at 4:31 pm

I have often wished this site had a Donations button; never more than today.

(I know, not gonna happen, but I can still wish.)

I am tempted to sum up the essay by saying you have shown that, according to the scientific method and science as we know it, there is no omniscient and omnipotent god. (Which is not as grandiose as it may sound, since many have reached the same conclusion albeit without as much scientific grounding.) (I know that wasn’t the main point of the essay.)

One commenter seems to be hung up on the notion that information is what he reads in a newspaper while experiencing his hard consciousness. Only insofar as that relates to the physical state of his neurons is that “information” in the sense meant here, I think.

I’ll claim that as Windows manages a PC’s inputs and outputs it also experiences a state of consciousness (one which, similar to humans’, often gets discombobulated) – a primitive one, not identical to ours, to be sure. Prove me wrong, and then I’ll worry about the hard problem of consciousness.

(Okay, it is a hard problem because of the complex interplay of emotions and logic and other things, but so is trying to understand a huge computer program without knowing the assembly language.)

Comment #77 July 21st, 2017 at 4:48 pm

Hey scott. I want to ask a more personal question of a bit of your topological logic.

are you more for extending/fixing the p-np categories.

as we have yet to see your masterwork (from a phil standpoint)

or is your complexity theory thinking about the VMA transformation. which is analog or digital?

its a simple question, but i read some of your papers.

and you are an interesting computer science.

Comment #78 July 21st, 2017 at 5:14 pm

Erik #77: Every time I thought I almost understood your question, it slipped from my grasp. For starters, what’s the “VMA transformation”? When I Google it, I get something about Ariana Grande.

Comment #79 July 21st, 2017 at 5:24 pm

JimV #76:

I have often wished this site had a Donations button; never more than today.

Happy to email you my bank info if you insist on wiring me a donation… 😉

I am tempted to sum up the essay by saying you have shown that, according to the scientific method and science as we know it, there is no omniscient and omnipotent god.

I assume a believer would respond by simply saying either that God isn’t bound by quantum mechanics and isn’t a tensor factor in Hilbert space, or—more audaciously—that knowing the quantum state of the photon already counts as “omniscience,” for example because all possible measurement outcomes will occur in the Divine Mind (a fusion of MWI and theism?). I could go either of those ways were I a believer.

Comment #80 July 21st, 2017 at 5:38 pm

jonas #75:

Your draft proof of the entropy bound seems incomplete to me though. You conclude that if a physical object stores more information than the bound, then the object must be a black hole. So what then? What’s so wrong about black holes, and why can’t we just use a black hole to store an arbitrary amount of information in a tight space?

Because the entropy of a black hole with a given radius is known and finite, by Bekenstein and Hawking’s work from the 1970s.

To spell it out in a bit more detail: by an argument very similar to the one from this post, if you try to stuff more bits into a black hole, you’ll also necessarily be adding more energy to it. But by the no-hair theorem of classical GR, all the energy of a black hole is reflected in its geometry, so the mass and radius of the hole (which are proportional to each other) will have to grow to accommodate the new bits. And when you do the calculation, you find that this growth is such that the holographic entropy bound is still satisfied—and in fact, precisely saturated.

Another way to look at it is that, as an external observer, the only access you have to the vast majority of the bits in a black hole is via waiting a long time and scooping up its Hawking radiation. OK, but you can upper-bound the entropy in the Hawking radiation using QFT arguments. By the Second Law, this then implies an upper bound on the entropy that could have been present in the original black hole (I believe this upper bound will be loose by a constant factor, but that’s fine for present purposes).
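For a sense of scale, the Bekenstein–Hawking entropy S = A/4 (in Planck units) is easy to evaluate numerically. A rough sketch (my own illustration, with approximate SI constants):

```python
import math

# Approximate physical constants (SI units)
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
hbar = 1.055e-34   # J s
l_p2 = hbar * G / c**3   # Planck length squared, ~2.6e-70 m^2

def bh_entropy_bits(mass_kg):
    """Bekenstein-Hawking entropy A / (4 l_p^2 ln 2), in bits."""
    R = 2 * G * mass_kg / c**2      # Schwarzschild radius (proportional to M)
    A = 4 * math.pi * R**2          # horizon area
    return A / (4 * l_p2 * math.log(2))

M_sun = 1.989e30
S1, S2 = bh_entropy_bits(M_sun), bh_entropy_bits(2 * M_sun)
print(f"{S1:.1e} bits")             # ~1.5e77 bits for a solar-mass black hole
assert math.isclose(S2 / S1, 4.0)   # doubling the mass quadruples the entropy
```

The last assertion is the growth Scott describes: since R is proportional to M, stuffing in more mass-energy grows the area, and with it the entropy bound, quadratically.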

Comment #81 July 21st, 2017 at 6:13 pm

All induction involves Occam’s Razor; induction to repeats of the same experiment doesn’t avoid it.

Comment #82 July 21st, 2017 at 7:09 pm

Sniffnoy #81: Yes, in a way even predicting that the same experiment done again will yield the same result involves some sort of Occam’s Razor, but it’s at a lower level of complexity.

The point of the argument purporting to show quantum mechanics guarantees that all electrons are the same is to avoid just appealing to “it’s simpler to assume that”. To look at Scott’s analogy about ducks, we think that there aren’t any ducks that know calculus because we haven’t encountered any such ducks. But maybe some people would like more certainty than that gives – they may try to argue theoretically that it’s impossible for any animal to (1) float, (2) fly, and (3) understand calculus – that at most you can do two of those, but not all three. If this were a valid argument, it would provide more certainty that there are no ducks that know calculus.

Similarly, if the argument that quantum mechanics guarantees that all electrons are the same were valid, it would provide more certainty that there are not really two kinds of electrons than the fact that so far there is no evidence of there being two kinds. But I don’t think it works.

Comment #83 July 21st, 2017 at 7:27 pm

Scott, I’m one of those people who thinks your greatest intellectual contribution to the world is your essay “Who Can Name the Biggest Number?”. With that in mind can you clarify what you mean when you say that “one couldn’t fit its digits inside the observable universe”?

Comment #84 July 21st, 2017 at 7:45 pm

Radford Neal #82: But you said you do agree that two specific electrons can be proven to be identical via an interference experiment?

If so, then that’s already something that’s impossible classically, and that a priori sounds like a logical absurdity regardless of the details of physics. It would’ve sufficed for the purposes of my post.

Comment #85 July 21st, 2017 at 7:50 pm

Haribo Freak #83: Sure. Because of the dark energy, discovered in 1998, we can only receive signals from at most ~20 billion light years away or something—galaxies farther than that are receding from us faster than the speed of light. (That is, assuming the dark energy corresponds to a cosmological constant Λ, something that all observations are currently consistent with.)

So then you simply apply the holographic entropy bound, as discussed in this blog post, to the region consisting of our entire observable universe. And the upper bound you get is that our observable universe can contain at most about 10^122 bits of information. So an integer that took 10^125 bits to write down wouldn’t fit.

For further details see this paper by Bousso.
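Plugging numbers into the same area bound recovers the quoted figure. A rough sketch (my own illustration; the ~16-billion-light-year de Sitter horizon radius is an assumed ballpark consistent with the measured cosmological constant):

```python
import math

G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # approximate SI constants
l_p2 = hbar * G / c**3                        # Planck length squared

R = 16e9 * 9.461e15            # assumed horizon radius: ~16 Gly, in meters
A = 4 * math.pi * R**2         # horizon area
bits = A / (4 * l_p2 * math.log(2))

print(f"{bits:.1e}")           # on the order of 10^122 bits
assert 1e122 < bits < 1e123
```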

Comment #86 July 21st, 2017 at 8:47 pm

Of course God doesn’t need to obey the laws of

Comment #87 July 21st, 2017 at 8:52 pm

Robin #86: I feel like a major theological insight just got cut off midsentence! 😀

(Though I suppose the sentence could be defended for essentially any way of completing it, except “math” or “logic” or equivalent.)

Comment #88 July 21st, 2017 at 10:51 pm

[…] ?p=3327 by Scott Aaronson – Is information physical a contentful expression? Why ‘physics is information’ is tautological. A proposed definition. Double slit experiment. Observation in Quantum Mechanics. Information takes up a minimum amount of space. Entropy. Information has nowhere to go. […]

Comment #89 July 21st, 2017 at 11:04 pm

The first two seem very different from the third. The first two are physical principles, deduced from experiment, but thermodynamics is almost purely math. The very specific ways that we measure temperature are physical, but that’s just a detail.

Comment #90 July 21st, 2017 at 11:13 pm

Scott #85

About that 20 billion year limit…

The black hole information paradox deals with the problem of information disappearing as it falls into a black hole. But what about the information disappearing off the other end of the ever faster expanding universe? Doesn’t that create a problem for QM as well?

The proposed black hole firewall is supposed to be caused by the fact that the current Hawking radiation must be entangled with past Hawking radiation. But what happens if much of that past Hawking radiation has passed beyond the edge of the observable universe? Does the loss of the information off the observable edge break the entanglement and cool the firewall? If not, then you have a place to hide information where it cannot be detected and yet it still has an effect on black hole firewalls.

Comment #91 July 21st, 2017 at 11:14 pm

Re: identical particles

Here is an example of a property of electrons that might be discovered in the future. String theory says that we live in a universe with more than 4 dimensions. Those additional dimensions are additional parameters needed to describe the electron. The fact that all observed electrons are identical shows that they are equally smeared across those dimensions. But string theory says that it is possible to prepare electrons to be distinct in those dimensions.

Comment #92 July 22nd, 2017 at 1:40 am

As a non-physicist, non-mathematician, amateur philosopher of mind, I want to thank you, Scott, for providing an explanation that I could sorta follow. I just wanted to provide a couple of observations to whom it may concern.

I see two different “kinds” of information referenced in the discussions. The first is the “Information Theory” kind, which is essentially concerned with quantity of information without regard to what the information is about. This seems to be what was referenced in your original post.

In #15, Ted introduces a different “kind” (actually, a subset of the first kind), and that is information with specific meaning or aboutness. When this information is organized for a purpose, say, for the purpose of extracting work, we can call this information “knowledge”.

It is my contention that the fundamental basis of consciousness is the “processing” of information, which takes the form:

Input -> [agent] -> output,

wherein the input is a finite set of physically measurable data (so, information) which when presented to the “agent” generates the output while remaining approx. unchanged and capable of repeating the process.

In this model, there are two places where information has physical importance. The first is the input. While the specific data points included in the set are usually a subset of all the possible data points available in the system being presented to the agent (thus representing a poetically natural level of abstraction, ahem), your discussion provides (I think) the size of the maximal possible set of data via the Bekenstein bound.
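The Bekenstein bound mentioned above is easy to evaluate numerically. Here is a minimal sketch, using the standard form S ≤ 2πkRE/(ħc) and dividing by k·ln 2 to convert entropy to bits; the 1-kg, 10-cm sphere at the end is just an illustrative choice of system.

```python
import math

# Physical constants (SI units, CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def bekenstein_bound_bits(radius_m, energy_J):
    """Maximum information (in bits) storable in a system of given radius
    and energy, via the Bekenstein bound S <= 2*pi*k*R*E/(hbar*c).
    Dividing by k*ln(2) converts thermodynamic entropy into bits."""
    return 2 * math.pi * radius_m * energy_J / (hbar * c * math.log(2))

# Illustrative example: a 1-kg sphere of radius 10 cm, using E = m*c^2
bits = bekenstein_bound_bits(0.1, 1.0 * c**2)   # on the order of 10^42 bits
```

The striking feature, relevant to the comment above, is that the bound scales with the system's radius and energy, not with the number of "kinds" of data one might wish to store.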

The second place where information is physically important is internal to the agent. This is what I call knowledge. In order for the agent to process the input information in a valuable way (like extracting work), the agent must have a physical form which includes or is derived from information “intended” for a “purpose”.

Just my observations. Thanks again for the post.

*

Comment #93 July 22nd, 2017 at 2:14 am

Douglas #89: For thermodynamics, I guess the main inputs from fundamental physics are these:

(1) The microscopic laws are time-reversible.

(2) The dynamics are rich enough that the system you care about can be assumed to have already equilibrated.

(3) Energy is conserved (so, your system should be modeled as having filled out all the microstates at a given energy).

(4) A crucial one, which I wasn’t explicit about in the main post: The system you care about has a finite and nonnegative specific heat.

Assumption (4) means that, as we add energy to the system, at least some of that energy is going to go into making the molecules move around faster, rather than into other degrees of freedom like rotations and vibrations.

(So, in particular, we’re assuming the truth of the atomic theory of matter, as well as the kinetic theory of heat!)

Assumption (4) will typically be true in practice, simply because of equipartition, and because a given molecule only has finitely many degrees of freedom available to it, and translational motion is one of them (or three of them, actually, in a 3-dimensional universe).

But there can be cases where assumption (4) is false. For example, a system undergoing a phase transition (e.g., boiling water) temporarily has infinite specific heat: adding more energy doesn’t change its temperature at all. Weirder still, a black hole has negative specific heat: adding more energy makes it colder.

Anyway, though, with assumptions (1)-(4) together, we can deduce that as you pump more bits of information into your system, you’ll necessarily raise its temperature.
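A closely related quantitative statement (not part of the list above, but in the same spirit) is Landauer's principle: erasing a bit at temperature T dissipates at least kT·ln 2 of energy. A minimal sketch:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_energy(temp_K, n_bits=1):
    """Minimum energy (in joules) required to erase n_bits of information
    at temperature temp_K, per Landauer's principle: E = n * k_B * T * ln 2."""
    return n_bits * k_B * temp_K * math.log(2)

# Erasing one bit at room temperature (300 K) costs about 3e-21 J:
E = landauer_energy(300)
```

This is the converse direction of the same bookkeeping: information pumped in raises temperature, and information erased costs energy.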

In summary, I also like to imagine thermodynamics as basically just applied information and probability theory—i.e., stuff that can be logically deduced with almost zero physical input. But if you want to connect entropy to the concepts of energy, temperature, and heat, you do need at least the physical input above: physics is reversible, energy is conserved, equilibrium is reached, temperature measures average molecular speed, and molecular motion is one among only finitely many entropy sinks available (i.e., specific heat is finite).

Comment #94 July 22nd, 2017 at 7:04 am

Didn’t Feynman and Wheeler toy with the idea that all electrons are the same because there’s actually only one single electron in the entire universe (looping around in spacetime)? But it fell apart because it wouldn’t explain the discrepancy between the number of electrons and positrons we see (a positron being an electron moving backwards in time).

Comment #95 July 22nd, 2017 at 8:17 am

Scott,

You cover a lot of territory here, and in an impressively detailed way.

Here are my notes on the topic, some of which, I am afraid, may perhaps be seen as somewhat vacuous—even if, in my honest opinion, they are not. (BTW, I here also allude to Sean Carroll’s and similar responses, but in a rather indirect way.)

I will restrict myself to the following scope: taking a thermodynamic perspective on the physical universe, then separately taking an information-theoretic perspective on the same, and then relating the two.

Thermodynamics basically begins by focusing attention on a finite part of the physical universe called a “system” (whether it’s defined as a control volume or as a control “mass”), studying the interaction of the system with the rest of the universe, called the “surroundings,” using some suitable physical quantities of interest (e.g. pressure, volume, temperature, work, energy, heat, entropy, etc.), and then abstracting the system-and-surroundings to cover greater and greater parts of the universe, until in the limiting case we obtain the isolated system that covers the entire universe.

Information theory basically begins by focusing on the workings of a man-made object (or a “machine”) that is the computer, and characterizing its workings and capabilities through the abstraction of information.

To relate the two perspectives, you have to see the universe as a computer; the reverse is not permissible, for the simple reason that any kind of man-made object always is just a part of the universe taken as a whole. There is another aspect worth noting: there is no counterpart to the idea of surroundings in the information-theoretic perspective. The information-theoretic computer already has been assumed to be the equivalent of what would be called an isolated system in thermodynamics.

Clearly, then, an information theoretic perspective, even if cast on the physical universe as a whole, must necessarily remain a more abstract way of looking at the universe than is the thermodynamic perspective.

Actually, this is a matter of two concepts subsuming the same physical facts, but from two different conceptual perspectives—with one of the perspectives (the physical or the thermodynamical) being more complete and more fundamental than the other (the information-theoretic).

Having said that, and thinking about the relation between the two perspectives, the idea I now come to toy around is the following:

The concept “information” in the IT perspective plays the same role as the concept “state” does in the thermodynamic (or more broadly, physical) perspective.

A state is nothing but a minimal set of mutually independent state variables that completely describe the state of the system. (Yes, there is a repetition here, because the concept of state is at the most fundamental level, viz. that of postulates.)

When a state of a system changes, so does its total information content, and vice versa: whenever the total information content of a physically existing system can at all be said to have changed, there must be a corresponding change in its physical state.

Still, for the above-mentioned reasons, the concept of state is more “primitive” (i.e. fundamental) as compared to the concept of information. A computer is always supposed to be a self-sufficient description; there is no counter-part to surroundings in it.

My two cents.

Best,

–Ajit

Comment #96 July 22nd, 2017 at 8:20 am

One more thing (I refer to my comment just submitted above).

We must respect the scope of information theory. In the broadest scope, it also subsumes the analog computer, and I am not very clear whether a discussion couched in terms of bits would cover the entire scope or not—it seems restricted to the digital computer alone.

Best,

–Ajit

Comment #97 July 22nd, 2017 at 8:34 am

A student-friendly follow-on reading to Douglas #89 and Scott #93 is the recent free-as-in-freedom article by Marius Krumm, Howard Barnum, Jonathan Barrett, and Markus Müller, titled “Thermodynamics and the structure of quantum theory” (IOP, 2017).

Along lines broadly similar to Scott’s #93, these authors argue the following four traits (here brutally condensed) are sufficient for thermodynamic entropy to be well-defined:

(1) mixed states are indistinguishable from ensembles of pure states,

(2) measurements can be implemented as semi-permeable membranes,

(3) states can be reversibly transformed into one another,

(4) entropy is a continuous function of state parameters.

What’s directly relevant to this Shtetl-Optimized discussion is that thermodynamic entropy is derived from these four traits by a proof-strategy that explicitly demonstrates that “information is physical”, in the sense that the above four traits suffice to physically realize a von Neumann “ω-gas”. In a nutshell, by appropriate and explicitly physical Carnot-cycle manipulations of the ω-gas, the von Neumann/Shannon expression for the quantum informatic entropy is derived from the classical ideal-gas laws, and the classical entropy of an ideal gas. Thus we extract (entropic) “bit” from (physical) “it”.

This “information is physical” ω-gas construction of the general expression for quantum entropy first appeared in von Neumann’s 1932 Mathematische Grundlagen der Quantenmechanik. What’s nice about the Krumm et al. article is the discussion they provide that relates von Neumann’s reasoning to subsequent work in information theory, general relativity, etc. Hence the details of von Neumann’s ω-gas theory are well worth study by students of quantum entropy.

———

On a related note:

It’s fun to consider the evidence that Scott’s thesis may be, not merely true, but even a Great Truth!

Applying Bohr’s maxim to Scott’s thesis generates its Great Truth dual:

A crucial question associated to the validity and utility of this dual Great Truth is whether thermodynamics “just works” on low-dimension varietal state-spaces. Here the notoriously challenging high-energy infinities and singularities of quantum field theory and general relativity are absent; in their place appear the algebraic singularities that are so notoriously challenging in algebraic geometry and dynamics.

Hence, mathematically at least, there’s a pretty reasonable balance-of-fascination between quantum supremacy and quantum skepticism.

Another indication that quantum supremacism and quantum skepticism are Great Truth duals, is that the notion of “entropy” is comparably natural, and comparably challenging, in quantum field theory and in algebraic geometry and dynamics.

Perhaps we can look forward to a natural conciliation of quantum supremacism and quantum skepticism. To borrow a phrase from Hemingway: “Isn’t it pretty to think so?”

Comment #98 July 22nd, 2017 at 9:57 am

I have never considered the argument that a god can just “conjure up” new fundamental fields with 10^300 new degrees of freedom and violate the holographic entropy bound. However, now that you present that argument, I am not convinced that you succeeded in refuting it. After all, imagine if there were a very large number of low-mass scalar fields ϕ_0, ϕ_1,…, ϕ_(2^(10^500)). Then in any small region of space it is possible to put just one of these particles ϕ_i, and this particle would hold 10^500 bits of information from the value of i. None of the other fields would contribute any energy, since their corresponding particle is not present and their vacuum energy is normalized to zero. The most plausible answer that I could think of is that although the average energy of each of these fields is zero, each has a spatially varying energy distribution which interacts gravitationally with the others and somehow prevents this situation from being possible. But that would imply that there is an upper bound on the number of fundamental scalar fields a theory could have. What is this bound, and why haven’t I heard of it before?

Also, since Sean Carroll was comparing information with energy as things we consider to be physical even though they aren’t directly present in the laws of physics, I want to present another example: color. The story scientists tell laypeople is that light is the electromagnetic field varying in a sort of sine wave, and color comes from its wavelength. But except in rare situations, such as light emitted from a laser, the electromagnetic field doesn’t actually look like a sine wave but varies in a jumbled mess. However, the Fourier transform is such a useful tool for calculating the behavior of a linear, translationally-invariant system that we reify the Fourier components of the electromagnetic field as “light of a given wavelength,” even though this cannot be directly seen from the electromagnetic field and has to be calculated mathematically from it.
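The reification described above can be sketched numerically: a superposition of two sinusoids looks like a jumble in the time domain, yet its Fourier transform recovers the two component frequencies cleanly. The sample rate and the two frequencies (50 and 120 Hz) below are arbitrary illustrative choices.

```python
import numpy as np

# A "jumbled" signal: the sum of two sinusoids at 50 Hz and 120 Hz
fs = 1000                      # sample rate, Hz
t = np.arange(0, 1, 1 / fs)    # 1 second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# The Fourier transform "reifies" the components: two sharp spectral peaks
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
peaks = freqs[np.argsort(spectrum)[-2:]]   # the two strongest components
```

The raw waveform never looks like either sine wave, but the spectrum picks out exactly the two "colors" that went into it.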

Comment #99 July 22nd, 2017 at 11:07 am

Itai #98: Thanks! I recall that an upper bound on the total number of particle species is precisely one of the things that people worry about in this context. But I can’t remember what the resolution of the issue was. If anyone could remind / enlighten us, I’ll be happy to update the post to include that.

Comment #100 July 22nd, 2017 at 11:48 am

Since ‘information’ is physical, and since “casting processes in terms of information makes certain physical processes much more clear” (S.Carroll), why is ‘information’ a dimensionless quantity? Should we create a “new” dimension?

Comment #101 July 22nd, 2017 at 12:18 pm

Could we say that the study of “information” is the study of physical processes which have the characteristic that they produce certain observables but could have produced a larger set of possible observables? Scott says that “‘information’ is just a name we give to whatever picks out one element from a set of possibilities,” but isn’t that ‘whatever’ always some kind of physical process?

If we characterize the study of information this way, then it becomes a subset of physics which studies physical processes as abstract categories, without worrying so much about the mechanisms that make them work (allowing comparison of similarities of ‘information’ transmitted by air or wire by focusing on certain properties of those physical processes). You could then also recharacterize the study of computation in the same way.

I think one utility that comes from thinking about it this way is that statements in computation or information theory become statements about physical processes that share those properties. You could also say that the ‘whatever’ is simply anything which is both causal and probabilistic, and because physical processes are causal and probabilistic we can talk about information being physical, but maybe it doesn’t really give us any new understanding of physics or information.

Comment #102 July 22nd, 2017 at 3:27 pm

I’m a little confused by step 8.

> In general relativity, anything that carries energy couples to the gravitational field.

Is it possible that different things couple to the gravitational field by different amounts? Say the curvature is given, not by the stress-energy tensor times Newton’s constant, but by that plus some small correction?

I suspect the answer is no, but I don’t have a complete understanding of why. The reason is that, in Newtonian gravity, it is possible for different substances to have different gravitational strength (because the same equations describe the electrical force), but this requires them to also be affected different amounts by gravity (by Newton’s third law). In general relativity, everything is affected the same amount because it is just the curvature of spacetime. But what is the analogue of Newton’s third law for general relativity?

The answer can’t be too simple, as the mathematical structure of the theory, if it specifies that every object pulls on spacetime the same amount, does not specify what this amount is – it’s an empirically measured parameter.

Comment #103 July 22nd, 2017 at 4:25 pm

Scott: Something seems to have gone wrong at the beginning of this post?

Comment #104 July 22nd, 2017 at 5:37 pm

Sniffnoy #103: Thanks!! Fixed.

Comment #105 July 22nd, 2017 at 7:22 pm

Existing experiments and existing quantum field theory agree that the answer is “no” to very high precision. For details, see (for example) Matthew Francis’ Nova article “Does Antimatter Fall Up or Down?” (2014).

On the other hand, it is perfectly reasonable to imagine that in the next few years, in-progress experiments may unequivocally show that the trajectories of trapped neutral antihydrogen molecules (for example) depart from Newtonian predictions.

And suppose too, that in the next two years, advLIGO unequivocally observes gravitational radiation, from some distant cosmic source, arriving simultaneously with optical radiation (implying that photons and gravitons alike are zero-mass particles).

These two (hypothetical) near-term experimental observations, considered jointly, would directly conflict with a general prediction of quantum field theory — a prediction due to Feynman — that the sole consistent field theory of massless spin-2 particles (gravitons) has general relativity as its classical limit. For the theoretical details, see one of Feynman’s lesser-known collections, Lectures on Gravitation (1971).

In a nutshell, if trapped antimolecules are observed to fall “up”, or to depart in any observable way from the predictions of classical Newtonian gravity, then our present understanding of the low-energy, small-coupling limit of gravitational quantum field theory is entirely wrong. This would be exciting, needless to say! 🙂

Comment #106 July 22nd, 2017 at 9:42 pm

Sean–

Two follow-ups to your response about energy in GR not being necessary.

First, would you say that electric charge is just as arbitrary? If mass-energy is not a necessary concept for GR, then electric charge isn’t a necessary concept for electromagnetism, as they play analogous roles as sources for both theories.

You may well say that’s fine — that charge is likewise just a useful concept but not a fundamental part of physics. And it’s true that we can write T_mu,nu on the right-hand side of the Einstein field equation directly in terms of fields.

But the special combination of fields and their derivatives is just the energy of the fields, whatever we choose to call it.

This is very different from the way that energy is used in Newtonian mechanics. We never have to write down (1/2)mv^2 + V(x) for a Newtonian system if we don’t wish to. We can always just use Newton’s second law.

But in GR, is there really a way to proceed without using the particular combination (1/2)mv^2 + V(x) (or, more properly, its relativistic generalizations for particles and fields)? And if we have to use that special combination, then aren’t we using energy in a fundamental way whether we decide to use the name “energy” for it or not?

Comment #107 July 22nd, 2017 at 10:19 pm

> But now suppose you tried to measure the photon, to check whether God had told you the truth. In that case, you’d again find the photon going through the left slit or the right slit with equal probabilities!

As I understand it, in Bohmian mechanics, which slit the particle goes through is deterministic – set, if you like, by the initial configuration of particles in the universe. What would happen, on an individual run, is that you would always see the particle coming out of the slit written on the piece of paper. Each run you are handed a different piece of paper. Even though on each run the paper would be correct, if you averaged over many runs you would see an equal probability of left and right slits, matching the Born rule.

More importantly, you could (in principle) have that slip of paper and still see the interference pattern. If you simulate BM on a computer, for example, you have that slip of paper and can follow the individual trajectory, and produce the same probability distributions as regular QM.

> Instead, we can simply start from the assumption that there’s any other degree of freedom, anywhere else in the universe, that’s entangled with the photon’s position—how it became entangled, we don’t know or care—and then calculate the photon’s reduced density matrix, and we’ll see that the interference pattern is gone.

While I (obviously) totally agree that entangled systems are experimentally distinguishable from those in a pure state, I worry that even talking about systems being entangled contains the assumption that both are physical systems. As I understand it, an idea like idealism, or God, is that they’re “deeper”: more akin to the non-local hidden variables in BM than to a large physical object out there waiting to be discovered in state space.

Comment #108 July 22nd, 2017 at 11:36 pm

Scott # 39

The argument certainly has gone off the rails by step 5, or in the inference from step 5 to step 7. The spatial variation in a field in a given Lorentz frame gives you no clue to its temporal variation in that frame: it is not as if these quantities are connected by c. Suppose you know that there is a complex scalar field (single-particle spinless wave function) spatially periodic with wavelength lambda in a Lorentz frame. You want to know its temporal frequency in that frame in order to figure out its energy. Well, you are out of luck: c alone does not hold the answer. This is obvious for a particle with momentum zero in that frame: lambda is infinite and the phase is constant. But your zero-momentum particle can have any energy you like: it depends on the rest mass (via E = mc^2). So this scheme for converting spatial variation into energy, as stated, fails. The rest of the proof collapses.

Comment #109 July 23rd, 2017 at 1:25 am

So there’s one thing that strikes me as really off here, and that’s that the parts about temperature keep talking about molecules moving around. But the thermodynamic notion of temperature is rather different. How does the argument change when that is used instead? I suspect the assumptions required for it become a bit different.

Comment #110 July 23rd, 2017 at 3:43 am

Fantastic post. Thanks. Scott, a question, one thing I wasn’t clear about. You speak of your “inner Platonist” and how “the physical world we observe comprises only a tiny sliver of mathematical possibility-space”. Yet according to Landauer – if I’ve understood him correctly – mathematics itself is meaningful _only_ if it is the product of real computational processes. Are you suggesting that we should believe in the existence of two radically different kinds of information, i.e. physical information that takes up a minimum amount of space and also disembodied mathematical information that exists in abstract mathematical space? Or am I interpreting your “inner Platonist” too literal-mindedly? (if so, apologies!)

Comment #111 July 23rd, 2017 at 8:21 am

David Pearce #110: Well, one kind of information is a subset of the other kind. I would say this:

Firstly, it certainly makes sense to discuss “information” within a purely mathematical context. E.g., if you flip a coin 10^{50000} times, each flip having independent heads-probability p, there’s a clear meaning to saying that the outcomes contain H(p)·10^{50000} bits of information. And that’s completely independent of physics, in the same sense as any math.

But secondly, one kind of information that’s particularly interesting to talk about is the kind that’s contained in our mathematical description of the state of the physical world. A priori, one might think that some of that information has observable effects, and some of it doesn’t, and the amount that doesn’t could just be added to arbitrarily.

The surprising thing is that, because of specific features of our best physical theories (GR, QM, QFT, …), it’s actually pretty hard to add information into our mathematical description of the state of the physical world, without it having specific observable effects. And that’s what I proposed we should mean by saying “information is physical.”
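The H(p) in Scott's coin-flip example is the binary entropy function, which is pure math with no physical input. A minimal sketch:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p): the information content, in bits,
    of one coin flip with heads-probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A fair coin carries a full bit per flip; a biased one carries less:
h_fair = binary_entropy(0.5)    # 1.0
h_biased = binary_entropy(0.1)  # ~0.469
```

N independent flips then contain H(p)·N bits, exactly as in the comment's 10^{50000}-flip example.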

Comment #112 July 23rd, 2017 at 9:58 am

@ Tim #108, Scott asked me to respond. Both parts of your comment are wrong:

1) The equation of motion for a relativistic scalar field with mass m (for simplicity assuming it is free) is

(d/dt)^2 phi=c^2 grad^2 phi-(m^2 c^4)/hbar^2 phi

Obviously the time derivative on the left-hand side is changed (in a way that involves c) if we change the spatial gradient on the right-hand side. In Fourier space, phi = e^{ikx - i omega t} is a solution only if omega = sqrt(k^2 c^2 + m^2 c^4/hbar^2). So changing the position dependence (k) changes the time dependence (omega).

2) In your discussion of the particle, you forgot about the uncertainty principle. The location of a particle with zero momentum is maximally uncertain.

If you are confused about why quantum mechanics is important in debunking argument 2) but not argument 1), it is because the relationship between fields (in argument 1) and particles (in argument 2) involves hbar.
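Daniel's dispersion relation in point 1) is easy to check numerically. A minimal sketch, where the electron mass is an illustrative choice (the comment leaves the field mass m unspecified) and the wavenumbers are arbitrary:

```python
import math

c = 2.99792458e8        # speed of light, m/s
hbar = 1.054571817e-34  # reduced Planck constant, J*s

def omega(k, m):
    """Relativistic dispersion relation: omega = sqrt(k^2 c^2 + m^2 c^4 / hbar^2)."""
    return math.sqrt(k**2 * c**2 + m**2 * c**4 / hbar**2)

m_e = 9.1093837015e-31  # electron mass, kg (illustrative choice)

# Changing the spatial dependence (k) changes the temporal dependence (omega):
w1 = omega(1e10, m_e)
w2 = omega(2e10, m_e)

# And at k = 0 the frequency doesn't vanish: it equals the rest-energy
# frequency m*c^2/hbar, which is Tim's zero-momentum point in #108.
w0 = omega(0.0, m_e)
```

Note that the sketch is consistent with both sides of the exchange: omega does depend on k (Daniel's point 1), but the k = 0 value is set by the mass, which spatial variation alone does not reveal.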

Comment #113 July 23rd, 2017 at 12:07 pm

Daniel,

Sorry, but that’s not right. The suggestion was that from the spatial variation of a field (in a Lorentz frame) alone one could somehow determine its temporal variation (and hence energy), and the example of the zero momentum eigenstate is a straight refutation. Of course the position of a particle with zero momentum is maximally uncertain. I didn’t forget that: it is just neither here nor there. There is still a definite energy associated with the particle, and you can’t tell what it is by its spatial variation, because there isn’t any. What you are forgetting is that a boost is implemented by a Lorentz transformation, not by “changing the spatial gradient”, as it would be in a Galilean transformation. Of course, if you already know what the mass m of the particle is, then you know about its rest energy via E = mc^2. But you didn’t get that information from the spatial variation.

Comment #114 July 23rd, 2017 at 2:25 pm

The claim is that localizing information costs energy, not that there is zero energy if it isn’t localized. In quantum field theory we can always change the zero of energy by adjusting the cosmological constant. Without gravity it is only energy differences which are physical.

Comment #115 July 23rd, 2017 at 5:24 pm

Let’s start with the assumption that space-time is emergent and that entanglement is somehow responsible. This brings up the question of why every electron is exactly the same. As a general rule, we could say that two objects that are in every sense identical, separated in space, have separate identities, but two objects that are in every sense identical, separated in time, are usually considered to have the same identity. It should be obvious to anyone paying attention that information and meaning are not exactly the same thing. Information may appear in a digital form, but meaning never does.

Comment #116 July 23rd, 2017 at 5:57 pm

Daniel,

If that is a reply to me, I can’t see its relevance to my remarks.

Comment #117 July 23rd, 2017 at 6:06 pm

Scott, is this an ok place for stupid complexity questions unrelated to the main post? I’m wondering if a language can be in P but not in NP. Example:

Let L be the language “there exists a pair of twin primes greater than N” where N is in binary. L is at worst linear-time since if the twin prime conjecture is true, L is always “yes”; while if it’s false, L is “yes” up to some specific M and “no” after that, so there’s an algorithm that decides L for any N by comparing N with M (we can’t identify the specific algorithm because we don’t know M, but it exists, even if its existence is not provable).

On the other hand, given N, there might be primes (p,p+2) greater than N and their primality can be proved with the AKS algorithm, but the minimal such pair might be exp(exp(N)) or something like that, so L is conceivably not in NP.

Am I messing up some definitions and/or missing something dumb? Thanks.

Comment #118 July 23rd, 2017 at 6:19 pm

asdf #117: No, P is a subset of NP, and your L in particular is in NP. Your mistake is to imagine that the witness has to consist of a twin prime pair. It doesn’t: in fact, for your L, the witness can be the empty string. Then the NP “verifier” just ignores the witness, and does whatever the P algorithm does.
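Scott's point can be sketched as toy code. The twin-prime decider's constant M is unknowable, so the stand-in language below is trivially "n is even"; the structure (a verifier that ignores its witness and runs the P algorithm) is what matters.

```python
def poly_time_decider(n):
    """A stand-in polynomial-time algorithm for some language L in P
    (here, trivially: 'n is even')."""
    return n % 2 == 0

def np_verifier(n, witness):
    """An NP verifier for the same L. It simply ignores the witness and
    runs the P algorithm. Completeness: if n is in L, the empty witness
    is accepted. Soundness: if n is not in L, no witness is accepted."""
    return poly_time_decider(n)

ok = np_verifier(10, "")          # True: 10 is in L, the empty witness works
bad = np_verifier(7, "anything")  # False: no witness makes 7 accepted
```

This is exactly why P ⊆ NP: the witness carries no burden when the verifier can already decide membership on its own.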

Comment #119 July 23rd, 2017 at 6:46 pm

I thought the NP verifier had to actually prove that the instance was in L? Say TPC is true, so the instance is always in L. TPC might be unprovable–then what?

Comment #120 July 23rd, 2017 at 6:50 pm

Edit: s/”actually prove”/”examine a proof”/, sorry.

Comment #121 July 23rd, 2017 at 6:55 pm

Sorry about all these followups, I wish there was a way to edit posts. I thought that NP was supposed to be a subset of EXP. It could be that TPC is true, yet there is not even an EXPTIME algorithm for proving membership of a given N in L. I’m really missing something here, if L is supposed to be in NP.

Comment #122 July 23rd, 2017 at 7:09 pm

asdf: The completeness/soundness requirement says that if an input is in L, then there must exist a witness causing the verifier to accept, while if the input is not in L, then no witness must cause the verifier to accept. That’s it. That’s the entire defining property of NP. There’s no requirement that there exist a meta-proof that the verifier behaves the way you want it to—exactly like, in order for a language to be in P, there merely has to exist a polynomial-time algorithm; there’s no requirement that there exist a proof that the algorithm is correct (or that it runs in polynomial time).

Or to say it another way: it would be perfectly sensible to say that a language was in NP, but not provably in NP.

Comment #123 July 23rd, 2017 at 7:28 pm

Aha, thanks.

Comment #124 July 24th, 2017 at 12:23 am

Tim #113, Daniel #114, and Scott (in general):

Tim #113 says:

>> “The suggestion was that from the spatial variation of a field (in a Lorentz frame) alone one could somehow determine its temporal variation (and hence energy), and the example of the zero momentum eigenstate is a straight refutation.”

Very well put.

I think it’s possible to think about the basic issue involved using simpler terms too, viz., those of classical mechanics.

Suppose that there is a static pattern of a spatial wave on a patch of ground, say as of the sand on a beach or in a desert. Since the pattern is static, there is no kinetic energy associated with this variation. For convenience, also assign a zero potential energy to this, initial, state of the system.

Assume that the system description is complete right here, and thus, that there is no interaction it has with the rest of the universe. It’s then obvious that no matter how much time elapses, the system state would remain the same; the spatial variation has produced no variation with time in this case.

Now let’s look again at what Scott actually says in his point no. 2:

>> “Anything in the physical world that varies in space—say, a field that encodes different bits of information at different locations—also varies in time, from the perspective of an observer who moves through the field at a constant speed.”

That’s right, but notice that in saying so, he has indirectly changed the very definition of the system itself. The system description now also includes an observer that is moving (in a frame fixed to the earth). To introduce this new element of a moving observer is to introduce a new, non-zero, net energy into the system description. It’s this energy which may then perhaps be viewed via another, suitable, frame and captured using a suitable relation (valid in that frame) in which a time-variation implies a net non-zero energy. The non-zero energy of the system still has not arisen because of the static waveform (static, in the earth-frame); it is because of the interaction of the moving observer with the field (which is static in the earth-frame).

—

Another way to approach this whole issue is via the usual X(x)T(t) ansatz employed in the separation of variables. Just because X(x) varies in space (say it’s a simple sinusoid) does not mean anything about the T(t) variation. T(t) need not even be a harmonic oscillation; it’s free to be any arbitrary function. … It’s just that if it’s not a harmonic oscillation, it would not be solving a wave equation. OK, so what? … Even if you think of applying Fourier, the fact remains that the specific form of T(t) is still arbitrary—and completely independent of X(x).
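[Ed.: the separation-of-variables point can be written compactly; this is the standard textbook form, stating nothing beyond the comment’s claim.]

```latex
% Substituting u(x,t) = X(x)T(t) into the wave equation u_{tt} = c^2 u_{xx}
% and dividing by XT gives
\frac{T''(t)}{c^2\,T(t)} \;=\; \frac{X''(x)}{X(x)} \;=\; -k^2 ,
% so T(t) is harmonic only because the wave equation couples the two factors.
% Absent a dynamical equation, a product X(x)T(t) with sinusoidal X and
% arbitrary T is perfectly consistent: spatial variation alone fixes nothing.
```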

As always, feel absolutely free to correct me if I am going wrong anywhere in the above.

—

Daniel #114 says:

>> “The claim is that localizing information costs energy…”

That is the real meat of the story here. And it’s a wonderful way to try and forge links between the physical and the information-theoretic viewpoints. That, plus Scott’s ready working out of an estimate of the upper bound, with due consideration to black-hole formation and all, was a marvelous piece of insight. Kudos to him for that, despite that slip-up in #2 and all!

—

Best,

–Ajit

Comment #125 July 24th, 2017 at 12:32 am

Dear Scott

Very inspiring! Even for me, who is on the opposite side, namely that physics is made of pure info–quantum info, to be precise–not vice versa. I will make a post soon on a very strong motivation for this.

Here, I only want to point out an apparent mistake that seems to invalidate the whole reported argument of Daniel Harlow.

Take a state of the field that is constant in time, namely is “stationary”. This is necessarily an eigenstate of the Hamiltonian and of the total momentum. Now, for a boosted observer, all that changes are the values of energy and momentum; the state remains stationary. Therefore, the observer’s motion doesn’t generate time change!

Compliments again!

Mauro

Comment #126 July 24th, 2017 at 1:56 am

I think part of my confusion above was the mistaken idea that Levin universal search (LUS) was a single algorithm that automatically runs in P-time given the specification of a problem that’s in P. In fact LUS is a family of algorithms indexed by witness verifiers. So while there’s an LUS instance (or an algorithm in general) that recognizes L, it might be impossible to explicitly find such an instance or algorithm. Cool.
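[Ed.: the dovetailing idea behind Levin universal search can be sketched in a few lines of Python. This is a toy illustration only: real Levin search enumerates all Turing machines weighted by program length, and `trial_division`, `useless`, and `verifier` here are made-up stand-ins.]

```python
from itertools import count

def levin_search(programs, verifier, x):
    # Dovetail: in phase p, run program i (for i < p) with 2**(p - i) steps,
    # so earlier-listed programs get exponentially more budget per phase.
    for phase in count(1):
        for i, prog in enumerate(programs[:phase]):
            result = prog(x, 2 ** (phase - i))
            if result is not None and verifier(x, result):
                return result

# Made-up candidate "programs", each honoring a step budget:
def trial_division(n, budget):
    for d in range(2, 2 + budget):
        if d >= n:
            return None
        if n % d == 0:
            return d
    return None

def useless(n, budget):
    return None  # a bad program the search must tolerate

def verifier(n, d):
    return d is not None and 1 < d < n and n % d == 0

print(levin_search([useless, trial_division], verifier, 91))  # -> 7
```

The key point matching the comment: the search is indexed by the verifier, and finding the *list* of candidate programs (or knowing which one succeeds) is a separate matter from the search itself running efficiently.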

Comment #127 July 24th, 2017 at 2:10 am

One more good example is the paradoxes associated with “Maxwell’s demon” in classical thermodynamics, which are resolved when you assume that information necessarily has a physical cost.
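[Ed.: the physical cost alluded to here is Landauer’s bound; a quick back-of-the-envelope check with standard constants, room temperature assumed.]

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln 2 of
# heat -- the conventional exorcism of Maxwell's demon.
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # assumed room temperature, K
E_bit = k_B * T * math.log(2)
print(E_bit)         # roughly 2.9e-21 J per bit erased
```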

Incidentally, your argument can be made shorter, since momentum couples directly to gravity. Maybe less famously so, but directly nonetheless. It might be regarded as a consequence of special relativity if you think about the coupling to energy as more fundamental, but I see no (non-sociological) reason to think that way; at the level of the equations they play exactly identical roles.

And I see that someone already mentioned the species problem. Also, in quantum gravity spacetime is necessarily an approximate notion, so no statement making direct reference to spacetime can be regarded as fundamental. Maybe the best way to view entropy bounds is as stating the amount of information necessary to create an approximate notion of spacetime, in the circumstances when this is the appropriate description.

Comment #128 July 24th, 2017 at 7:14 am

Charles #107, Scott #20: You seem to be a bit confused about how Bohmian mechanics works.

It is true that in Bohmian mechanics you can both know the trajectory of the particle – by looking at the slip of paper from the gods – and observe interference.

What is not true is that Bohmian mechanics still reproduces quantum mechanics when you do have access to the particle positions. The whole theory crucially depends on you being ignorant about that, and on your ignorance being described by the probability distribution |\psi(x)|^2. If you break this assumption, you get all sorts of fireworks, for example faster-than-light signalling.

Losing this information-interference tradeoff is the least of your problems.

Comment #129 July 24th, 2017 at 7:23 am

Thanks for clarifying Scott #111. You’re probably right. But my inner nominalist rebels at the “certainly”. Yes, it’s hugely useful to assume fictionalism is false (cf. Hartry Field’s “Science Without Numbers”). Likewise, it’s hugely useful to assume the falsity of e.g. ultrafinitism. Yet do we know this is the case, or are we just expressing our deeply felt intuitions? A nominalist might say that simply helping oneself to an ontology of abstract objects is the kind of sloppy thinking only a mathematician could love. Might “all information is physical” literally be true and exhaust reality – including the physical minds of mathematicians?

I don’t know. Anyhow, fascinating post, thanks again.

Comment #130 July 24th, 2017 at 8:37 am

Giacomo #125: Is your objection the same as that of Tim #108? If so, what do you make of Daniel’s response?

Incidentally, it occurs to me that, even if we completely abandoned the part of the argument dealing with the relation between space and time derivatives (steps 1-4), we would still be left with a strong conclusion about the physicality of information. Namely, that the number of computational steps would be limited to about one per Planck time if you don’t want to create a black hole. Now a priori, it would seem strange to me if a Bekenstein-type upper bound on the number of steps were valid, even though the “corresponding” bound on the number of stored bits were invalid…

Comment #131 July 24th, 2017 at 8:53 am

David #129: The reason I’m certain about this is that being able to talk about the positive integers—the whole countable infinity of them—is a conceptual prerequisite to being able to talk about things like cosmological bounds on the amount of information in the physical world.

Thus: physicists now say that our causal patch of de Sitter space is limited to ~10^{122} qubits. But in making such a statement, it’s already implicit that, had the experimental data been a bit different—or, say, if it’s discovered tomorrow that the cosmological constant is going to decay to a smaller positive value—the estimate might need to be revised a bit upwards, let’s say to 10^{126} qubits. But if you agree that it could be revised a little bit up, then why not a lot up? But by then you’ve ceded to me that arbitrarily large integers are meaningful to talk about, prior to making any cosmological observations.

Comment #132 July 24th, 2017 at 9:07 am

There exists a letter from John von Neumann to Marston Morse, dated March 2, 1955, that anticipates several of the themes of this Shtetl Optimized discussion.

In his letter, von Neumann recommends that Marston nominate as the AMS Josiah Willard Gibbs lecturer either John von Neumann or Julian Schwinger for 1956, and Stanislaw Ulam for 1957, on the grounds that (in von Neumann’s words):

Now, 62 years later, “these matters are not resolved”, in at least two senses that are central to the discussion here on Shtetl Optimized.

First, we still have no quantum field theory that is both mathematically rigorous (by von Neumann’s standards) and generally relativistic. Second, ongoing and even accelerating advances in “random path” methods—which nowadays include both large-scale classical molecular simulation and large-scale quantum unravellings—have to date been sufficiently vigorous as to preserve the pragmatic viability of the Extended Church-Turing Thesis, against all the ingenious assaults of Quantum Supremacists.

Extending von Neumann’s reasoning in the light of these two subsequent developments, it is natural to postulate that in our universe, Nature requires that quantum Hamiltonians originate in gauge field theories that are sufficiently informatically lossy, as to ensure that the Extended Church-Turing Thesis is true.

In this postulated ECT-compatible universe, the low-dimension tensor network state-spaces that are proving to be so strikingly and unexpectedly effective in today’s computational quantum simulations — and effective too in AI and machine learning — are the fundamental state-spaces of Nature.

In a nutshell: the Extended Church-Turing Thesis is true, and the ambitions of quantum supremacists are unachievable, because Nature’s gauge-mediated quantum trajectories unravel on low-dimension algebraic varieties.

Here we have a 21st century quantum postulate that is mathematically well-posed and physically well-grounded, for which in von Neumann’s 1955 phrase “either alternative would be interesting and significant.”

In its practical and even strategic ramifications, this postulate also satisfies a criterion that von Neumann expressed in a letter to Robert Oppenheimer (dated February 19, 1948).

PS: Consonant with the Gibbs lecture tradition of “contact with the strivings and problems of the world that surrounds us”, the AMS Gibbs lecturer for 2018 will be a wonderfully appropriate choice—as it seems to me anyway—the complexity, cryptography, and machine learning theorist Cynthia Dwork! 🙂

Comment #133 July 24th, 2017 at 9:33 am

Mateus #128: The question is how you modify Bohmian mechanics to the hypothetical situation in which the particle positions are given to you on a slip of paper from the gods.

Your assumption is that the particles continue following the same guiding potential as before, and therefore you can deterministically predict where the particles will be in the future, and then make a measurement that deterministically confirms it. The trouble with that view is that, merely by making a lucky guess about something that normally doesn’t enter into physical predictions at all (namely, Bohmian particle positions), you could then observe a clear violation of quantum mechanics.

My alternative view is that, since we already constructed Bohmian mechanics in the first place to reproduce all the predictions of QM, we should continue to do so even in the hypothetical scenario with the slip of paper from the gods. In that case, a Bohmian would presumably say: sure, you can know both the quantum state of the system and the Bohmian particle positions, as long as the state evolves in isolation. But as soon as you make a measurement, the entanglement between the system and your measuring apparatus (whose hidden-variable values you don’t know) has the effect of re-randomizing the system’s hidden variables, and making them distributed according to the Born probabilities after all. In other words, we’d take it as part of the definition of a hidden-variable theory that the variables are actually “hidden”: that only the wavefunction can enter into the calculation of probabilities.

The one thing I’m still unclear about is this: can the latter picture be derived purely within Bohmian mechanics, by analyzing what happens when the system becomes entangled with the measuring apparatus, and the two sets of hidden-variable values interact with each other? Or does it require modifying Bohm’s guiding equation?

Comment #134 July 24th, 2017 at 10:55 am

Scott,

A great post! I especially liked how it cannot be that there are Emily and Ernie electrons and we don’t know that. Of course, I knew about identical particles, but then I did not know, until I read this – you know what I mean. Thank you!!

(However, I think you ARE in fact saying “any information that has observable physical consequences, has observable physical consequences” in a couple of places:

“And conversely, if a degree of freedom doesn’t interact with the stuff we’re observing—or with anything that interacts with the stuff we’re observing, etc.—well then, who cares about it anyway?”

and

“If you like, such a field would at most be a comment in the source code of the universe: it could be as long as the Great Programmer wanted it to be, but would have no observable effect on those of us living inside the program’s execution.”

I have not read all the comments – probably you have explained them in one of them.)

Comment #135 July 24th, 2017 at 11:58 am

Scott #133: I’d like to point out that the approach I presented is not “my” assumption, but what the Bohmians themselves do: keep the dynamical equations as they are, and see what happens when you change the distribution of hidden variables. It is called non-equilibrium Bohmian mechanics.

Moreover, I don’t think it makes sense to keep reproducing quantum mechanics when given the slip of paper from the gods: the whole point of Bohmian mechanics is that in this case evolution is deterministic, in direct contradiction with quantum mechanics.

The theory you are describing is not what happens when you get the slip of paper from the gods, but what happens when you yourself make a measurement to find out what the hidden variable is. Then of course you are going to get entanglement between system and measurement apparatus, and the usual information-interference trade-off.

Comment #136 July 24th, 2017 at 1:40 pm

Scott, this post rocks! It really includes a synthesis of some of the most important progress in conceptual physics that is occurring as we speak (listen). I concur pretty much totally with your analysis, and with how AdS/CFT, though it wasn’t particularly emphasized here, really brings together what’s going on with the finite information content that a given surface in space can hold.

One thing that I wish this understanding would bring about with physicists is a re-analysis of why the current cosmological constant is so small; that is, why it’s now so small relative to the energy density of space at or near the time of the big bang. It seems to me, as with many other people thinking about this, that it’s a big mistake to think that the acceleration of the universe was only occurring right after the initialization of the big bang and then just stopped until recently. Why in the world do physicists assume that?

Isn’t it much better to assume that the early universe was a hot gas of kinetic energy that has always continued to accelerate outward, at an ever slowing rate, from that moment till today? There is no reason to assume space ever expanded faster than c. Doing that throws the baby out with the bath water. The dark matter showing up everywhere is, at least in this not so humble opinion, a manifestation of the relativistic effect on the limited information content of a given chunk of space.

In this way, as the universe accelerates outward, mass is generated at ever larger scales using the building blocks of mass created at earlier epochs in the expansion. As this mass builds up, naturally the acceleration would slow. It seems to me that people with a lot more math ability than me should work on this. I think scientists have been jumping to conclusions when they concluded there was faster than light inflation. I think entanglement of information relativistically can account for the homogeneity and isotropic nature of space.

Comment #137 July 24th, 2017 at 2:00 pm

Mateus #135: I’m probably less interested in what the Bohmians do than in what they ought to do! 🙂

But yes, I’m aware of non-equilibrium Bohmian mechanics; I had conversations about it with Antony Valentini when I was at Perimeter ~13 years ago. I apologize for not realizing right away that a Bohmian would probably just consider the scenario with a slip of paper from the gods to be an instance of non-equilibrium BM.

On reflection, though, there’s a technical question here that I don’t know the answer to, and you haven’t answered either. Namely, if I measure a non-equilibrium state, does the interaction with a macroscopic measuring device (assuming the latter is in equilibrium) essentially always force it back to equilibrium anyway? If so, then what I wrote in my first comment about this is wrong in principle (under the non-equilibrium interpretation) but right in practice.

Comment #138 July 24th, 2017 at 2:44 pm

Scott #85: Awesome post, thanks so much for your ongoing contribution to communicating science.

I have just one comment. You mention that it is dark energy (discovered, as you mention, in 1998) that limits our ability to see beyond about 20 billion light years (it’s 46 billion actually). I don’t think this is right. In any expanding universe there will always be a horizon beyond which spacetime is stretching stars away from us faster than the speed of light. Dark energy is affecting that bound, but there would still be a bound – the Hubble sphere – without it.

Comment #139 July 24th, 2017 at 3:44 pm

This question can be mathematically sharpened to:

The short answer is “yes”.

Physically, this principle identifies the thermodynamic notion of a zero-temperature reservoir with the informatic notion of an optimal feedback controller. Conversely, non-optimal feedback controllers are identified with finite temperature thermodynamic reservoirs.

So in a nutshell, temperature is a control parameter, and an empty vacuum is an optimal controller that drives systems to zero temperature.

Mathematically, this principle can be applied to prove concrete algebraic integral identities that link Q-representations of quantum thermal states to positive P-representations of those states.

Computationally, this principle helps us to appreciate why coherent states are generically efficient in computationally unravelling the nonequilibrium dynamics of finite-temperature systems.

Comment #140 July 24th, 2017 at 4:09 pm

John Sidles #139: No, my question was specifically about Bohmian mechanics.

Comment #141 July 24th, 2017 at 4:21 pm

Scott #93: #1 (reversibility) is an input from fundamental physics, with real consequences, although thermodynamics can draw conclusions without it. The others are just details determining how we go about measuring information.

Comment #142 July 24th, 2017 at 4:36 pm

Thank you for the Bohm-context clarification, Scott.

Now I am interested in precisely how, in Bohmian mechanics, the identification of thermodynamic temperature with control optimality — an identification that is natural, useful, and illuminating in the Church of the Larger Hilbert Space — can be concretely demonstrated.

To paraphrase von Neumann’s letter to Marston (of #90)

Considerably to my amazement, an arxiv full-text search finds dozens of preprints that concern themselves with “Bohmian+entropy+thermodynamic*+control”.

Out of this large set, perhaps some Bohmian quantum mechanics expert can recommend specific references? This filtering would be very welcome.

Comment #143 July 24th, 2017 at 4:47 pm

A quick note about Bohmian mechanics. Even without any special access, one can both see the interference and know in retrospect which slit the particle went through. Since trajectories cannot cross, particles that went through the upper slit end up on the upper part of the screen, and those that went through the lower slit end up on the lower part.
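[Ed.: the no-crossing property follows from the first-order character of the guiding equation; standard form below.]

```latex
% Bohmian guiding equation: the particle velocity is fixed by the phase of psi,
\dot{\mathbf{x}}(t) \;=\; \frac{\hbar}{m}\,
    \operatorname{Im}\frac{\nabla\psi(\mathbf{x},t)}{\psi(\mathbf{x},t)}
    \bigg|_{\mathbf{x}=\mathbf{x}(t)} ,
% a first-order ODE, so the trajectory through any point is unique and
% trajectories cannot cross in configuration space -- hence upper-slit
% particles land on the upper half of the screen.
```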

Comment #144 July 24th, 2017 at 5:40 pm

Waterbergs #138: Thanks! I decided to investigate whether you were right. But the Wikipedia article on event horizons seems to back me up here. It defines cosmological event horizons, which sound like exactly the relevant concept, and then says:

Examples of cosmological models without an event horizon are universes dominated by matter or by radiation. An example of a cosmological model with an event horizon is a universe dominated by the cosmological constant (a de Sitter universe).

A universe without a cosmological constant would still have particle horizons, but those don’t represent an ultimate limit on where you can get signals from.

Of course, anyone who understands these matters is extremely welcome to set us straight.

Comment #145 July 24th, 2017 at 6:00 pm

Scott #137: If you are talking about a system about which you know the position — from the divine slip of paper — the answer is simply no. The evolution is deterministic, there is no way randomness will be introduced by evolving the system in time. If instead what you have is a probability distribution over the positions, then yes, the Bohmians always claim that generically probability distributions will relax to |\psi|^2 under unitary evolution, no appeal to measurement apparatuses needed.

But I think I understood what the confusion is about: in your first comment you were thinking about a situation where in every round of the experiment the gods will give you the same piece of paper. Then the photon would always go through the same slit, which would be a gigantic contradiction with quantum mechanics, and only be acceptable in non-equilibrium Bohmian mechanics.

However, I think the best way to model this situation is simply to have a position x_i in each round i, in such a way that in every given round you do find the particle going through the slit the gods promised it would, but if you take the ensemble average over the x_i you find that they are distributed according to |\psi|^2, and therefore the probabilities you find agree with quantum mechanics.

In this way we can do everything with plain vanilla Bohmian mechanics, no mention of non-equilibrium needed.

Comment #146 July 24th, 2017 at 6:52 pm

Impatient reader here :0)

Any resolution on the Tim v. Daniel front? Are we seeing an unravelling of the argument, loss of interest in defending / attacking the argument, or a successful defense of the proof?

Thanks!

Comment #147 July 24th, 2017 at 7:13 pm

Mateus #142: No, I was thinking about a situation where you have a divine slip of paper for the positions of the particles you’re measuring, but not for the positions of the particles in your measuring apparatus, which are distributed in the usual |ψ|^2 way. In such a case, it seemed completely plausible to me that an Avogadro’s number of “random” particles would overwhelm a tiny number of “non-random” ones.

Comment #148 July 24th, 2017 at 7:46 pm

Ian #143: Here’s my current thinking about it:

1. The conclusion of the argument—that (with a suitable light-sheet definition) the number of qubits scales at most with surface area in Planck units—is a cornerstone of modern physics. It beggars belief to me that this conclusion would be wrong because of some simple error that was overlooked by everyone for 20+ years until the comments section of this blog post.

2. Having said that, the usual arguments for the conclusion are much more technical than my attempt at a conceptual verbal summary in the post. If they’re more technical, it’s probably for a good reason. It’s possible, even likely, that my verbal summary misses some important aspects of what’s going on—even if the things it misses were not egregious enough for (say) John Preskill or Sean Carroll to have flagged them.

3. The steps in the argument that have attracted the most criticism—namely, 4 and 7—are precisely the steps that I myself was most stuck on and needed Daniel Harlow to explain to me. I’m still in the mode of asking questions of both sides in an attempt to get to the bottom of it.

4. As I explained in comment #130, even supposing we dropped steps 4 and 7, we’d still be left with a powerful statement about “the physicality of information”: in this case, an upper bound on the number of computational steps that could take place within a given time interval without creating a black hole. It just wouldn’t be quite the same statement that I’d claimed.

Comment #149 July 24th, 2017 at 10:11 pm

Ian #143

I haven’t found Daniel’s later post responsive. So that’s where I stand.

In response to Scott: as you know, I currently have a paper arguing that the whole “black hole information loss paradox” is due to a pretty simple error that occurred more than 40 years ago. So my belief is not easily beggared in this context. (Just today I think I have sewed up the last loose ends of that paper.)

Regarding your point 4, I can’t see where even the conceptual apparatus comes from. If you insist on analogizing the physical time development of the universe to a computation, it is an analog computation (so long as we are using differential equations) not a digital or Turing machine one. I have no idea how to count “computational steps” in such a setting.

Comment #150 July 24th, 2017 at 10:11 pm

One of the greatest blog posts in the history of physics.

Comment #151 July 25th, 2017 at 12:05 am

Ian #146, the short answer is that I have better things to do than debate with would-be skeptics who can’t be bothered to understand free field theory. The argument Scott describes in points 4-7 is one intuitive way to think about a precise theorem in quantum field theory, which you can read about in https://arxiv.org/abs/0804.2182. It definitely isn’t wrong. My previous two comments give a bit more quantitative intuition, which may help some people.

Also, to whoever asked about horizons in expanding universes above, Scott is right. In an expanding universe dominated by matter or radiation, if you wait long enough you can eventually see anything.

Comment #152 July 25th, 2017 at 1:49 am

Scott #147: That seems plausible to me, but I don’t see what this has to do with the subject at hand. We’re not talking about the evolution of the system after the measurement, but just about how the result of the measurement depends on the particle’s position.

And every single Bohmian paper I have ever seen just assigns a position to the particle, ignoring the hidden variables of the measurement apparatus, and from the quantum potential derives a deterministic trajectory that connects this initial position to the measurement result. Are you trying to argue that this somehow does not work? That the initial position does not in fact determine the measurement result? I’m very confused about what your point is.

Comment #153 July 25th, 2017 at 2:19 am

Scott #144

Hi Scott, thanks for your prompt and as always courteous reply to a pipsqueak raising an issue. But (you knew there was going to be a but!) I do think that you are wrong on this one. There are two horizons in question here as I understand it: an event horizon and a particle horizon. The latter is the furthest point from which light emitted in the past can have come and reached us, and is, I think, about 46 billion light years away from us. This seems to be the horizon you are referring to in comment #85.

The existence of such a horizon doesn’t depend on the cosmological constant—which has only really kicked in at about 9 billion years into the expansion of the universe, as it grows in influence as the amount of spacetime grows. It has just speeded up the expansion a bit. I think in most expanding universes (even with no cosmological constant) such a horizon will exist. If one thinks of the 2D analogy of the surface of an expanding balloon, every point sees every other point receding with an apparent velocity proportional to the amount of spacetime between the two points. So to get an arbitrarily large apparent recession velocity (such as one greater than the speed of light) one just needs to go far enough away. I think the same applies in our universe with 3 spatial dimensions.

As you say, maybe we need a cosmology expert to kick in here and clarify for us.

Comment #154 July 25th, 2017 at 2:48 am

Tim #116: I think the issue is that if you have a particle with zero (or very low) momentum, its wave function is spread out, so its probability of being in a particular region of space is zero (or almost zero), and so the information contained in that region of space coming from that particle is very small.

Comment #155 July 25th, 2017 at 10:10 am

Will #154

My point was just that the spatial variation in one frame at a moment plus the Lorentz transformation does not yield the temporal variation in any frame. The example is just the easiest to see, but the point is universal.

Comment #156 July 25th, 2017 at 10:22 am

Mateus #152: I’m no longer trying to win an argument; I concede that the situation with the slips of paper can be understood in terms of non-equilibrium Bohmian mechanics. My curiosity was pulled in a different direction: even if (say) Valentini was right, and non-equilibrium Bohmian states abounded in our universe, would exploiting those states for superluminal communication and so forth still be a practical near-impossibility, because the interaction with the equilibrium states of the macroscopic measuring devices would pull the particle positions back to equilibrium anyway? This seems completely plausible to me, but it’s fine if you’re not interested to go there.

Comment #157 July 25th, 2017 at 10:28 am

Waterbergs #153: Rereading comment #85, the resolution is simply that I was sloppy there, and didn’t properly distinguish between particle horizon and cosmological event horizon. But when applying the holographic entropy bound, it’s clearly the latter that’s relevant: we care about the maximum number of bits that can ever be stored, and the maximum size of a computation that can ever be done. And we agree, I think, that the event horizon is finite in a de Sitter universe, but infinite in a universe with no cosmological constant.
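[Ed.: for scale, the finite de Sitter number mentioned earlier in the thread (~10^{122} qubits) can be estimated directly from the horizon area. A rough sketch using the measured cosmological constant; O(1) and ln 2 factors are ignored, and the numerical inputs are the usual quoted values, not from this post.]

```python
import math

# de Sitter horizon entropy via S = A / (4 l_P^2), with A = 4 pi r^2
# and horizon radius r = sqrt(3 / Lambda).
Lambda = 1.1e-52      # cosmological constant, m^-2 (approximate measured value)
l_planck = 1.616e-35  # Planck length, m

r_horizon = math.sqrt(3.0 / Lambda)          # ~1.7e26 m
S = math.pi * r_horizon**2 / l_planck**2     # dimensionless entropy
print(f"{S:.1e}")                            # on the order of 1e122
```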

Comment #158 July 25th, 2017 at 10:50 am

Scott #155: I still don’t see how the macroscopic measurement device pushing the state back to equilibrium would affect the capability of predicting the measurement result. For me it is clear that this would make the post-measurement state unpredictable, but we’re not talking about the post-measurement state, are we?

What I understand by having a non-equilibrium state is having the capability to prepare an ensemble of quantum states with enough control over the positions that their ensemble average is not distributed as |\psi|^2, but something much sharper. Since a deviation from this distribution must cause a deviation from the Born rule probabilities, and a deviation from Born rule probabilities implies superluminal signalling and assorted fireworks, it seems to me that having non-equilibrium states means having the ability, in practice, to do this stuff.

Comment #159 July 25th, 2017 at 10:52 am

Is there any Bohmian hanging around that knows the answer to this?

Comment #160 July 25th, 2017 at 11:14 am

Scott #148

” in this case, an upper bound on the number of computational steps that could take place within a given time interval without creating a black hole.”

I guess that analysis also takes care of the cases where one tries to use special relativity to take advantage of time dilation? (i.e. the energy necessary to accelerate close enough to the speed of light makes it too expensive).

Comment #161 July 25th, 2017 at 12:30 pm

Douglas #141: Every physical theory has a conceptual core, surrounded by an accretion disk of auxiliary facts that are crucial to applying the theory in practice.

Now, I’m on record as saying that I always prefer to learn about physics by starting at the core of a theory, even if that’s completely ahistorical—even if it took decades of clearing away crud in the accretion disk before the core even came into view. E.g., first learn that states are complex unit vectors and evolution is unitary, and only later learn about bosons and fermions, E=hν, and the uncertainty principle.

Thermodynamics, it seems to me, follows exactly this same pattern—except with the interesting twist that almost all the actual physics input belongs to the accretion disk, the core being almost entirely pure math (again, except for evolution being reversible). So, yes, start by teaching the information-theoretic core! But then if you want to say something about heat and temperature—which is the part of the story that always confused me, precisely because it’s not completely information-theoretic—you need to move out to the accretion disk.

Comment #162 July 24th, 2017 at 12:32 pm

fred #160: Yes, that’s right.

Comment #163 July 25th, 2017 at 12:38 pm

Mateus #158: No, I wasn’t talking about the post-measurement state, but about the classical measurement outcome. My point was that, as far as I can see, there’s no principle telling us that even the measurement outcome won’t get pushed back to “equilibrium” (i.e., to the Born probabilities) during the process by which whatever observable we’re measuring gets recorded macroscopically. I’m sure there are papers by Bohmians analyzing this issue; I just don’t know where they are or what conclusions they reach.

Comment #164 July 25th, 2017 at 1:22 pm

Scott and Fred #160 & 162 “i.e. the energy necessary to accelerate close enough to the speed of light makes it too expensive”

This is a very good clue as to why the cosmological constant is tiny today, yet still consistent with the 10^124 number at the time of the big bang. Even today, relativistic physics is hard enough to wrap your head around that not all of its implications are understood.

Comment #165 July 25th, 2017 at 1:28 pm

@Tim #155 Sorry, I misunderstood. We have to distinguish two claims:

1) The spatial variation of a field in some Lorentz frame determines the energy in that (or some other) Lorentz frame.

2) The spatial variation of a field in some Lorentz frame lower-bounds the energy in that Lorentz frame.

The first claim is false. You correctly debunked it earlier with the example of a zero-momentum particle.

However, Scott didn’t make that claim – or at least not intentionally. Instead he made the second claim:

> If we know how quickly a field varies across space, then we can lower-bound how much energy it has to contain.

Do you disagree with this?

In the case of the wave function of a spinless particle, we can see this, for instance, in the equation Daniel gave. For a classical relativistic particle or substance, I guess this comes from the fact that the stress-energy tensor is nonnegative when contracted with the metric tensor, so large momentum implies large energy.
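Will’s lower-bound claim can also be sketched for a free real scalar field: in the standard Klein-Gordon Hamiltonian, every term in the energy density is nonnegative, so dropping all but the gradient term only decreases the total (a textbook expression, offered here only as an illustrative sketch):

```latex
E \;=\; \frac{1}{2}\int \mathrm{d}^3x \left[\, \pi^2 + |\nabla\varphi|^2 + m^2\varphi^2 \,\right]
\;\ge\; \frac{1}{2}\int \mathrm{d}^3x \, |\nabla\varphi|^2 .
```

So rapid spatial variation of the field forces a large gradient term, and hence a large total energy, whatever the mass term contributes.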

There is an additional issue, which is how Scott’s argument to verify this with a Lorentz transformation is supposed to work. As far as I can see, the right way to verify this with Lorentz transformations is to assume that the energy is nonnegative (or has some finite lower bound) in each reference frame. In this case, if the momentum is large but the energy is small, we can take a Lorentz transformation to a frame where the energy is arbitrarily negative.

So I think Scott’s claim is true, and his argument with a Lorentz transformation does at least point towards the actual argument.

Comment #166 July 25th, 2017 at 2:31 pm

Scott and Mateus: The exact initial position of the incoming particle would determine where it lands on the screen in a 2-slit experiment, and the exact initial positions of the pair of particles in a Bell experiment determine the exact pair of outcomes (together with a macroscopic description of the apparatus). The detailed apparatus positions are not necessary to determine the outcome. They could not be, or else the Bell correlations would fail. So Mateus is right on this point.

Will: I do disagree, and the same counterexample proves that not even a lower bound can be derived. Daniel pulled out the Klein-Gordon equation, and if you look you will notice that the equation contains an “m”. That is the rest mass of the particle. And that, of course, lower-bounds the energy of the state by E = mc^2. But that lower bound has nothing at all to do with spatial variation taken by itself. From the spatial variation at a time you have no clue about what the relevant rest mass is.

Daniel has retired from comment. But I will note one more thing. The paper he posted as somehow rigorously establishing the conclusion is all about the *von Neumann* entropy of states. The von Neumann entropy has exactly nothing to do with the thermodynamic entropy, with temperature, or with energy. Mixing up von Neumann entropy with Shannon entropy with thermodynamic entropy with statistical-mechanical entropy (either Gibbs or Boltzmann) is more or less a hallmark of this literature, starting with Bekenstein. A proper analysis has to begin by carefully separating these distinct notions of “entropy” and arguing what connection—if any—there is between them. That will not get settled here, but if you want some advice before reading a paper, be sure to check how clearly it is stated which entropy is at issue and why it is a notion of entropy relevant to the question at hand.

Comment #167 July 25th, 2017 at 3:05 pm

Tim #166: You might be completely right that the detailed apparatus positions don’t matter. Not knowing enough about Bohmian mechanics, I just raised it as a question; I didn’t claim an answer.

But I don’t understand the following argument at all:

They could not be, or else the Bell correlations would fail

If the probabilities of measurement outcomes were determined entirely by the quantum state, the Bohmian positions never even entering into it—for example, because the interaction with the measuring apparatus had “re-equilibrated” the positions—then clearly you *would* see Bell correlations, because standard QM predicts them.

Comment #168 July 25th, 2017 at 4:21 pm

Scott # 167

The issue to think about is not whether the measurement *probabilities* are determined by the quantum state, with the positions not coming in, but what determines the particular *outcomes*, which are the things that display the empirical statistics. What I was thinking of was just the simple EPR perfect correlations. The particular results of such an experiment—whether it is Up-Down or Down-Up—depend on the particular initial locations of the particles. Through the entanglement in the wave function, as soon as one of the particles interacts with a measurement device, the conditional wave function collapses to one state or the other, depending on the position of the particle interacted with. My thought was that if the particle positions in the apparatus, in addition to the position of the measured particle, also determined the outcome, then there would be no guarantee that the conditional wave function would have the right form to enforce the perfect correlation on the other side. Now that I have written it down, though, I can see ways around it.

In the usual analysis, the measurement device is just represented by a potential term in the Hamiltonian, which obviously is independent of the detailed particle positions in the apparatus, and in that analysis everything comes out fine. I have to think a little more about whether one could easily compensate for having the particle positions in the apparatus play a critical role in determining the outcome. I guess I can say for sure that in the usual analysis they certainly don’t.

Comment #169 July 25th, 2017 at 4:27 pm

Regarding the double slit experiment – interference + detection of which path was taken:

https://en.wikipedia.org/wiki/Afshar_experiment

Seems consistent with

https://en.wikipedia.org/wiki/Transactional_interpretation

hmm…

Comment #170 July 25th, 2017 at 5:12 pm

Thanks for this post. Just a simple remark: a constant field can affect the information content of the universe in a much less trivial way than suggested in this post. If that field has a non-zero potential energy, it will contribute to the energy content as dark energy, which in turn affects the size of the cosmological horizon, in turn affecting the total amount of information visible to any particular observer (even an immortal one).

Comment #171 July 25th, 2017 at 11:55 pm

fred #37 and so on:

Information as a concept *is* divorced from consciousness. The typical example is DNA which encodes the information necessary to produce an organism. No consciousness required.

Comment #172 July 26th, 2017 at 3:14 am

First of all, great post! I am not very knowledgeable about these things, but I have one query:

> Why is this assumption justified? “Because experiment bears it out,” the physics teacher explains—but we can do better. The assumption is justified because, as long as the degrees of freedom that we’re talking about all interact with each other, they’ve already had plenty of time to equilibrate.

Why do you assume that all degrees of freedom in this universe will interact with each other? What if, for example, by accident someone comes up with an instrument which detects a new type of behaviour, and its degree of freedom is totally independent of the ones we know today?

Comment #173 July 26th, 2017 at 8:01 am

#171 RKM

That’s a good point.

DNA, memory, thermostats, and neurons are all micro systems having a state that’s dependent on (hypersensitive to) some global properties of outside macro systems.

E.g. a thermostat’s state depends on the temperature of the room it’s in.

E.g. the state of a handful of neurons in the brain of a cosmologist is highly dependent on the shape of very distant clumps of matter (galaxies).

One could find what those dependencies are through careful experiments – e.g. finding that something shaped like a cat, a chair, or a cloud, or a certain sound is a sort of invariant in the studied system’s state changes. Then possibly build a growing list of such concepts/symbols, and then infer a dictionary representing the “world vision” of the studied system.

That said, to make sense of it, we would have to be able to interpret it in terms of the structures in our own brain.

In that sense the interpretation of information is always subjective – because it’s always done by some other sentient system (one mystery is how life kickstarted this, or the question of when a baby/fetus becomes conscious).

E.g. if I handed you an embedded microchip system, you would have a tough time deciding whether it’s been programmed in a brand-new high-level programming language, programmed by hand in low-level assembly, or just mass-produced as is. There’s no denying that there’s “information” in there, but how to interpret it, and at what level of abstraction, depends on the observer (a computer works by just “running” the laws of physics on its atoms; there’s no such thing as software “running” the hardware).

E.g. the understanding of mathematics of Ramanujan may not be entirely transferable to another average human brain (is all mathematical knowledge always expressible as symbols on a 2D sheet of paper, which is the standard way for humans to encode math information?).

E.g. Go strategies used by Alpha Go may not be apparent/make sense to human players.

That’s a real issue going forward with AIs, because we would like to understand (at least qualitatively) what motivates the conclusions/decisions of an AI, even if it always appears to be right. If we ever give up on this and accept those AIs as impenetrable black-box oracles (gods?), humanity would lose a lot and start to stagnate. There is already research on adding extra AIs on top of such AIs just to interpret their decisions in ways that are more intelligible to us (adding recursive introspection akin to consciousness?).

Comment #174 July 26th, 2017 at 9:18 am

Nikhil #172: Please read further—that question is answered in the post!

(Basically, a totally decoupled degree of freedom doesn’t matter for thermodynamics. And if we’re doing equilibrium thermodynamics, then an interacting degree of freedom is assumed to have already equilibrated.)

Comment #175 July 26th, 2017 at 10:15 am

Just saw an ad for a book likely to be of interest to those in this discussion:

Picturing Quantum Processes

A First Course in Quantum Theory and Diagrammatic Reasoning

Comment #176 July 26th, 2017 at 10:24 am

Maybe another comment about Bohemian mechanics will help. It makes no sense to say of an individual particle, or the particles in an individual measuring device, that it is (or they are) “in equilibrium” in the relevant sense. What can have an equilibrium distribution (or not) is a large *ensemble* of systems which all have the same conditional wave function. It is contentful to ask of such a large ensemble whether the particle positions are (approximately) psi-squared distributed (relative to the conditional wave function). It is this condition that assures that the empirical statistics in a Bohmian universe will match quantum-mechanical predictions. Asking whether the state of a *single* system is or is not in quantum equilibrium is a category mistake. You can only ask of large ensembles. Asking about a single system is the analog in stat. mech. of asking whether an individual atom in a gas is “in equilibrium”. The gas can (or cannot) be: for the atom it makes no sense.

Comment #177 July 26th, 2017 at 10:25 am

Scott:

I was waiting for someone else to notice this thing, but apparently no one has, at least not very directly or saliently. I guess there is still time to point it out. So, OK, here we go.

Scott says in the main post:

>> “And suppose She then created a bunch of new fundamental fields, which didn’t interact with gravity, electromagnetism, or any of the other fields that we know from observation”

A strawman, that one is.

The real issue is not whether God or Mother Nature could or could not create (or (choose to) reveal) a field that *does not* interact with any of the forces/fields already known to us.

The real issue is this:

What if there is a new form of force (in whatever form: as a field of force, or as a “particle” of force, or in whatever other form) which *does* interact (and has been interacting) in reality with the known forms of forces, but only in such regimes of parameters that our observations (including the technological limitations of our instruments) thus far have not been able to detect either the phenomena involving it or its interactions?

Thermodynamics is both a scheme (or to use that much abused word: “framework”) and a body of content (a set of facts, and valid inferences therefrom).

Qua a scheme, thermodynamics would be quick to absorb the new force within itself—for instance, it could simply add a new form of energy in the law of conservation of energy.

By the same token, qua a body of content, the list of forms of force/energy it *presently* acknowledges is *not* closed. We can’t ever bring a “closure” like that to *any* branch of physics. Our list is “complete”—but only within the scope of our present state of *inductively* derived knowledge.

In short, we *can* confidently speak from our knowledge—about the aspects of reality we have already seen. But we *cannot* therefore assert that we have seen all there *is*, in reality. There is no principle which allows us to do that.

Any argument (e.g. one involving “degrees of freedom” or “dynamics” or “equilibrium” or “interactions”) must be consistent with these, more fundamental, facts. And the way to do that is simple: drop the strawman, and add the preface (even if only implicitly): “within the scope (“ambit” etc.) of our current state of knowledge…”. The main hypothesis would be quite fine then.

Sorry, too long a comment, but didn’t know how to compress it—all its (what I think are) *pertinent* aspects!

Best,

–Ajit

Comment #178 July 26th, 2017 at 10:25 am

And I even corrected “Bohemian” to “Bohmian”!

Comment #179 July 26th, 2017 at 10:44 am

Ajit #177: The surprising part is that, whatever new forces or particles might exist in our universe, one can upper-bound (for example) how much energy they could possibly contain, because all energy interacts with gravity.

To me, this is what’s actually much more surprising than the existence of dark matter: namely, that even with our present ignorance, we can upper-bound the *amount* of dark matter, as “merely” a factor of 3 or so larger than the types of matter we’ve accounted for!

In a similar spirit, sometimes people worry about a “turtles all the way down” scenario, where there would be no fundamental theory of physics or fundamental constituents of matter, just nuclei made of protons and neutrons made of quarks made of … and so on ad infinitum. But this is already ruled out! The existence of the Planck scale, and the associated Bekenstein-type bounds, put a clear cap on how far down things can go.
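The two caps Scott mentions are easy to evaluate numerically. A back-of-the-envelope sketch (SI constants rounded; the function names are mine, purely illustrative): the Schwarzschild limit M ≤ Rc^2/2G caps the mass in a region, and the Bekenstein-Hawking/holographic bound of one bit per 4·ln(2) Planck areas caps the storable information.

```python
import math

# Rounded SI constants, for illustration only
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # Newton's constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J*s

def schwarzschild_mass_bound(radius_m):
    """Largest mass a sphere of this radius can hold before collapsing to a black hole."""
    return radius_m * c**2 / (2 * G)

def holographic_bit_bound(radius_m):
    """Max storable bits: sphere area over 4*ln(2) Planck areas (Bekenstein-Hawking)."""
    planck_area = hbar * G / c**3
    return 4 * math.pi * radius_m**2 / (4 * planck_area * math.log(2))

# A 1-meter sphere: roughly 6.7e26 kg and 1.7e70 bits
print(f"{schwarzschild_mass_bound(1.0):.3g} kg, {holographic_bit_bound(1.0):.3g} bits")
```

Both bounds grow only polynomially with R, which is the sense in which the Planck scale “puts a clear cap” on any turtles-all-the-way-down regress.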

Comment #180 July 26th, 2017 at 11:37 am

Scott #179:

1. I respectfully disagree.

2. All energy does interact with gravity (and with any other form of known forces). A co-existence without any form of interaction is nothing but a “gap” in the knowledge.

But the mere fact of existence of interactions does not mean that all forms of energy interact with each other to the same degrees in every possible regime of physical parameters (and therefore can get detected in the regime currently accessible to experimental observations).

We cannot rule out the idea that in the regime of experimental parameters currently available to us, the degree of the interaction of the new XYZ form of force is too small to be detected.

3. I gather (but don’t really know) that the Planck scale is the lower-most bound for the scales on which known force-phenomena can be studied.

OK, I can accept that on “faith”. But only with an emphasis on the word *known*.

4. The argument for the infinite regress of phenomena at ever finer (or larger) scales also is easy to counter.

There is no positive (actually, inductive) evidence in favor of such an assertion. Indeed, inasmuch as it is an *infinite* regress, it can only be a mathematical statement, not one of physical facts (or of completed cognitive processes). On that one count alone, it can be rejected from physics.

At the same time, a closure for the known forms of physical forces also has not been inductively established.

The latter (absence of the closure) does not mean the former (infinite regress).

5. In one respect, mine is the same position as you yourself have argued so well in favor of, in the following passage:

>> “…If the cosmologists revise their models next week,…”

The outer extent of the known universe cannot be taken as an actually existing boundary of the physical universe. Similarly for the forms of force there can be: an unknown form of force was a possibility in ancient times, when the weak interaction had not yet been discovered, and it remains so even today.

Best,

–Ajit

Comment #181 July 26th, 2017 at 11:55 am

Ajit #180: OK, but I never said that there were no new fundamental particles or forces to be discovered. It would shock me if there weren’t! I only said that there’s a known bound on the total energy content of any new particles and forces, and also on the energy scales at which they can appear. The two statements are 100% compatible.

Comment #182 July 26th, 2017 at 12:05 pm

This view is contrary to the literature, both historically and logically. Historically, von Neumann derived his celebrated entropy-expression as a logical consequence of a *Gedankenexperiment* crucially involving the classical thermodynamic entropy of an ideal gas.

The student-friendly literature that is cited in comment #97 works through the details of von Neumann’s derivation.

The short summary is that von Neumann’s logical reasoning about “ω-gases” established, way back in 1935, that the entropy of irreversible mixing — as presented equivalently by the random classical mixing of molecules (e.g. benzene and toluene), by the random quantum mixing of states (e.g. radiative decoherence), and by the random combinatoric mixing of symbols (e.g. Shannon noise) — is a universal phenomenon that is compatibly grounded in classical, quantum, and combinatoric dynamics.

It is natural to wonder whether 21st century researchers will discover further expressions for entropy — expressions that are novel and yet compatible with the classical, quantum, and combinatoric entropy-expressions of the 20th century.

For sure, present-day quantum gravity researchers and varietal dynamics researchers (aka “quantum quasi-skeptics”) have concrete ambitions in this regard! 🙂

Comment #183 July 26th, 2017 at 2:56 pm

John Sidles #182: My remark had to do with the conflation of these various notions of “entropy” in the literature since Bekenstein. Perhaps “nothing to do with” is too strong, but the von Neumann entropy is not the thermodynamic entropy. This is obvious in that the von Neumann entropy of a system is not even extensive: the parts can have much, much higher entropy than the whole. Entropy of mixing can be given a clear sense in statistical mechanics, but the thermodynamic implications of it per se are problematic (hence Gibbs’ paradox, which is a special case of the “mixing paradox”). Likening issues of entropy to issues of Shannon noise is again a strange comparison. The Shannon entropy is definable even if there is a noiseless channel, so it is not as if the Shannon entropy somehow quantifies noise. In any case, mixing is hardly a central notion in thermodynamics, so trying to find connections here through it is already somewhat implausible. What I warned against was mixing up these entropies, and I think the warning stands. The story about how Shannon’s entropy got its name does not need repeating here, but ought to be appreciated.
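Tim’s non-extensivity point can be made concrete with a two-qubit sketch: a Bell pair is pure, so its von Neumann entropy is zero, yet each half alone is maximally mixed and carries a full bit. A minimal sketch (Python with numpy):

```python
import numpy as np

# Bell state |Φ+> = (|00> + |11>)/√2 as a vector in C^4
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(phi, phi)  # joint density matrix (a pure state)

# Partial trace over the second qubit: reshape to indices (i,k,j,l), sum over k = l
rho_A = rho_AB.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

def von_neumann_entropy(rho):
    """S(rho) = -sum_i p_i log2 p_i over the nonzero eigenvalues p_i, in bits."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

S_whole = von_neumann_entropy(rho_AB)  # 0.0 bits: the whole is pure
S_part = von_neumann_entropy(rho_A)    # 1.0 bit: the part is maximally mixed
```

This “parts more entropic than the whole” behavior is impossible for the Shannon entropy of a joint classical distribution, where a marginal never exceeds the joint, which is exactly the disanalogy being pointed at.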

Comment #184 July 26th, 2017 at 4:14 pm

This argument for information –> energy begins by assuming that information is variation across space, and that “increasing information” is “packing the variations in tighter”.

But how can this include information such as an electron charge? The information of the electron charge comes out in that, when we place the electron in such and such field, it moves in such and such way. The information of charge seems to be “stored” in counterfactual worlds — if we did this, we’d have that — rather than “stored” in variations across space.

Comment #185 July 26th, 2017 at 4:37 pm

Tim (#183), your observations at least suggest that it would be an instructive exercise to (attempt to) demonstrate, in von Neumann’s ω-gas framework, that the following three entropy functions:

(1) classical entropy of an ideal gas,

(2) von Neumann quantum entropy, and

(3) Shannon’s entropy function,

in aggregate constitute (what is called) “a Kählerian triple”, which is to say, a set of three structures of which any two structures compatibly specify the third.

Also commended to younger *Shtetl Optimized* readers is the (true? or at the very least, much-celebrated!) physics story about “What von Neumann Said to Shannon”. Needless to say, history has proven von Neumann right! 🙂

Also commended is the recent Simons Institute/*Quanta Magazine* article in which “Kähler structures” appear, namely Kevin Hartnett’s profile of the algebraic geometer June Huh, titled “A Path Less Taken to the Peak of the Math World”:

Without any pretense that I myself grasp the specific (Hodge-theoretic) Kähler package that Huh and Adiprasito are exploring, it’s nice to appreciate that quantum mechanics is blessed with a particularly wide-ranging Kähler package, namely the compatible triple of symplectic, metric, and complex structures on its state-space.

In essence, whenever we contemplate the fundamental Kählerian quantum triple of quantum dynamics, quantum measurement, and quantum operator algebra — or any of the many compatible triples that derive from this fundamental triple (including entropy-function triples) — our mathematical appreciation of any two of these attributes naturally extends our appreciation of the third — sometimes in surprising ways.

This Kählerian quantum package is manifest both in the Hilbert space that quantum supremacists cherish, and in the varietal state-spaces that quantum quasi-skeptics cherish. For me at least, the existence of this shared Kählerian package provides a concrete reason why there need be no fundamental mathematical incompatibility between (triple-respecting) quantum supremacists and (triple-respecting) quantum quasi-skeptics.

To borrow a phrase from Melville “In this we can all splice hands … mathematically!” 🙂

Comment #186 July 26th, 2017 at 5:53 pm

QuanThomas #184: It’s a good question, though it seems effectively the same as the question I answered for wolfgang (see #56 and #63). As I was trying to say there—I probably didn’t do a good job—what’s bounded by Bekenstein-type bounds is effectively the number of bits that you can set arbitrarily and then store in a hard drive with given dimensions, in such a way that they can be reliably retrieved later. It’s not the number of bits needed to describe the laws of physics themselves with all their parameters. The latter we think of as simply being O(1)—or *maybe* there’s an infinitely long message from God encoded into the electron/proton mass ratio, or something like that. But even if so, it would take exp(n) time to decode the first n bits of the message, so it’s not as if it’s very easily accessible. Nor would we, as beings within the universe, be able to modify the message at will (something that *would* violate the Bekenstein bound).

Comment #187 July 26th, 2017 at 6:31 pm

John (# 185) The idea that these different notions of entropy form a Kählerian triple is, how shall I say it, a bold conjecture. I would be more than astonished if it were true. To paraphrase Bell, if someone offers a proof I will listen, but I would not attempt such a proof myself.

Comment #188 July 26th, 2017 at 11:27 pm

Scott # 181:

1. What if, as Dirac pointed out, the supposed universal constants like the G’s and h’s are *not* constants? What if they are *functions* of some other, more fundamental, physical quantities (or parameters)?

Further, what if the regimes involved are such that a significant variation in their magnitudes (from their currently existing values in the evolution of the universe) is required before an interaction of the new force can at all become significant enough that we can at all detect it?

Would the upper bound prescribed by us on the total energy content for that new particle/force still hold? Also in other regimes? Even if our prescription itself is only based on the *assumption* of the constancy (and currently measured magnitudes) of the G’s and the h’s?

If you still say yes, I would have nothing more to offer on this sub-thread.

—

2. However, even if a bit off-topic, I would like to share something here, for developing an appreciation of what the term “regime” means.

I would suggest going through the zoo of the many different *stable* regimes (26, at one count!) that have already been experimentally observed even in such a simple system as the Taylor-Couette flow, i.e., the flow of fluid in an annular gap between two concentric, rotating cylinders (like the flow of the lubricating oil trapped inside a cylindrical bearing). Here are a few refs that give the phase diagrams and the pictures: (i) https://en.wikipedia.org/wiki/Taylor%E2%80%93Couette_flow, (ii) http://chaos.ph.utexas.edu/manuscripts/1087323605.pdf, and (iii) “The hydrogen atom of fluid dynamics”, available at http://perso.ens-lyon.fr/christophe.perge/Fardin_Perge_Revue_SoftMatter_2014.pdf.

—

… 3. Am more or less signing off this thread (though it’s been a very interesting one), but of course would check back a few more times…

Best,

–Ajit

Comment #189 July 27th, 2017 at 2:11 am

Scott #174: Ah sorry, I probably skimmed over that part. Thanks!

Comment #190 July 27th, 2017 at 4:13 am

QuanThomas #184: A bit tangentially related, but there is no information encoded in the electron charge, as we can define it as being 1, and all other charges are integer multiples of it (or some simple fractions in the case of the quarks).

To get your argument going you need some dimensionless constant that is actually (as far as we know) a real number, like the ratio between the proton and electron masses, or the fine-structure constant.

Comment #191 July 27th, 2017 at 10:39 am

Could you expand on your sidenote in comment #19? I see (2.9) and (2.20), in which S_matter <= 2*pi*E*R, but with a fixed density E ~ R^3 and so the exponent here is 4, not 1. The only times I see 3/2 in the paper are as an exponent of A, which ~ R^3. Where should I be looking?

Comment #192 July 27th, 2017 at 12:33 pm

Scott #130: It seems to me that Tim #108 is related to my #125, but I couldn’t follow everything.

To my previous #125, I would just add that indeed, if there are no boundary conditions, a stationary “state” of the field would be space-invariant. Then, by definition, Daniel’s point 1—“anything in the physical world that varies in space—say, a field that encodes different bits of information at different locations—also varies in time”—is correct, but there is no need of “, from the perspective of an observer who moves through the field at a constant speed.” Indeed, if the state is not space-invariant, then it cannot be stationary. But this is not what Daniel wants. If I understood correctly, he wants to show that space variations in a stationary case become time-dependent when regarded by a boosted observer. Right?

Last thing. In our derivation of free quantum field theory from informational principles for an infinite denumerable set of quantum systems, we also derive that the space occupied by a qubit (or a bunch of them) is a Planckian volume, and the time between two cell updates is the Planck time. The fact that we recover the three standard (MKS) units from a purely mathematical dimensionless theory is a consequence of a phenomenon related to discreteness in conjunction with unitarity, namely that the particle mass has a maximum value. The interpretation of such a mass as a mini black hole comes from the fact that at the maximum mass there is no evolution of the quantum state of the quantum network, as for a mini black hole, thus the maximum mass is interpreted as the Planck mass. The other two dimensions (time and space) come from the small-wave-vector limit of the theory. Physics is information!

I love your blog.

Best

Mauro (my second name is the one commonly used)

Comment #193 July 27th, 2017 at 12:51 pm

Charles #191: For the derivation of the ~R^{3/2} upper bound for the ball of radiation, see page 11 of Bousso’s paper, left column (Bousso phrases it as A^{3/4}, where A is the area).

Meanwhile, the O(R) upper bound for massive particles is simply the standard Schwarzschild limit: when a system’s mass exceeds Rc^2/2G, it collapses to a black hole.

Comment #194 July 27th, 2017 at 5:00 pm

John #185

“[…]nobody knows what entropy really is[…]”

haha.

I never really understood entropy (well, that’s not going to surprise anyone).

I take two dice (A and B), I throw them, and I observe that coming out with a total of 6 is more likely than coming up with a total of 2. Therefore “total of 6” has more entropy than “total of 2”.

But, to me, this seems to also point at what I was saying: that information is all in the eye of the beholder:

A total of 6 is really one of these:

6 = A1 + B5 = A2 + B4 = A3 + B3 = A4 + B2 = A5 + B1

A total of 2 is only:

2 = A1 + B1

“total of 6” is a statistical state, which seems only more likely than “total of 2” because we assume that we, as observers of the system, can’t distinguish the individual instances adding up to 6 – all the possible states with a total of 6 have the same amount of information to us.

So, it looks less likely that all the air molecules in a given room will all gather into a corner of the room, but that “extreme” state is really no less likely than any specific state where all the air molecules are evenly distributed.

And, we never see an exploded egg suddenly reconstituting itself spontaneously, but this isn’t any less likely than seeing two (identical) whole eggs breaking apart in exactly the same way either 😛

Comment #195 July 27th, 2017 at 5:08 pm

[addendum to #194]

… and the funny thing is that we, as conscious beings, rely on memory in order to perceive the arrow of time (i.e. perceiving eggs breaking apart… entropy increasing as time moves forward).

But memory basically behaves like eggs that get “unbroken”: from a statistically disordered environment we somewhat build ordered structures (decrease in entropy).

So, consciousness allows us to undo what has been done (at least in our mind)?

Comment #196 July 27th, 2017 at 6:23 pm

Fred #194,#195

It seems to me that you are conflating information with semantics. Information theory does not deal with semantics, even though you want it to. But like a lot of science, this divorce, which seems unnatural to the average person (e.g. can you look at a sentence without reading it and thinking about its import?), is necessary to objectively analyze the information in DNA, disc drives, and so forth. Consciousness is pretty cool, but it is the reason you are conflating these things. Think: the universe makes a huge probabilistic distinction between the broken egg and the unbroken one. (Using shorthand here, hope you get the point; we’re talking about the probability of one state changing to another.) It is your memory—your consciousness—that allows you to play a sequence backwards and forwards, and thus to you the two events seem somehow similar in probability. But they are not.

I believe this is why Dennett calls it an illusion–we use consciousness to make sense of the world, but in so doing we also introduce big biases.

Just remember, too, everything your brain does requires energy, so the system as a whole is not decreasing entropy.

Comment #197 July 27th, 2017 at 7:51 pm

@Ajit R. Jadhav [#188]

Assuming that the variation in the constants is negligible in the volume in question, the argument should still work, giving the “same” bound. Of course the value of the bound changes depending on the values of the constants.

Comment #198 July 27th, 2017 at 11:21 pm

Ben # 197:

Yesssssss!

Comment #199 July 28th, 2017 at 9:49 am

RKM #196

Well, even if you take semantics out of it – if you consider randomness (the flip side of information), there are plenty of fundamental difficulties with trying to quantify it in some “absolute” way (e.g. Kolmogorov complexity is uncomputable).

Comment #200 July 28th, 2017 at 10:44 am

RKM#196

“Think: the universe makes a huge probabilistic distinction between the broken egg and the unbroken one.”

Could it be possible to imagine a universe having a spacetime structure such that there are two halves, symmetrical of each other, each “starting” at the opposite end of the space time, occupying their own half of space and time?

Those two halves eventually coming into contact near the middle of the spacetime block, with observers in each pocket seeing eggs breaking in reverse in the opposite side – the two halves mixing into each other slowly, eventually leading to a sort of zero entropy/frozen equilibrium state in the middle of the space time.

A==========\

……………\==========A’

Comment #201 July 28th, 2017 at 2:04 pm

Those readers not quite up to date will enjoy:

“A brief history of Quantum Alternatives” from today’s Ars Technica:

https://arstechnica.com/science/2017/07/a-brief-history-of-quantum-alternatives/

Comment #202 July 29th, 2017 at 4:00 am

Tim #30, Scott #39,

Sorry to go back to the earlier part of the discussion, got here late… playing catch-up. You make a nice point;

“The amount of information stored on any Cauchy surface in GR is the same as the amount on any other, since the state of any Cauchy surface is entailed by the state of any other (+ the Einstein Field Equation).” Then it’s applied using a variation of the ‘twin paradox’ from special relativity, pushing things into the boundary of a light-cone.

There could be complications reconciling this with Scott’s framework; the geometric object you push the “information” into will vary as a function of that same information, because it distorts the light-cone, as a forcing term in the Einstein-Equation.

Normally it would be good to have the information in the same ‘container’ independent of the informational contents, but this may be asking too much since it’s not clear a priori how to compare the containers, which live in distinct universes. So the first question is how to explicitly state the main problem; we need an explicit way to compare different spaces such that things can vary in a family of regions which are essentially identified. I’m guessing the intuition is that you can change the contents by taking out and putting back in bits and pieces, but since this changes the global space, including the container, it needs some elaboration.

There’s an analogous issue relating to time variation; the discussion makes clear that the information is represented by a field which varies in time. Classically, Shannon information would be represented by memory (in bits) fixed over time. So for it to represent (or be) information at all, it might be good to at least have something like a periodicity property (so that information could be recovered reliably). The Shannon information context here is a bit different from the physicist’s unitarity context. But from the discussion so far I get the impression that the contents of the information might affect the periodicity, so some kind of bound is desirable.

This isn’t really my area, so apologies if I’m way off-base. I’d be interested in one of these days understanding better the holographic framework for FT and GR.

Comment #203 July 29th, 2017 at 5:19 am

There is a big problem with your response to the skeptic’s argument.

Either all you mean is that the amount of information in a black hole is bounded *given the physical laws*, or you mean it is somehow fundamentally true even when we consider physically impossible events. In the first case the skeptic’s challenge can be dismissed immediately, since god intervening and adding those extra fields is itself a violation of physical law. In the second case you can’t assume that if these fields vary it creates spacetime curvature, because god just gets to come in and say ‘nah, you know what, that rule about varying fields -> energy -> curvature only applies to those fields you already know about… these new ones I’m adding are special and don’t satisfy that rule.’

Comment #204 July 29th, 2017 at 6:38 am

Peter #203: What you’re missing is that there are more fundamental and less fundamental physical laws. Quantum mechanics and general relativity (and of course, whatever it is that unifies the two) are like the operating system of the world. Specific fields and particles (electrons, quarks, gluons, the Higgs boson…), and the Hamiltonians governing their interactions, are like various installed application programs. One could very easily imagine God installing different apps without tinkering in any way with the operating system—in fact, in some conceptions (string theory), that wouldn’t even require God, just a different set of random accidents shortly after the Big Bang.

The interesting part, then, is that the operating system only lets the apps use so much memory within a given region of space before it “crashes,” which it does by creating a black hole.
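Scott’s “only so much memory before it crashes” can be made quantitative: the relevant limit is the holographic bound, roughly A/(4·l_p²·ln 2) bits for a region whose boundary has area A. A back-of-the-envelope sketch, using standard values for the constants (order of magnitude only):

```python
import math

# Holographic bound: the number of bits storable in a region scales with the
# AREA of its boundary, about A / (4 * l_p^2 * ln 2), l_p the Planck length.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s

l_p = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m

def max_bits(radius_m):
    """Rough upper bound on bits storable inside a sphere of this radius."""
    area = 4 * math.pi * radius_m**2
    return area / (4 * l_p**2 * math.log(2))

# A sphere of radius 1 m "crashes" (forms a black hole) at roughly 1.7e70 bits.
print(f"{max_bits(1.0):.1e}")
```

The striking feature, as the post emphasizes, is that the capacity scales with the boundary’s area rather than the region’s volume.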

Comment #205 July 29th, 2017 at 7:55 am

sf #202: There is no need (or way) to put a “forcing” term into the field equations. Essentially all I am doing is using a different co-ordinate system with a different foliation given by the t = constant Cauchy surfaces. The geometric object (the Cauchy surface) is not a function of any information: it is a function of the co-ordinate system, which can be chosen without reference to any information.

Comment #206 July 29th, 2017 at 8:19 am

Scott #204: Scott, do you have a compelling reason for your claim that QM and GR together make up the operating system of the world? Why not say instead that QT (the quantum theory of abstract systems) alone is the operating system, and take GR as a shell? If we believe that there exists a quantum theory of gravity, everything must obey the general rules of QT, as quantum fields do. Even in empty space the ontology is the quantum field, the vacuum being just a special state of it.

What about recovering gravity as an emergent phenomenon? Even in the case of a field theory, it would still be a subroutine, as for the other fields…

Comment #207 July 29th, 2017 at 8:59 am

Mauro #206: I actually agree with you that QM is almost certainly more fundamental than GR. But GR, if less fundamental than QM, is also clearly more fundamental than electromagnetism, QCD, and so on, not only because gravity determines the causal structure of spacetime, but also (the main property used in this post) because it couples to anything whatsoever that carries energy. So if we don’t count GR as part of the OS of the world, then like you say, we should at least count it as the shell, or the windowing system, or something else below the application layer! Without knowing exactly how it gets unified with QM, it’s hard to be more specific about GR’s place in the abstraction stack. But in any case, since the question wasn’t relevant for my response to Peter Gerdes, I didn’t enter into it then.

Comment #208 July 30th, 2017 at 9:43 am

Tim #205, also Scott #186, Tim #30, Scott #39,

I will try (below) to answer your remarks by clarifying what I had said, but in the meantime it occurred to me — why not just apply the analytic continuation property of Einstein metrics to simplify your point? There’s clearly an infinity of scalar invariants coming from a power-series expansion of the curvature tensor of an Einstein metric in any small nbhd of a smooth point. This seems to provide infinite memory capacity in arbitrarily small nbhds. To avoid temporal thickening, use the Cauchy problem to eliminate time-dimension dependence and reduce to power series in space-like slices.

I suppose the problem is that this requires pretty delicate measurements (e.g. LIGO), which is why one needs a more uniform memory storage system; the QM constraint on measurement must rule out using high-order Taylor series coefficients. But why should using a larger domain help? The uncertainty principle must play some role here, but how?

The discussion makes me wonder if the QFT bounds on energy localization, as claimed here, would imply some kind of curvature bounds for the Einstein metric, which could be used to apply a metric compactness theorem implying the upper bound on storage capacity given here.

To answer your remarks:

The “forcing” term refers to perturbation of the stress–energy tensor (RHS) corresponding to perturbation of the “information” being stored; linearizing the Einstein Field Equation, the perturbation should show up as a forcing term. I’m using the term loosely and more generally anyway, to refer to the RHS.

Near a black-hole type singularity the metric oscillates wildly. Likewise perturbations of the stress–energy tensor will produce wild metric oscillations, which is why I think your light-cones might be drastically affected by perturbation of the “information” being stored.

Basically I guess you have in mind a best-case scenario, where the metric is reasonably tame, whereas I’m trying to see what could go wrong in the general or worst-case scenario.

You can use the time co-ordinate to slice things, but without geodesics that globally minimize length you don’t easily get a nice space-like co-ordinate system; of course you can restrict to small nbhds, but when you perturb the metric you need some uniform bounds to make your construction work, it would seem.

I’m afraid this is getting too technical, whereas Scott’s aim was to put things in a more down to earth way, sorry again.

Comment #209 July 30th, 2017 at 12:29 pm

sf #208: Nothing I did perturbs the stress-energy tensor or the metric, where here I refer to the four-dimensional objects. All I did was take a different Cauchy slice through the same 4-dimensional solution. The Riemannian curvature on the induced 3-metric of the slice changes, of course, which is sort of the point.

I don’t know why you say that near the singularity the (4-?) metric oscillates wildly. The scalar curvature grows without bound, but why the oscillations?

Comment #210 July 31st, 2017 at 3:51 pm

If the universe is deterministic (with a dose of “pure” randomness), isn’t the total amount of information constant and equal to what was present in the initial state of the big bang?

Comment #211 July 31st, 2017 at 6:38 pm

fred #210: We’re talking here about information, not from the standpoint of a godlike being who sees the whole universe as a giant pure state, but from the standpoint of beings within the universe who do things like build hard drives and measure how many bits they can store in them. So in particular, at the very least there’s all the information that arose since the Big Bang in the form of quantum randomness (or for an Everettian, the information that picks out which branch we’re in).

Comment #212 August 1st, 2017 at 7:35 am

Scott #211

What about the energy cost of “preparing” matter to act as a suitable memory? (e.g. the cost of building the hard drive)

And then the cost of accessing the bits with the unavoidable trade-off between memory size and transfer speed, e.g. modeled as a series of concentric spherical shells around the “being within the universe” (aka the central processing unit)?

Comment #213 August 1st, 2017 at 10:39 am

Scott #211: In general, isn’t that the only standpoint worth talking about? Are there cases where it makes sense to prefer the godlike being standpoint?

Comment #214 August 1st, 2017 at 10:55 am

Tim #209, also Scott #10, Scott #39, Tim #30,

Since the goal is to ‘store’ a lot of Shannon information, you have to allow for as many states as possible of the stress–energy tensor, or whatever its QFT stand-in/equivalent is. A random state is going, via the Einstein equations (EFE), to produce a random metric, which may be a better choice of terminology than ‘wild metric’.

The perturbations I referred to would arise when comparing different possible states, which goes back to the fact that Shannon information is being considered; decoding a message/state would essentially require considering comparison or perturbation of states.

Otherwise, my understanding of Scott’s post and other comments is changing every day, but for what it’s worth I’ll try to correct a bit my previous comments.

Since Shannon information is being considered, I had presumed that any pair of informational states (or messages) would be directly comparable. But the way I see things now, the goal of directly defining ‘comparability’ of regions, as in my 1st comment, was misguided; there is no way to put things in terms of ‘uniform containers’ with varying contents, so to speak.

Now, after reconsidering various other comments, I’m under the impression that the most one can possibly hope for is a compactness theorem for the “class of solutions of the Einstein equations, compatible with the information localization conditions coming from QFT, (as a condition on the stress–energy tensor) and with boundary surface having a fixed area”. So, in the ‘metric compactness theorem’ remark in my 2nd comment, the “curvature bounds for the Einstein metric” I mentioned there should actually be corrected to what’s in the first quote above (in this paragraph). This gives, at best, a pretty weak comparability property, via Heine-Borel, with a finite set of clusters, in each of which arbitrary pairs are geometrically similar or comparable.

But given this much freedom to alter the Cauchy surfaces, my remarks on potential technical difficulties with deforming such Cauchy surfaces become much less pertinent. Otoh it’s interesting to consider how the deformations Tim suggested would fit in with the proposed compactness property; hopefully it doesn’t need to explicitly consider equivalence under time evolution.

Comment #215 August 1st, 2017 at 4:36 pm

#210-211

But imagine if all that quantum randomness was really pseudo-random and the Kolmogorov complexity of the universe was a constant! (Is this program-size constant the “pure” randomness you were referring to, Fred?) (Scott, who needs (2D or 3D) locality to hold all of the time?) Of course, I could be wrong, but as far as elegant theories go, that would be pretty nice.

But even if this highly-speculative deterministic theory is correct, maybe the fact that we are within the system means that we should continue to develop practical theories such as QM, GR, etc. for understanding and (probabilistic) prediction… but maybe it would also mean Quantum Supremacy won’t be reachable?

I think it will be interesting if we are able to build AGIs that run on classical, deterministic machine-learning algorithms… Will scientists still be as quick to dismiss deterministic theories of physics?

Comment #216 August 2nd, 2017 at 1:13 am

Jon K. #215:

“but maybe it would also mean Quantum Supremacy won’t be reachable?”

If so, then are you willing to accept the converse: that if quantum supremacy is reached, possibly within the next couple years, then the speculation of a deterministic theory underlying QM will be placed under even more strain than it already is, and should perhaps be abandoned as scientifically fruitless even if metaphysically unfalsifiable?

“I think it will be interesting if we are able to build AGIs that run on classical, deterministic machine-learning algorithms… Will scientists still be as quick to dismiss deterministic theories of physics?”

Why should they? Even assuming quantum mechanics perfectly true, unless the brain is a quantum computer in an interesting sense (which very few people believe), why shouldn’t we expect machine-learning algorithms running on deterministic digital computers to be able, in principle, to achieve similar feats?

Having said that, the point is presumably moot, since once those AGIs are created they’ll take over the world and solve the remaining physics problems for us. 🙂

Comment #217 August 2nd, 2017 at 8:04 am

Jon K. #215

I’m not sure what the difference is between considering a deterministic universe vs one that has randomness in it (QM dice rolls)…

I would think that this could at least be analyzed for simpler models.

E.g. we consider a NxN instance of “the game of life”.

What would be the total amount of information in it (assuming asking this makes any sense)?

How does this relate with the initial conditions (i.e. which cells start on and off)?

Then, if we add “pure” randomness (e.g. some of the rules randomly choose between two alternatives, or some cells randomly flip with some probability), how does this modify the picture?
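fred’s toy model is easy to set up; a minimal sketch of one step of an N×N toroidal Game of Life, with an optional random-flip knob (`flip_prob` is my own name for it, nothing standard):

```python
import random

def life_step(grid, flip_prob=0.0, rng=None):
    """One Game-of-Life step on an N x N torus; afterwards each cell is
    flipped with probability flip_prob (the 'pure randomness' knob)."""
    n = len(grid)
    rng = rng or random.Random()
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            live = sum(grid[(i + di) % n][(j + dj) % n]
                       for di in (-1, 0, 1) for dj in (-1, 0, 1)
                       if (di, dj) != (0, 0))
            new[i][j] = 1 if live == 3 or (grid[i][j] and live == 2) else 0
            if rng.random() < flip_prob:
                new[i][j] ^= 1   # inject a fresh random bit
    return new

# With flip_prob = 0 the whole trajectory is fixed by the N*N bits of the
# initial state; with flip_prob > 0, fresh bits enter at every step, so the
# state is no longer determined by the initial condition alone.
blinker = [[0] * 5 for _ in range(5)]
for j in (1, 2, 3):
    blinker[2][j] = 1              # horizontal bar: a period-2 oscillator
assert life_step(life_step(blinker)) == blinker
```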

Comment #218 August 2nd, 2017 at 11:38 am

#216

Yes, if Quantum Supremacy is demonstrated, I will reduce my posts on hidden variable theories by at least 50% 😉 Too bad this agreement is like me being on the wrong side of a halting problem bet. (It may not be possible to ever rule out the feasibility of demonstrating Quantum Supremacy.)

As far as creating a classical algorithm AGI goes, if it’s achieved, I think it will sort of call into question beliefs that many people have about humans and computers… specifically, that humans have “free will” and machines are not capable of being “conscious”. Yes, it wouldn’t directly contradict QM, but if we start seeing ourselves as biological robots without free will, it seems like we may extend that deterministic view to other aspects of the universe… and to possibly the entire universe.

It’s sort of interesting to imagine a really intelligent AGI giving us more insight into physics… Would it just give us better models of correlations in our universe? Would it chunk its understanding differently? (e.g. maybe it has no need for the notion of a quark for modeling the universe) I’m hoping an AGI would balance statistical insight w/ logical inference for understanding. Maybe it would start looking to see if certain algorithms are responsible for creating pseudo-random phenomena previously only analyzed statistically… but I don’t think it would get to that kind of understanding only through gradient descent.

Comment #219 August 2nd, 2017 at 12:09 pm

#217

I don’t think you would get more out of a deterministic system by just allowing some randomness here and there.

Informal proof by contradiction: Suppose you could get more out of your computing system with some random bits or random rules. Then, after the fact, note the precise values of your randomly generated bits (or rules), hard code them into your computation, and run your simulation again. You should get the same answer. But then you just demonstrated that you could have achieved that same output without randomness in the first place. (I think Scott gave a talk about this where he talked about a toaster and the halting problem, if I remember correctly.)
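Jon K.’s informal argument (record the random bits, then hard-code them) can be sketched directly; `randomized_compute` below is a made-up stand-in for any randomized computation:

```python
import random

def randomized_compute(x, rng):
    # A stand-in for any randomized computation: it consumes 8 random bits
    # and mixes them into the output. (Made-up example, not a real algorithm.)
    bits = [rng.randrange(2) for _ in range(8)]
    return (x + sum(bits)) % 97, bits

# Run once with fresh randomness, recording the random bits actually used...
out1, tape = randomized_compute(42, random.Random())

# ...then "hard-code" that tape and replay: same output, no randomness left.
class Replay:
    def __init__(self, tape):
        self.tape = list(tape)
    def randrange(self, n):
        return self.tape.pop(0)

out2, _ = randomized_compute(42, Replay(tape))
assert out1 == out2   # the recorded bits reproduce the randomized run exactly
```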

So why does it matter? I’m just interested in whether our universe is deterministic or not. It may not affect what could be achieved in the universe either way, but it may affect how we see ourselves as humans and how we interact with each other. For instance, if we took the view that “there is a (historical & complex) reason why somebody does something or acts the way they do”, I think we might be more empathetic towards each other. I’m not saying empathy solves problems, but if more people in the population had this point of view, it may allow us to better reconcile our differences… like, we are all in this together. (e.g. If I had been born into the Drumpf family, and I had all this psychological baggage, I’d probably be acting the same way… or if I didn’t grow up in a good family environment in a nice neighborhood, I might be open to joining a gang, etc.)

Comment #220 August 3rd, 2017 at 11:22 am

Information cannot be physical because physics treats each object without distinction, while the bits needed for factoring are distinctly more complicated than the bits of information needed for sorting.

Comment #221 August 3rd, 2017 at 2:19 pm

Jon K.

#216

Not sure it’s that clear cut.

If you bake in your random sequence, isn’t that only valid for the same input you’ve been using? Otherwise for a fixed sequence of random bits, I could feed your algo special inputs that totally destroy its performance (like for quick sort).

Also, isn’t it an open question whether randomized algorithms are more powerful than deterministic ones?
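fred’s quicksort point can be sketched under simplifying assumptions: below is a toy quicksort in which the “baked-in” random choices are stood in for by a fixed pivot rule. On an adversarial (already sorted) input, the fixed rule hits linear recursion depth (quadratic time), while fresh random pivots stay logarithmic:

```python
import random

def quicksort(a, choose_pivot, depth=0):
    """Sort a copy of `a`; track the deepest recursion reached as a proxy
    for worst-case behavior."""
    quicksort.max_depth = max(getattr(quicksort, "max_depth", 0), depth)
    if len(a) <= 1:
        return list(a)
    p = a[choose_pivot(len(a))]
    left = [x for x in a if x < p]
    mid = [x for x in a if x == p]
    right = [x for x in a if x > p]
    return (quicksort(left, choose_pivot, depth + 1) + mid +
            quicksort(right, choose_pivot, depth + 1))

n = 500
adversarial = list(range(n))   # already sorted: bad for a fixed pivot rule

quicksort.max_depth = 0
assert quicksort(adversarial, lambda m: 0) == adversarial
fixed_depth = quicksort.max_depth      # n - 1: linear depth, quadratic time

quicksort.max_depth = 0
rng = random.Random(0)
assert quicksort(adversarial, lambda m: rng.randrange(m)) == adversarial
random_depth = quicksort.max_depth     # O(log n) with high probability
```

This only illustrates the fixed-vs-fresh-randomness gap for one algorithm; whether randomness helps fundamentally (the P vs. BPP question) is exactly the open problem mentioned above.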

#217

I totally agree with your view on the value of determinism – once you recognize free will as an illusion, it’s just way easier to shut down hating and judging, and to see that compassion towards everyone is the only rational answer.

It’s surprising that so many scientists who argue about free will have never tried to seriously observe how their own minds work… a lot of people would benefit greatly from going on a meditation retreat.

Comment #222 August 4th, 2017 at 11:14 am

This is great, Scott! I think you’re saying both that nature is entirely informational, and that this fact is observable.

I always interpreted Landauer’s “Information is Physical” as a call to computer scientists to put more physics into their foundations; particularly the cost of irreversible operations and the opportunities to do better (which today include quantum possibilities).

I found Sean #1 and the discussion around it interesting. I feel there is much more than an analogy between energy and information: energy defines the dynamics (the computation) and the energy scale governs the maximum rate of distinct state change (operations per second). It seems to me this makes energy the fundamental informational quantity in physics.

Even classically, energy relates equivalent computations. For example, if you have two very different systems with analogous dynamics—say an electrical circuit and a mechanical system—very different variables are mapped onto each other, but energy always maps onto energy, since it determines the dynamics.

In quantum mechanics, the finite-state character of average energy becomes evident. In a finite isolated system, average energy (above the ground state) can be identified with the maximum number of distinguishable states that can occur per unit of time. In a frame where the system has overall motion we see more energy and more distinct states, both due to the motion. The portion of the total relativistic energy that governs overall motion identifies a portion of the maximum number of distinct states (per unit time) distinguishable due to the motion. This in turn identifies the magnitude of average momentum as a count of possible states distinct due to overall motion, per unit distance. Similarly, other forms of energy are logical quantities that identify other maximum counts of distinct states.
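The “maximum rate of distinct state change” described here has a precise counterpart in the Margolus–Levitin theorem: a system with average energy E above its ground state can pass through orthogonal states at a rate of at most 2E/(πħ). A quick numerical sketch (my illustration, not necessarily what Norm has in mind):

```python
import math

# Margolus-Levitin theorem: a system with average energy E above its ground
# state passes through at most 2E / (pi * hbar) orthogonal states per second.
hbar = 1.055e-34   # reduced Planck constant, J s

def max_ops_per_second(energy_joules):
    return 2 * energy_joules / (math.pi * hbar)

# One kilogram of mass-energy, E = m c^2 (Lloyd's "ultimate laptop" scale):
E = 1.0 * (2.998e8)**2
print(f"{max_ops_per_second(E):.1e}")   # roughly 5.4e50 "ops" per second
```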

I guess the summary is that, if nature is informational, then the natural quantities to use for analyzing nature are informational. We discovered entropy empirically, before realizing it was informational. The same is true of energetic quantities.

Comment #223 August 4th, 2017 at 12:46 pm

Hi Fred,

I feel like we may be talking about different things on this topic of whether injecting randomness into programs changes things.

Avi Wigderson gave a good talk about leveraging “randomness”–or maybe pseudorandomness/complexity is sufficient–for more efficient algorithms, for achieving some optimization, for game theory strategy, etc. But in the end, the result he talks about (not that I fully understand it, but intuitively it seems to make a lot of sense) is that “every efficient random algorithm has a deterministic counterpart” (assuming P doesn’t equal NP). This wasn’t exactly what I was thinking about when I wrote my last response (although I think there is a relationship), but maybe this relates to what you were saying?

https://www.youtube.com/watch?v=ZzsFb-6wvoE

I made a movie called “Digital Physics” which tries to explore some of these ideas around randomness vs. complexity. Maybe you’d like to check it out. If you leave a review of the movie (good or bad) I’ll send you a free pack of trading cards (with gum!). Scott, the same offer goes for you too 🙂

Comment #224 August 4th, 2017 at 2:46 pm

Tim #166: Sorry for the delayed response. Let me go line-by-line.

> I do disagree, and the same counterexample proves that not even a lower bound can be derived.

How so? In the example of a zero-momentum particle, the spatial variation is zero, so the claimed lower bound is zero, which fits with what you said.

> Daniel pulled out the Klein-Gordon equation, and if you look you will notice that the equation contains an “m”. That is the rest mass of the particle. And that, of course, lower bounds the energy of the state by E = mC^2.

You will also notice it contains a k (or a grad phi), which measures the degree of spatial variation, and which also lower bounds the energy of the state. This is the significant term for our purposes.

Comment #225 August 6th, 2017 at 6:19 am

Norm #222: Glad you liked the post!

But I think the danger is, it’s easy to round something like “nature is informational” down to a statement that lacks empirical content, and that’s true not just of our world but of any possible one. Thus, someone could say: really understanding any physical concept (mass, energy, charge…) means knowing what it corresponds to in terms of the basic data structures that comprise the state of the universe, so of course they’re going to look “informational” once you actually understand them.

In the post, I tried to give a meaning for a converse claim—“information is physical”—that depends on special features of our world and is thus hopefully immune to that criticism.

Comment #226 August 9th, 2017 at 12:49 pm

Two fish meet in the ocean, one Wittgenstein-fish, the other a Godel-fish.

Says Godel to Wittgenstein: How’s the water?

Replies Wittgenstein: What’s water?

I.e., the real question is: Can symbols encapsulate all information? Or is there information that cannot be represented in symbols? That, of course, is equivalent to Russell’s set that contains all the sets that do not contain themselves: it merely shows the limitation of what symbols (aka language) can do.

Saying it differently:

The one assumption Godelian scientists make is that the universe’s attributes can be reflected in symbols, i.e.: in inkblots, screen-blips, and air-noises. Perhaps not all the universe’s attributes, but a great many of them.

But the Wittgensteinians say: No, we can only see the aspects of the universe that “language” can pick up. And “language” is merely the structure of our own brain. Therefore, we can only see the part of the universe’s structure that fit the structure of our brain.

But now comes a third fish: call him Wigner. He sings (he is a singing fish) Hallelujah to the marvelous predictive power of mathematics, which enables fish to forecast all water currents and colors and temperatures. Isn’t it great that math works?

So now comes a fourth fish, a sort of flying fish named Archie Wheeler (who once had been outside the water), and says yeah, sure, I once met a fruit fly who marveled that the universe is built of pixels… All we do in science, says Archie, we do in “language,” which is inkblots and air-noises and screen-blips, based on signals of our six or seven meat-based sensors, as processed by three pounds of meat-computer. To assume that the structure of these is the same as the structure of the universe may be nothing more than a tautology. Or, at best, it may be projections of the universe upon our meat sensors, same as the sphere projected itself upon the plane in Flatland…

Says the Godel fish: But just because we cannot find the flaws in language using language, does it mean we found all that language (and math, and science) can do?

So finally comes the Leviathan (nicknamed Hilbert), who booms: No. We can do more, and we will.

But just how much? Just how much “real information” (whatever that is) can be encapsulated in symbols, and how much information never can be? And is there any use of us even talking about it?

Well, says a crab called Kantor: you never know until you try.

Comment #227 August 9th, 2017 at 10:25 pm

I have an argument, which I wonder if it is correct:

1) Divide the entire universe into a region A and its complementary region A’. An observer is located in region A’.

2) The observer attempts to erase information in A’.

3) Assuming that the total information content in the universe is constant, erasing information from A’ simply means dumping the information into region A.

4) According to Landauer’s principle, erasing information creates heat (unusable energy). So, heat is being created in A as the observer dumps information into A.

5) Assume that the heat stays inside region A.

6) According to GR, there is a limit to the amount of energy that can be stored in a region before it turns into a black hole.

7) For the region A’, total information is the sum of information in space (I’) and the information on its boundaries (I’_b). When the black hole is created, the total information content of the universe, according to the observer, is I’ – delta I + I’_b, where delta I is the information dumped into A.

8) To any observer in the universe, the total information is always constant: I’ – delta I + I’_b = I’.

This implies I’_b = delta I.

In short, information erased from A’ is stored on the boundary of region A, according to the observer.
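Step (4) of the argument invokes Landauer’s principle; for a sense of scale, a minimal sketch of the lower bound it sets (k_B·T·ln 2 of heat per erased bit), using standard constants:

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.381e-23    # Boltzmann constant, J/K

def landauer_heat(bits, temperature_K=300.0):
    """Minimum heat (joules) released by erasing `bits` bits at temperature T."""
    return bits * k_B * temperature_K * math.log(2)

# Erasing a terabyte (8e12 bits) at room temperature:
print(f"{landauer_heat(8e12):.1e} J")   # roughly 2.3e-8 J: tiny but nonzero
```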

Comment #228 August 9th, 2017 at 11:24 pm

I would like to add a few things in (7) and (8) to make things clearer:

7) For the region A’, total information is the sum of information in space (I’) and the information on its total boundaries (I’_b).

When the black hole is created in region A due to the heat energy, the total information content of the universe, according to the observer, is now I’ + I’_b_new – delta I’, where delta I’ is the information dumped into A.

I’_b_new is the total information on the new set of boundaries (which includes the newly formed BH boundary).

8) To any observer in the universe, the total information is always constant: I’ + I’_b_new – delta I’ = I’ + I’_b.

This implies I’_b_new – I’_b = delta I’. But I’_b_new – I’_b is just the information on the black hole boundary.

Comment #229 August 10th, 2017 at 11:50 am

My comment #2

Scott #225 How much of the informational character of physics is special to our world? An obvious way to answer this is to start with the question of how much isn’t. The most direct generalization of the classical analysis of finite-state computation involves counting distinct states, and this makes quantities such as entropy and energy generic. But once we talk about their functional dependence on physical degrees of freedom, they become very specific to our physics. Electromagnetic energy. Black hole entropy. Moreover, informational reinterpretations suggest new physics, so this approach has plenty of empirical content.

Of course some aspects of the informational character of macroscopic physics can be inferred indirectly. The extensive nature of entropy implies the existence of at least nearly-identical particles. The finiteness of heat capacities implies material systems are not composed of infinitely divisible stuff. But it is quantum mechanics that makes the informational character of physics directly observable.

I find the idea of an energy eigenstate particularly remarkable, informationally. An atom has an enormous number of possible internal configurations, especially if we include all of the degrees of freedom of its components down to some ultimate scale. And yet we can prepare an atom in an energy eigenstate, where all of the internal detail is absent. We can even use such atoms as qubits, so bereft of other information are they. Nothing like this is possible classically.

So it seems to me that the physical realizability of energy eigenstates makes physical systems with small amounts of information directly accessible to observation and experimentation, and might someday make the exact informational dynamics of our world observable.
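The informational emptiness of an energy eigenstate can be made quantitative: a system prepared in a pure eigenstate has zero von Neumann entropy, while a thermal mixture over the same levels does not. A minimal numerical sketch, using a two-level atom with made-up populations:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Pure energy eigenstate |0> of a two-level atom: all internal detail absent.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Thermal mixture of the two levels (illustrative populations).
thermal = np.array([[0.7, 0.0],
                    [0.0, 0.3]])

s_pure = von_neumann_entropy(pure)        # zero bits: an eigenstate stores nothing
s_thermal = von_neumann_entropy(thermal)  # about 0.88 bits
```

This is exactly what lets such atoms serve as qubits: the eigenstate carries no leftover information about the internal degrees of freedom.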

Comment #230 August 12th, 2017 at 3:45 pm

1) The log of the wave function psi gives a complex expression with the real part proportional to information, and the imaginary part proportional to action.

ln(psi) = info + i·action

This is the mathematical sense of the reality of information.

2) When this substitution is made into the Schrödinger equation, the real and imaginary parts yield two equations: one is a Hamilton-Jacobi equation, and the other expresses the conservation of information with respect to time.

Conservation is the physical sense of reality.
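For the record, the substitution described in (1)–(2) is essentially the standard Madelung/Bohm decomposition. A sketch of the calculation, writing psi = e^{R + iS/ħ} so that the real part R plays the role of the “information” and S the action:

```latex
% Write \psi = e^{R + iS/\hbar} with R, S real, so \rho = |\psi|^2 = e^{2R}.
% Substituting into  i\hbar\,\partial_t\psi = -\tfrac{\hbar^2}{2m}\nabla^2\psi + V\psi
% and separating real and imaginary parts gives:
\begin{align}
  % real part: a Hamilton--Jacobi equation with a quantum correction term
  \partial_t S + \frac{(\nabla S)^2}{2m} + V
    - \frac{\hbar^2}{2m}\bigl(\nabla^2 R + (\nabla R)^2\bigr) &= 0, \\
  % imaginary part: continuity, i.e. conservation of \rho = e^{2R} in time
  \partial_t \rho + \nabla\cdot\Bigl(\rho\,\frac{\nabla S}{m}\Bigr) &= 0.
\end{align}
```

The second equation is the “conservation of information” the comment refers to, in the sense that the density built from the real part R is transported without loss.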

3) The “hard” problem of psychology, namely what conscious sensation is made of, has the same answer: information.

This is the psychological sense of reality.

It seems there is nothing more real than information!

Comment #231 August 17th, 2017 at 6:53 pm

1) When viewing physics through the lens of “information”, how much is known about the rules of “computation”? Do researchers even use this analogy/domain-transformation instead of thinking in terms of standard events?

2) Is there a physical limit on how much information may participate in a single computation?

3) Is it even possible for simple events involving only a few bits of information to occur in our universe? If you can name one, wouldn’t a skeptic say that actually every single particle in the universe that has warped the gravitational field has participated in some way, and contributed at least some fractional bits?

4) This is just a comment, but if there were a limit on the number of bits that could participate in a computation, I think that could be related to the phenomenon of us having a certain number of space-time dimensions. Like binary functions could exist in a 1-D line world, ternary functions in a 2-D plane, etc.