The answer to your quest for a physical law that will explain those observations about computational models is the law of least action,

where action = computational complexity in physics.

It is clear why it should be mathematical time complexity, as it counts “actions” of the computing machine and not actual time!

The fact that the action is not always minimal can explain the difficulty of solving NP-complete problems!

http://www.eftaylor.com/pub/Gray&TaylorAJP.pdf

“In comparing Heartbleed to hypothetical bugs in a quantum computer, I feel like you’re conflating two extremely different issues. As a general rule, it’s about a trillion times easier to write a correct program—whether classical or quantum!—when the program’s only purpose is to solve some fixed, well-defined math problem, rather than interfacing with a wide array of human users in a way that’s secure against any of those users who might be malicious and exploit vulnerabilities in the code!”

Actually, this conflation was made on purpose. As Poincaré used to say, mathematics is the art of giving different things the same name. It seemed indeed interesting to me to compare an attempt to solve a fixed, well-defined math problem by quantum means, with an attempt to solve some real-world security issue by classical means. In both cases, elements of uncertainty are at work, so that a plethora of malicious users could play the role of the fundamental indeterminacy of quantum mechanics. In such a view, the difficulties encountered in building a quantum computer appear as a consequence of some general principle one might call “the uncertainty of the fight against uncertainty”. It bears some resemblance to the (possible) unprovability of the unprovability of P≠NP.

In any case, assuming I am wrong, perhaps there is room for some careful measurement of this phenomenon with very small matter clusters, gradually increasing their size (and maybe varying their rigidity) to see where the finickiness of the initial conditions sets in. It’s relatively easy (there’s a wide range of initial conditions) to get two fundamental particles to reverse a decay process or a collision that led to fragmentation. It’s almost impossible to do it with a storm window. Somewhere in between is the knee of the finickiness curve.

Keep in mind, however, that you would need to perfectly reverse not merely the motions of the glass molecules, but also the motions of all the photons that escaped, all the motions that those photons caused after getting absorbed by electrons somewhere else, etc. If you time-reverse *everything* (well, and also reverse left and right in weak decays, and reverse matter and antimatter, in the unlikely event that either of those is relevant 🙂 ), then you’ll certainly obtain another valid solution of the equations of physics. It will just be one with an extremely bizarre initial condition, one that has the bizarre property of causing a pristine glass to form out of fragments.

Obviously, there must be some threshold size of object where this phenomenon would first set in, because single-particle processes are observed to be reversible. I don’t know if it’s the atomic, the molecular, the ten-molecule, etc. level where this happens. But it might be more instructive to focus on this sort of specific material system rather than generalized abstract systems to get some intuition about the mechanisms involved in irreversibility.

*That’s true, but presents a false dichotomy. One strategy for writing secure code is to formalize the requirements. Ideally in a DSL that generates the actual code. That doesn’t address most side channel attacks (timing, compression,…) but it does address many other attacks, such as Heartbleed and the recent validation problems. cf langsec. (All of which is completely orthogonal to your point.)*

As far as I know, the wave function in QM ranges over (−∞, ∞); only the probability is always positive or zero

(the absolute value of the wave function, squared).

So I don’t see any theoretical reason why the first moment of the wave function’s distribution should always be absolutely integrable, nor the second moment.

Anyway, it is possible for a distribution supported on positive values only to have a first moment but no second moment

(meaning the variance is infinite).

For example, take X = 1/sqrt(U), where U is uniform on (0, 1).

It has density f(x) = 2*x^(−3) for x ≥ 1,

so Var(X) = infinity, but E(X) = 2.
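As a quick sanity check (my own sketch, not part of the original comment), one can simulate this X = 1/sqrt(U) distribution and watch the sample mean settle near E(X) = 2 while the sample second moment keeps drifting upward instead of converging:

```python
import math
import random

def sample_x(rng):
    # X = 1/sqrt(U) with U uniform on (0, 1]; density f(x) = 2*x**-3 for x >= 1.
    # Using 1 - random() keeps U strictly positive, avoiding division by zero.
    return 1.0 / math.sqrt(1.0 - rng.random())

rng = random.Random(0)
n = 200_000
xs = [sample_x(rng) for _ in range(n)]

mean = sum(xs) / n                           # hovers near E[X] = 2
second_moment = sum(x * x for x in xs) / n   # E[X^2] is infinite: this grows with n

print(mean, second_moment)
```

The sample variance (second_moment − mean²) is dominated by the few largest draws, which is exactly the infinite-variance behavior in question.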

But in QM the probability distribution comes from the square of the modulus of the amplitude. So the distribution is always positive, no? I don’t think it’s possible to get the moments you suggest with strictly positive values, no? (I could be wrong, I’m no expert)

I know that in physics we normalize the wave function; otherwise it would not be a legal distribution function.

I am talking about a different problem:

the strong law of large numbers (which assures us that the average converges to the expectation as n → ∞) does not hold for all distributions (it fails when the first moment does not converge absolutely), and some distributions have a first moment but infinite variance (the second moment does not converge).

My question is whether a wave function can behave like this.

If it is physically possible, what does it mean for the uncertainty principle?

If it is not physically possible, has anyone taken this into account, so that we can add some more constraints on what counts as a legal wave function?

(I know the Cauchy and Lévy distributions are used in the physics world; I have no idea whether they have any connection to the wave function. I have numerous other distribution examples.)
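To make the failure mode concrete (again my own hypothetical sketch, not from the original comment): for the standard Cauchy distribution the first moment does not exist, so the strong law of large numbers gives no guarantee, and the sample mean never settles down, even though the sample median does:

```python
import math
import random
import statistics

def cauchy(rng):
    # Standard Cauchy via the inverse CDF: no mean, no variance.
    return math.tan(math.pi * (rng.random() - 0.5))

rng = random.Random(1)
samples = [cauchy(rng) for _ in range(100_000)]

# The sample median converges (the distribution is symmetric about 0)...
med = statistics.median(samples)

# ...but averages never stabilize: the mean of any number of i.i.d. standard
# Cauchy variables is itself standard Cauchy, so batch means keep jumping around.
batch = 10_000
batch_means = [sum(samples[i:i + batch]) / batch
               for i in range(0, len(samples), batch)]

print(med, batch_means)
```

This is the sharpest version of the question above: a |ψ|² that fell off like a Cauchy tail would have well-defined probabilities but no expectation value at all.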