Now if you look at a generic material, you can use “naturalness” to argue that light fermions can describe quite generic materials (metals!), since they don’t require fine-tuning, but light bosons should only appear if we fine-tune microscopic constants (at phase transitions, where you dialed at least one parameter to a special value). And this beautifully agrees with experiment. Making this more quantitative, you arrive at Fermi liquid theory and Landau’s theory of phase transitions, the cornerstones of modern condensed matter physics. So at least in one area of physics (an area much larger than all of particle physics, for sure) “naturalness” is stunningly successful.

Now as far as particle physics is concerned, we have the problem that we only have one world, so we cannot in fact dial the microscopic parameters. But it was certainly not unreasonable to imagine that naturalness might apply in this case too. It’s not an open-and-shut case, and it never was. But it’s a working hypothesis whose experimental consequences should be worked out and tested. That program now looks pretty much complete and gave an answer (modulo loopholes). But that doesn’t make naturalness retroactively a bad idea, and it doesn’t take away any of its stunning successes in other areas of physics.

The basic rough story is this. We measure the Higgs mass. We can assume that the Standard Model is good up to some energy near the Planck energy, after which it fizzles out for some unspecified reason.

According to the Standard Model, each of the 25 fundamental constants appearing in it is a “running coupling constant”. That is, it’s not really a constant, but a function of energy: roughly the energy of the process we use to measure that constant. Let’s call these “coupling constants measured at energy E”. Each of these 25 functions is determined by the values of all 25 functions at any fixed energy E – e.g. energy zero, or the Planck energy. This is called the “renormalization group flow”.
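As a toy illustration of what “running” means – my own sketch, with a made-up beta function and coefficient, not anything from the actual Standard Model:

```python
import math

# Toy one-loop running of a dimensionless coupling g, with beta
# function dg/d(ln E) = b * g**3 (a QED-like form; the coefficient
# b is invented purely for illustration).
def run_coupling(g0, E0, E, b=0.01, steps=10000):
    """Integrate dg/d(ln E) = b*g^3 from energy E0 up to E with Euler steps."""
    t0, t1 = math.log(E0), math.log(E)
    dt = (t1 - t0) / steps
    g = g0
    for _ in range(steps):
        g += b * g**3 * dt
    return g

# "Measure" g = 0.30 at E0 = 100 GeV; the flow then fixes its value
# at every other energy, e.g. near the Planck energy:
print(run_coupling(0.30, 100.0, 1e19))
```

The point is just that fixing the coupling at one energy determines it at all others – that determination is the renormalization group flow.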

So, the Higgs mass we measure is actually the Higgs mass at some energy E quite low compared to the Planck energy.

And, it turns out that to get this measured value of the Higgs mass, the values of some fundamental constants measured at energies near the Planck mass need to almost cancel out. More precisely, some complicated function of them needs to almost but not quite obey some equation.

People summarize the story this way: to get the observed Higgs mass we need to “fine-tune” the fundamental constants’ values as measured near the Planck energy, if we assume the Standard Model is valid up to energies near the Planck energy.

A lot of particle physicists accept this reasoning and additionally assume that fine-tuning the values of fundamental constants as measured near the Planck energy is “bad”. They conclude that it would be “bad” for the Standard Model to be valid up to the Planck energy.

(In the previous paragraph you can replace “bad” with some other word – for example, “implausible”.)

Indeed you can use a refined version of the argument I’m sketching here to say “either the fundamental constants measured at energy E need to obey an identity up to precision ε or the Standard Model must break down before we reach energy E”, where ε gets smaller as E gets bigger.

Then, in theory, you can pick an ε and say “an ε smaller than that would make me very nervous.” Then you can conclude that “if the Standard Model is valid up to energy E, that will make me very nervous”.

(But I honestly don’t know anyone who has approximately computed ε as a function of E. Often people seem content to hand-wave.)
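For what it’s worth, here is the usual back-of-the-envelope scaling – my own sketch, not a published calculation: if quadratic sensitivity to the cutoff drives the near-cancellation, then ε(E) ~ (m_H/E)², with loop factors again dropped.

```python
# Back-of-the-envelope toy estimate of epsilon as a function of E
# (a sketch under the quadratic-sensitivity assumption, loop
# factors dropped; not a rigorous computation).
M_HIGGS = 125.0  # GeV

def epsilon(E):
    """Rough precision of cancellation needed if the SM holds up to E (GeV)."""
    return (M_HIGGS / E) ** 2

for E in [1e3, 1e6, 1e10, 1.2e19]:
    print(f"E = {E:.1e} GeV  ->  epsilon ~ {epsilon(E):.1e}")
```

This at least makes the qualitative statement quantitative: ε shrinks like 1/E², so pushing the validity of the Standard Model to higher E demands an ever more delicate cancellation.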

People like to argue about how small an ε should make us nervous, or even whether *any* value of ε should make us nervous.

But another assumption behind this whole line of reasoning is that the values of fundamental constants as measured at some energy near the Planck energy are “more important” than their values as measured near energy zero, so we should take near-cancellations of these high-energy values seriously – more seriously, I suppose, than near-cancellations at low energies.

Most particle physicists will defend this idea quite passionately. The philosophy seems to be that God designed high-energy physics and left his grad students to work out its consequences at low energies – so if you want to understand physics, you need to focus on high energies.

It merely says that, when we find basic parameters of nature to be precariously balanced against each other, to one part in 10^10 or whatever, there’s almost certainly some explanation.

Do you know examples of this sort of situation in particle physics, or is this just a hypothetical situation?

The FCC will start with what the ILC aims to do: e+e- collisions around the Higgs energy.

“The FCC examines scenarios for three different types of particle collisions: hadron (proton-proton and heavy ion) collisions, like in the LHC; electron-positron collisions, as in the former LEP; and proton-electron collisions.”

https://home.cern/science/accelerators/future-circular-collider

As happened with LEP, which explored energies up to 210 GeV, and whose tunnel was then reused for the LHC. Conservation of tunnels :).

Your argument against Dr. Hossenfelder’s position reminds me of a story:

Elementary school.

Teacher: “Bobby, if you have 6 apples and divide them up with your brother, how many do you have then?”

Bobby: “One.”

Teacher: “Looks to me like you don’t know mathematics.”

Bobby: “Looks to me like you don’t know my brother.”

……………..
