> but work, time and space are continuous

I'm under the impression that all our theories of time and space (and thus work) break down at the scale of 1 Planck unit and smaller. Which isn't proof that they aren't continuous, but I don't see how you could assert that they are either.

Matter and energy are discrete. Whether time and space are continuous or discrete is unknown. There are arguments for both cases, but nobody really knows for sure.

It's fairly easy to go from the integers to many subsets of the reals (the rationals are straightforward, constructible numbers not too hard, algebraic numbers more of a challenge), but the idea that the reals are, well, real depends on a continuity of spacetime that we can't prove exists.

Energy is continuous, not discrete.

Because energy is action per time, it inherits the continuity of time. Action is also continuous, though its nature is much less well understood. (Many people confuse action with angular momentum and speak of a "quantum of action". There is no such thing as a quantum of action, because action is a quantity that increases monotonically in time for any physical system, so it cannot have constant values, much less quantized values. Angular momentum, which is the ratio of action to phase in a rotational motion, frequently is both constant and quantized. In more than 99% of the cases where people write Planck's constant they mean an angular momentum, but in a few cases they mean an action, typically in relation to magnetic fluxes, e.g. in the formula for the magnetic flux quantum.)
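For a concrete sense of the dimensional claims here, a minimal sketch using scipy.constants (the chosen frequency is arbitrary; the code checks units and magnitudes only, not the action-versus-angular-momentum interpretation above):

```python
from math import pi
from scipy.constants import hbar, eV

# Planck's constant hbar has units of J*s: dimensionally an action
# (energy * time) and equally an angular momentum (kg*m^2/s).
# "Energy is action per time" then matches E = hbar * omega: a wave's
# phase advances at omega rad/s, and hbar converts phase rate to energy.
omega = 2 * pi * 5e14          # angular frequency of ~green light, rad/s
E = hbar * omega               # corresponding photon energy, in joules
print(f"E = {E / eV:.2f} eV")  # ~2.07 eV, in the visible range
```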

Perhaps when you said that energy is discrete you were thinking of light being discrete, but light is not energy. Energy is a property of light, as are momentum, frequency, wavenumber and others.

Moreover, the nature of the photon is still debated. Some people are not yet convinced that light travels in discrete packets, as opposed to the alternative in which only the exchange of energy and momentum between light and electrons (or other leptons and quarks) is quantized.

There are certain stationary systems, like isolated atoms or molecules, which may have a discrete set of states, each with a definite energy.

Unlike a genuinely discrete quantity such as the electric charge, these sets can contain arbitrary energy values, and there are no rational relationships between the energy values belonging to different systems. Moreover, all such systems have not only discrete energy levels but also continuous intervals of possible energies, usually toward higher energies, e.g. corresponding to high temperatures or to the ionization of atoms or molecules.
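Hydrogen is the standard example of both features (a minimal sketch; the Bohr formula below is the textbook non-relativistic result):

```python
from scipy.constants import Rydberg, h, c, eV

# Bound states of hydrogen form a discrete ladder: E_n = -Ry / n^2.
Ry = Rydberg * h * c / eV          # Rydberg energy, ~13.6 eV
for n in range(1, 5):
    print(f"E_{n} = {-Ry / n**2:8.3f} eV")

# Above E = 0 (ionization) the electron is unbound and any energy is
# allowed: the same system has a discrete spectrum *and* a continuum.
```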

[deleted]

The Planck units are bogus units that do not have any significance.

Perhaps our theories of time and space do break down at some extremely small scale, but for now there is no evidence of this, and nobody has any idea what that scale may be.

In the 19th century, both George Johnstone Stoney and Max Planck made the same mistake. Each of them computed certain universal constants for the first time: Stoney computed the elementary electric charge in 1874, and Planck computed, in several variants in 1899, 1900 and 1901, the two constants that are now named "Boltzmann's constant" and "Planck's constant". (Ludwig Boltzmann had predicted the existence of the constant that bears his name, but he never used it for anything and did not compute its value.)

Both of them realized that new universal constants allow the use of additional natural units in a system of fundamental units of measurement, and each attempted to exploit his findings for this purpose.

However, both bet on the wrong horse. Before them, James Clerk Maxwell had proposed two alternatives for choosing a good unit of mass. The first was to choose the mass of some molecule as the unit of mass. The second was to give an exact value to the Newtonian constant of gravitation. The first proposal was good: when analyzed during the 2018 revision of the SI, it was only very slightly worse than the final choice (which preferred to use two properties of photons, instead of one property of photons plus an arbitrarily chosen molecule).

The second proposal was extremely bad, though to be fair it was difficult for Maxwell to predict that during the next century the precision of measuring many quantities would increase by many orders of magnitude, while the precision of measuring the Newtonian constant of gravitation would barely improve in comparison.

Both Stoney and Planck chose to base their proposed systems of fundamental units on the second Maxwell variant, and this mistake made their systems completely impractical. The value of Newton's constant has a huge uncertainty compared with the other universal constants. Declaring its value exact does not make that uncertainty disappear; it merely moves the uncertainty into the values of almost all other physical quantities.

The consequence is that in the systems of fundamental units of George Johnstone Stoney or of Max Planck, almost no absolute value of any quantity can be known accurately. Only the ratios between two quantities of the same kind, and velocities, can be known accurately.
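A rough numeric illustration of this point (a sketch only; the relative uncertainty of G is hardcoded from the CODATA value of about 2.2e-5, while h and c are exact by definition in the 2019 SI):

```python
from math import sqrt
from scipy.constants import G, hbar, c

u_G = 2.2e-5   # relative standard uncertainty of G (CODATA), an input here

# Planck length: l_P = sqrt(hbar * G / c^3). Since l_P scales as G^(1/2),
# its relative uncertainty is half that of G.
l_P = sqrt(hbar * G / c**3)
print(f"l_P = {l_P:.4e} m  (relative uncertainty ~ {u_G / 2:.1e})")

# A unit system that declares G exact pushes this ~1e-5 level of
# uncertainty into the expressed values of nearly every other quantity.
```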

Thus the Max Planck system of units is a historical curiosity that is irrelevant in practice. The right way to use Planck's constant in a system of units became possible only 60 years later, when the Josephson effect was predicted in 1962, and the SI was modified to use it only after another 60 years, in 2019.

The choice of which units of measurement are fundamental has no bearing on the validity of physical laws at different scales. Even if the Planck units were practical, that would give no information about the structure of space and time. The definition of the Planck units is based on continuous models of time, space and forces.

Every now and then, texts in the popular literature mention the Planck units as if they had some special meaning. All such texts are based on hearsay, repeating claims from sources who have no idea how the Planck units were defined in 1899, or how systems of fundamental units of measurement are defined and what they mean. Apparently the only reason the Planck units have been picked for this purpose is that in this system the unit of length happens to be much smaller than an atom or its nucleus, so people imagine that if the current model of space breaks down at some scale, that scale might be this small.

The Planck length is at least around the right order of magnitude for things to get weird. If the position uncertainty of something is less than ~ a Planck length, and its expected momentum is zero, then by the Heisenberg position-momentum uncertainty relation the expectation of the square of the momentum is big enough that the (relativistic) kinetic energy is big enough that the Schwarzschild radius is also around the Planck length, iirc?
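The numbers do line up, roughly (a minimal sketch of the heuristic above; treating the uncertainty-derived energy as a Schwarzschild mass is exactly the hand-wave being described, not established physics):

```python
from math import sqrt
from scipy.constants import hbar, c, G

l_P = sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m

# Heisenberg: confining something to dx ~ l_P gives dp ~ hbar / dx,
# hence an ultra-relativistic energy E ~ dp * c = hbar * c / l_P.
E = hbar * c / l_P

# Schwarzschild radius of that much energy: r_s = 2 * G * (E / c^2) / c^2.
r_s = 2 * G * E / c**4

print(f"l_P = {l_P:.3e} m, r_s = {r_s:.3e} m, ratio = {r_s / l_P:.1f}")
# The ratio is exactly 2 by construction, since l_P^2 = hbar * G / c^3:
# the two scales coincide up to a factor of order one.
```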

The right magnitude for things to get weird must be very small, but nobody can say whether that scale is a million times greater than the Planck length or a million times smaller than the Planck length.

Therefore using the Planck length for any purpose is meaningless.

For now, nobody can say anything about the value of a Schwarzschild radius in this range, because so far nobody has succeeded in creating a theory of gravity that is valid at these scales.

We are not even certain whether Einstein's theory of gravity is correct at galactic scales (given the discrepancies that remain unexplained without invoking "dark" things), much less whether it applies at elementary-particle scales.

The Heisenberg uncertainty relations must always be applied with extreme caution, because they are valid only in limited circumstances. As we do not know any physical system that could have dimensions comparable to the Planck length, we cannot say whether it might have any stationary states that could be characterized by the momentum-position Heisenberg uncertainty, or by any kind of momentum. (My personal opinion is that the so-called elementary particles, i.e. the leptons and the quarks, are not point-like, but have a spatial extension that explains their spin and the generations of particles with different masses, and their size is likely to be greater than the Planck length.)

So attempting to say anything about what happens at the Planck length, or at much greater or much smaller scales that are still far below what can be tested experimentally, is not productive, because it cannot reach any conclusion.

In any case, using "Planck length" is definitely wrong, because it gives the impression that something can be said about one specific length value, while everything that has ever been said about the Planck length could equally be said about any length smaller than experiments can reach.

By “things get weird” I meant “our current theories/models predict things to get weird”.

So, like, I’m saying that if Einstein’s model of gravity is applicable at very tiny scales, and if the [p,x] relation continues to hold at those scales, then stuff gets weird (either by “measurement of any position to within that amount of precision results in black-hole-ish stuff”, OR “the models we have don’t correctly predict what would happen”)

Now, it might be that our current models stop being approximately accurate at scales much larger than the Planck scale (so, much before reaching it), but either they stop being accurate at or before (perhaps much before) that scale, or things get weird at around that scale.

Edit: it doesn't make sense to attribute the spins of fermions to something with spatial extent spinning. The values of angular momentum that you get for an actual spinning thing, and what you get for the spin angular momentum of fermions, are offset by, like, hbar/2.

I get what you mean, but one thing about which we are certain is that you cannot apply Einstein's model of gravity at these scales, because his theory is only an approximation that determines the metric of space from an averaged density of the energy and momentum of matter, not from the energy-momentum 4-vectors of the individual particles that compose matter.

So Einstein's theory depends in an essential way on matter being continuous. This is fine at human and astronomical scales, but it is not applicable at molecular or elementary-particle scales, where the particles cannot be well approximated by an averaged density of their energy and momentum.

Any attempt to compute a gravitational escape velocity at scales many orders of magnitude smaller than the radius of a nucleus is, for now, invalid and purposeless.

The contradiction between the continuity of matter assumed by Einstein's gravity model and the discreteness of matter used in quantum physics is great enough that more than a century of attempts has failed to reconcile them in an acceptable way.

The offset of the spin is likely caused by the fact that for particles of non-null spin the motion is not a simple spinning but one affected by some kind of precession, and the "spin" is actually the ratio between the frequencies of the two rotational motions, which is why it is quantized.

The "action" is likely to be the phase of the intrinsic rotation that affects even the particles with null spin (and whose frequency is proportional with their energy), while those with non-null spin have also some kind of precession superposed on the other rotation.

> The offset of the spin is likely caused by the fact that for particles of non-null spin the motion is not a simple spinning but one affected by some kind of precession, and the "spin" is actually the ratio between the frequencies of the two rotational motions, which is why it is quantized.

I don't expect this to work. For one thing, we already know the conditions under which the spin precesses; that's how they measure g-2.

Also, orbital angular momentum is already quantized. So, I don’t know why you say that the “precession” is responsible for the quantized values for the spin.

The representations of SU(2) for composite particles combine in understood ways: for a combination of an even number of fermions, the possible total spin values match up with the possible values for orbital angular momentum (see the sketch below).
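For concreteness, here is a small sketch of that standard composition rule (the plain Clebsch-Gordan ladder for SU(2); nothing in it is specific to the precession idea under debate):

```python
from fractions import Fraction

def couple(j1, j2):
    """Allowed total j when coupling two SU(2) representations:
    |j1 - j2|, |j1 - j2| + 1, ..., j1 + j2."""
    j, out = abs(j1 - j2), []
    while j <= j1 + j2:
        out.append(j)
        j += 1
    return out

half = Fraction(1, 2)
# Two spin-1/2 fermions couple to integer totals 0 and 1, the same kind
# of values orbital angular momentum takes; any odd number of fermions
# leaves a half-integer total.
print([str(j) for j in couple(half, half)])          # ['0', '1']
print([str(j) for j in couple(Fraction(1), half)])   # ['1/2', '3/2']
```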

Could you give an explanation for how you think precession could cause this difference? Because without a mathematical explanation showing otherwise, or at least suggesting otherwise, my expectation is going to be that that doesn’t work.

The orbital angular momentum is quantized for the same reason as the spin: both are ratios between the phases of two separate rotational motions, namely the orbital rotation or the spin rotation on one hand, and on the other the intrinsic rotation corresponding to the de Broglie wave (whose phase is proportional to Hamilton's integral, i.e. the integral of the Lagrangian over time).

I used "precession" for lack of a better term to suggest its appearance, because while there is little doubt about the existence of two separate kinds of rotation in particles with non-null spin, there exists no complete model of how they are combined.

[deleted]