> Nature and the universe is all about continuous quantities
One could argue that nature always deals in discrete quantities and we have models that accurately predict these quantities. Then we use math that humans clearly created (limits) to produce similar models, except they imagine continuous inputs.
The quantity of matter and the quantity of electricity are discrete, but work, time and space are continuous, as are any quantities derived from them.
There have been attempts to create discrete models of time and space, but nothing useful has resulted from those attempts.
Most quantities encountered in nature include some dependency on work/energy, time or space, so nature deals mostly in continuous quantities. More precisely, the models that we can use to predict what happens in nature are still based mostly on continuous quantities, despite the fact that about a century and a half have passed since the discreteness of matter and electricity was confirmed.
> but work, time and space are continuous
I'm under the impression that all our theories of time and space (and thus work) break down at the scale of one Planck unit and smaller. Which isn't proof that they aren't continuous, but I don't see how you could assert that they are either.
Matter and energy are discrete. The continuity or discreteness of time and space are unknown. There are arguments for both cases, but nobody really knows for sure.
It’s fairly easy to go from integers to many subsets of the reals (rationals are straightforward, constructible numbers not too hard, algebraic numbers more of a challenge), but the idea that the reals are, well, real depends on a continuity of spacetime that we can’t prove exists.
Energy is continuous, not discrete.
Because energy is action per time, it inherits the continuity of time. Action is also continuous, though its nature is much less well understood. (Many people confuse action and angular momentum, speaking about a "quantum of action". There is no such thing as a quantum of action, because action is a quantity that increases monotonically in time for any physical system, so it cannot have constant values, much less quantized values. Angular momentum, which is the ratio of action to phase in a rotational motion, is frequently a constant quantity and a quantized quantity. In more than 99% of the cases when people write Planck's constant they mean an angular momentum, but there are also a few cases when they write Planck's constant meaning an action, typically in relation with some magnetic fluxes, e.g. in the formula of the magnetic flux quantum.)
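(For reference, the textbook Hamilton-Jacobi relations make the "action per time" and "action per phase" statements above precise; this is standard material, added only as a reminder:)

```latex
% Hamilton-Jacobi relations for the action S(q, t):
E   = -\frac{\partial S}{\partial t}        % energy   = action per time
p_i = \frac{\partial S}{\partial q_i}       % momentum = action per length
L_z = \frac{\partial S}{\partial \varphi}   % angular momentum = action per angle (phase)
```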
Perhaps when you said that energy is discrete you were thinking of light being discrete, but light is not energy. Energy is a property of light, as are momentum, frequency, wavenumber and others.
Moreover, the nature of the photon is still debated. Some people are not yet convinced that light travels in discrete packets, as opposed to the alternative in which only the exchange of energy and momentum between light and electrons or other leptons and quarks is quantized.
There are certain stationary systems, like isolated atoms or molecules, which may have a discrete set of states, where each state has a certain energy.
Unlike a truly discrete quantity like the electric charge, such a set of energy values can contain arbitrary values of energy, and between the sets of different systems there are no rational relationships between the energy values. Moreover, all such systems have not only discrete energy values but also continuous intervals of possible energies, usually towards higher energies, e.g. corresponding to high temperatures or to the ionization of atoms or molecules.
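(A standard example, hydrogen, used here only to illustrate the last point: the bound levels are discrete, but above the ionization threshold every positive energy is allowed:)

```latex
E_n = -\frac{13.6\ \text{eV}}{n^{2}}, \qquad n = 1, 2, 3, \ldots
% Above the ionization threshold (E > 0) the electron is free and the
% spectrum is continuous.
```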
The Planck units are bogus units that do not have any significance.
Perhaps our theories of time and space would break down at some extremely small scale, but for now there is no evidence of this and nobody has any idea what that scale may be.
In the 19th century, both George Johnstone Stoney and Max Planck made the same mistake. Each of them was the first to compute some universal constants: Stoney computed the elementary electric charge in 1874, and Planck computed the two constants that are now named "Boltzmann's constant" and "Planck's constant", in several variants, in 1899, 1900 and 1901. (Ludwig Boltzmann had predicted the existence of the constant that bears his name, but he never used it for anything and he did not compute its value.)
Both of them realized that the new universal constants allow the use of additional natural units in a system of fundamental units of measurement, and they attempted to exploit their findings for this purpose.
However, both bet on the wrong horse. Before them, James Clerk Maxwell had proposed two alternatives for choosing a good unit of mass. The first was to choose as the unit of mass the mass of some molecule. The second was to give an exact value to the Newtonian constant of gravitation. The first Maxwell proposal was good, and when analyzed during the SI revision of 2018 it was only very slightly worse than the final choice (which preferred to use two properties of the photon, instead of choosing an arbitrary molecule besides using one property of the photon).
The second Maxwell proposal was extremely bad, though to be fair it was difficult for Maxwell to predict that during the next century the precision of measuring many quantities would increase by many orders of magnitude, while the precision of measuring the Newtonian constant of gravitation would improve only barely in comparison with the others.
Both Stoney and Planck chose to base their proposals for systems of fundamental units on the second Maxwell variant, and this mistake made their systems completely impractical. The value of Newton's constant has a huge uncertainty in comparison with the other universal constants. Declaring its value exact does not make that uncertainty disappear; it merely moves the uncertainty into the values of almost all other physical quantities.
The consequence is that when using the systems of fundamental units of George Johnstone Stoney or of Max Planck, almost no absolute value of any quantity can be known accurately. Only the ratios between two quantities of the same kind, and velocities, can be known accurately.
Thus the Max Planck system of units is a historical curiosity that is irrelevant for practice. The right way to use Planck's constant in a system of units became possible only 60 years later, when the Josephson effect was predicted in 1962, and the SI was modified to use it only after another 60 years, in 2019.
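A rough numerical sketch of the practical problem (using the CODATA value G = 6.674 30(15) x 10^-11, with ħ and c treated as exact; the exact numbers are only illustrative, the point being that every Planck unit inherits roughly half of the relative uncertainty of G, about 10^-5, which is enormous by the standards of modern metrology):

```python
import math

# Constants that are exact in the 2019 SI
h_bar = 1.054571817e-34    # reduced Planck constant, J*s
c = 299792458.0            # speed of light in vacuum, m/s

# Newtonian constant of gravitation: measured value and standard uncertainty
G = 6.67430e-11            # m^3 kg^-1 s^-2
u_G = 0.00015e-11          # absolute standard uncertainty
rel_G = u_G / G            # relative uncertainty, ~2.2e-5

# Planck units, defined from G, h_bar and c
l_P = math.sqrt(h_bar * G / c**3)   # Planck length, ~1.6e-35 m
t_P = l_P / c                       # Planck time,   ~5.4e-44 s
m_P = math.sqrt(h_bar * c / G)      # Planck mass,   ~2.2e-8  kg

# Each Planck unit contains G to the power +/- 1/2, so its relative
# uncertainty is about half the relative uncertainty of G.
for name, value in [("Planck length [m]", l_P),
                    ("Planck time   [s]", t_P),
                    ("Planck mass  [kg]", m_P)]:
    print(f"{name}: {value:.4e}  (relative uncertainty ~{rel_G / 2:.1e})")
```

For comparison, cesium frequency standards realize the SI second with relative uncertainties around 10^-16, which is why a system of units tied to G cannot compete in practice.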
The units of measurement that are chosen to be fundamental have no bearing on the validity of physical laws at different scales. Even if the Planck units were practical, that would give no information about the structure of space and time. The definition of the Planck units is based on continuous models of time, space and forces.
Every now and then there are texts in the popular literature that mention the Planck units as if they had some special meaning. All such texts are based on hearsay, repeating claims from sources who have no idea how the Planck units were defined in 1899, how systems of fundamental units of measurement are defined, or what they mean. Apparently the only reason why the Planck units have been picked for this purpose is that in this system the unit of length happens to be much smaller than an atom or its nucleus, so people imagine that if the current model of space breaks down at some scale, that scale might be this small.
The Planck length is at least around the right order of magnitude for things to get weird. If you have the position uncertainty of something be less than ~ a Planck length, and its expected momentum equal to zero, then by the Heisenberg position-momentum uncertainty relation the expectation of the square of the momentum is big enough that the (relativistic) kinetic energy is big enough that the Schwarzschild radius is also around the Planck length, iirc?
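(For reference, the back-of-the-envelope estimate I mean, written out; standard heuristic, nothing rigorous:)

```latex
\Delta x \sim \ell_P
\;\Rightarrow\;
\Delta p \sim \frac{\hbar}{\ell_P}
\;\Rightarrow\;
E \sim \Delta p\, c \sim \frac{\hbar c}{\ell_P},
\qquad
r_s \sim \frac{2 G E}{c^{4}} \sim \frac{2 G \hbar}{\ell_P c^{3}}
      = \frac{2\,\ell_P^{2}}{\ell_P} = 2\,\ell_P,
\quad\text{with } \ell_P = \sqrt{\frac{\hbar G}{c^{3}}}.
```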
The right magnitude for things to get weird must be very small, but nobody can say whether that scale is a million times greater than the Planck length or a million times smaller than the Planck length.
Therefore using the Planck length for any purpose is meaningless.
For now, nobody can say anything about the value of a Schwarzschild radius in this range, because until now nobody has succeeded in creating a theory of gravity that is valid at these scales.
We are not even certain whether Einstein's theory of gravity is correct at galaxy scales (due to the discrepancies currently patched over with "dark" things), much less whether it applies at elementary particle scales.
The Heisenberg uncertainty relations must always be applied with extreme caution, because they are valid only in limited circumstances. As we do not know any physical system that could have dimensions comparable with the Planck length, we cannot say whether it might have any stationary states that could be characterized by the momentum-position Heisenberg uncertainty, or by any kind of momentum. (My personal opinion is that the so-called elementary particles, i.e. the leptons and the quarks, are not point-like, but have a spatial extension that explains their spin and the generations of particles with different masses, and their size is likely to be greater than the Planck length.)
So attempting to say anything about what happens at the Planck length, or at much greater or much smaller scales that are still far below what can be tested experimentally, is not productive, because it cannot reach any conclusion.
In any case, using "Planck length" is definitely wrong, because it gives the impression that there are things that can be said about a specific length value, while everything that has ever been said about the Planck length could be said about any length smaller than what we can reach in experiments.
By “things get weird” I meant “our current theories/models predict things to get weird”.
So, like, I’m saying that if Einstein’s model of gravity is applicable at very tiny scales, and if the [p,x] relation continues to hold at those scales, then stuff gets weird (either by “measurement of any position to within that amount of precision results in black-hole-ish stuff”, OR “the models we have don’t correctly predict what would happen”)
Now, it might be that our current models stop being approximately accurate at scales much larger than the Planck scale (so, much before reaching it), but either they stop being accurate at or before (perhaps much before) that scale, or things get weird at around that scale.
Edit: the spins of fermions don’t make sense to attribute to an extended object spinning. The values of angular momentum that you get for an actual spinning thing, and what you get for the spin angular momentum of fermions, are offset by, like, hbar/2.
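(Spelled out, the standard values I mean: orbital angular momentum of a single-valued wavefunction comes in integer multiples of ħ, while fermion spin projections are half-integer:)

```latex
L_z = m\,\hbar,\quad m \in \{-\ell,\ldots,\ell\},\quad \ell = 0, 1, 2, \ldots
\qquad\text{vs.}\qquad
S_z = \pm\frac{\hbar}{2}\ \ (s = \tfrac{1}{2}),\quad
|\mathbf{S}| = \hbar\sqrt{s(s+1)} = \frac{\sqrt{3}}{2}\,\hbar
```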
I get what you mean, but one thing about which we are certain is that you cannot apply Einstein's model of gravity at these scales, because his theory is only an approximation that determines the metric of space from an averaged density of the energy and momentum of matter, not from the energy-momentum 4-vectors of the particles that compose matter.
So Einstein's theory depends in an essential way on matter being continuous. This is fine at human and astronomic scales, but it is not applicable at molecular or elementary particle scales, where you cannot approximate well the particles by an averaged density of their energy and momentum.
Any attempt to compute a gravitational escape velocity at scales many orders of magnitude smaller than the radius of a nucleus is for now invalid and purposeless.
The contradiction between the continuity of matter supposed by Einstein's gravity model and the discreteness of matter used in quantum physics is great enough that during more than a century of attempts they have not been reconciled in an acceptable way.
The offset of the spin is likely to be caused by the fact that for particles of non-null spin their movement is not a simple spinning, but one affected by some kind of precession, and the "spin" is actually the ratio between the frequencies of the 2 rotation movements, which is why it is quantized.
The "action" is likely to be the phase of the intrinsic rotation that affects even the particles with null spin (and whose frequency is proportional with their energy), while those with non-null spin have also some kind of precession superposed on the other rotation.
> The offset of the spin is likely to be caused by the fact that for particles of non-null spin their movement is not a simple spinning, but one affected by some kind of precession, and the "spin" is actually the ratio between the frequencies of the 2 rotation movements, which is why it is quantized.
I don’t expect this to work. For one thing, we already know the conditions under which the spin precesses. That’s how they measure g-2.
Also, orbital angular momentum is already quantized. So, I don’t know why you say that the “precession” is responsible for the quantized values for the spin.
The representations of SU(2) for composite particles combine in well-understood ways, where for a combination of an even number of fermions the possible total spin values match up with the possible values of orbital angular momentum.
Could you give an explanation for how you think precession could cause this difference? Because without a mathematical explanation showing otherwise, or at least suggesting otherwise, my expectation is going to be that that doesn’t work.
The orbital angular momentum is quantized for the same reason as the spin: both are ratios between the phases of two separate rotational motions, the orbital rotation or the spin rotation on one hand, and the intrinsic rotation corresponding to the de Broglie wave on the other (whose phase is proportional to Hamilton's integral, i.e. the integral of the Lagrangian over time).
I have used "precession" for lack of a better term to suggest its appearance, because while there is little doubt about the existence of two separate kinds of rotation in the particles with non-null spin, there exists no complete model of how they are combined.
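(The only well-established relations behind the phase language I am using are these: the phase of the de Broglie / path amplitude is the action divided by ħ, and the old quantization condition counts whole numbers of phase turns around an orbit; how the two rotations combine beyond that is, as said, not modeled:)

```latex
\varphi = \frac{S}{\hbar} = \frac{1}{\hbar}\int L\,dt,
\qquad
\oint p_\theta\, d\theta = n\,h \;\Rightarrow\; L_z = n\,\hbar
```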
We do not know whether work, time and space are continuous
What we know is that we use mathematical models based on the continuity of work, time and space (and on the discreteness of matter and electricity), and until now we have not seen any experiment where a discrepancy between predicted and measured values could be attributed to the falseness of the supposition that work, time and space are continuous.
Obviously this does not exclude the possibility that in the future some experiments where much higher energies per particle are used, allowing the testing of what happens at much smaller distances, might show evidence that there exists a discrete structure of time and space, like we know for matter.
However, that has not happened yet and there are no reasons to believe that it will happen soon. The theory of the existence of atoms is more than two millennia old; it was then abandoned for lack of evidence, then revived at the beginning of the 19th century due to accumulated evidence from chemistry, and it was eventually confirmed beyond doubt in 1865, when Johann Josef Loschmidt became the first who could count atoms and molecules, after determining their masses.
So the discreteness of matter had a very long history of accumulating evidence in favor of it.
Nothing similar applies to the discreteness of time and space, for which there has never been any kind of evidence. The only reason for the speculations about this is the analogy with the fact that matter and electricity had been believed to be continuous, until it was eventually discovered that they are discrete.
Such an analogy must make us keep an open mind about the possibility of work, time and space being discrete, but we should not waste time speculating about this when there are huge problems in physics that do not have a solution yet. In modern physics there is a huge number of quantities that should be computable from theory, but in fact they cannot be computed and must be measured experimentally. Therefore the existing theories are clearly not good enough.
Umm SpaceTime is likely NOT to be fundamental or continuous
https://youtu.be/GL77oOnrPzY?si=nllkY_E8WotARwUM
Also, Bell's theorem implies non-locality or non-realism, which to me is another nail in the coffin of spacetime
That presentation is like all the research that has been published in this domain, i.e. it presents some ideas that might be used to build an alternative theory of space-time, but no such actual theory.
There are already several decades of such discussions, but no usable results.
Time and space are primitive quantities in any current theory of physics, i.e. quantities that are assumed to exist and have certain properties, and which are used to define derived quantities.
Any alternative theory must start by enumerating exactly what its primitive quantities are and what their properties are. Anything else is just gibberish, no better than Star Trek talk.
However, the units of measurement for time and length are not fundamental units a.k.a. base units, because it is impossible to make a physical system characterized by values of time or length that are stable enough and reproducible enough.
Because of that, the units of time and length are derived from fundamental units that are units of some derived quantities, currently the units of work and velocity (the unit of work is the work required to transition a certain atom, currently cesium-133, from one particular state to another, i.e. it is equal to the difference between the energies of the two states, while the unit of velocity is the velocity of light in vacuum).
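(For reference, the exact defining constants behind that description in the revised SI; the second and the meter follow from the first two, and fixing h is what makes the "unit of work" phrasing above possible:)

```latex
\Delta\nu_{\mathrm{Cs}} = 9\,192\,631\,770\ \mathrm{Hz},\qquad
c = 299\,792\,458\ \mathrm{m\,s^{-1}},\qquad
h = 6.626\,070\,15\times 10^{-34}\ \mathrm{J\,s}
% 1 s = 9 192 631 770 periods of the Cs-133 hyperfine transition,
% 1 m = the distance travelled by light in vacuum in 1/299 792 458 s,
% the kilogram follows from fixing h (via E = h\nu).
```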