Your perspective is incorrect.

Physical entropy governs real physical processes. Simple example: why ice melts in a warm room. More subtle example: why cords get tangled up over time.

Our measures of entropy can be seen as a way of summarizing, at a macro level, the state of a system such as that warm room containing ice, or a tangle of cables, but the measure is not the same thing as the phenomenon it describes.

Boltzmann's approach to entropy makes the second law pretty intuitive: there are far more ways for a system to be disordered than ordered, so over time it tends towards higher entropy. That’s why ice melts in a warm room.

My take, for what it's worth:

Entropy isn’t always the driver of physical change; sometimes it’s just a map.

Sometimes that map is highly isomorphic to the physical process, like in gas diffusion or smoke dispersion. In those cases, entropy doesn't just describe what happened, it predicts it. The microstates and the probabilities align tightly with what’s physically unfolding. Entropy is the engine.

But other times, like when ice melts, entropy is a summary, not a cause. The real drivers are bond energies and phase thresholds. Entropy increases, yes, but only because the system overcame physical constraints that entropy alone can’t explain. In this case, entropy is the receipt, not the mechanism.

So the key idea is this: entropy’s usefulness depends on how well it “sees” the real degrees of freedom that matter. When it aligns closely with the substrate, it feels like a law. When it doesn't, it’s more like coarse bookkeeping after the fact.

The second law of thermodynamics is most “real” when entropy is the process. Otherwise, it’s a statistical summary of deeper physical causes.

What makes entropy interesting is that you can describe many physical processes through analysis of the system's degrees of freedom. This pattern repeats regularly despite the systems being radically different.

So you can interpret entropy as being about as real as potential energy or Newton's laws: very useful for calculation, subject to evolution laws which are common across all systems - but potentially giving way as an approximation under a finer-grained view (although the finer-grained view is also subject to the same rules).

Just swap out "my take, for what it's worth" For "according to ChatGPT"

I don't like your comment, it's pretty low effort, and you're tearing down a lot of effort that I have put into learning about physics and entropy over the years.

I came up with the pseudonym Aeonik Chaos circa 2010 because my studies of Chaos Theory and Entropy left a very deep impact on me.

I've been thinking about entropy ever since my dad, a biochemist, embryologist, and doctor, told me he thought that entropy was behind the secret of life. That was over 30 years ago.

I've been thinking about this stuff very deeply for a very long time.

So no, it's not according to ChatGPT, it's just my current take, but I don't want to come out and say "this is how it is", because entropy has a way of being sneaky; I've been wrong about theories of entropy way too many times to pretend to be some authority on the matter.

Hence, this is my take, take it or leave it.

By the way, I love using ChatGPT to learn more about science and math, and have it act as a sounding board for ideas. It's great for learning and diving deeper into this stuff.

I know it seems low effort but I spent a good ten minutes reading through your other comments looking for an explanation for your post. It reads as so obviously written by an LLM.

I have no doubt you're a real person, and that all those things you claim are true. Almost all of your posts are clearly written by a thoughtful human.

But then I see that occasionally, you post something that looks nothing like your other posts, and exactly like it was written by an LLM. I love learning from ChatGPT too, but I'm curious about your mindset if and when you have it speak for you here. Is the goal to use ChatGPT to educate us? If so, why not give it credit?

Everyone here knows how to use google and ChatGPT (and a majority of us could whip up a script to auto-post LLM responses). It seems like most of the internet is already "dead"... don't give it HN too, please, stick to posting thoughtful human stuff!

I use it to edit some of my posts, and clean up grammar, or to help articulate thoughts that I don't have time to give a proper third or fourth pass to.

I'm very busy these days, and will often use ChatGPT as an editor.

I'll often drop a rough draft into ChatGPT and have it clean it up, or ask it for other perspectives I might be missing.

For the comment in question, I had a spidey sense activated from the conversation. And ChatGPT helped me narrow my focus down to the different levels that entropy applies to, though I had to push back on it a bit before it got to the meat of my thoughts.

I think of it as a sort of translator: it helps communicate vibes that the striatum part of my brain is feeling/sensing, but that the verbal or symbolic part of my brain hasn't figured out how to say or express yet.

Normally this process can take a while, and ChatGPT remarkably speeds this cycle up, at least for me.

> there are far more ways for a system to be disordered than ordered

I'm a complete layman when it comes to physics, so forgive me if this is naive — but aren't "ordered" and "disordered" concepts tied to human perception or cognition? It always seemed to me that we call something "ordered" when we can find a pattern in it, and "disordered" when we can't. Different people or cultures might be able to recognize patterns in different states. So while I agree that "there are more ways for a system to be disordered than ordered," I would have thought that's a property of how humans perceive the world, not necessarily a fundamental truth about the universe.

You only hear these terms in layman explanations. Physics has precise definitions for these things. When we say "ordered", we mean that a particular macrostate has only a few possible microstates.

Check this Wikipedia article for a quick overview: https://en.wikipedia.org/wiki/Microstate_(statistical_mechan...

Details can be found in any textbook on statistical mechanics.

Exactly. The coin flipping example is a very nice way to put it. It works because the coins are interchangeable: you just count the number of heads or tails.

If the coins were of different color and you took that into account, then it wouldn't work.

It's not intuitive to me what gravity has to do with entropy though, as it's classically just a force and completely reversible (unlike entropy)? Ie if you saw a video of undisturbed objects only affected by gravity, you couldn't tell if the video was reversed.

> Ie if you saw a video of undisturbed objects only affected by gravity, you couldn't tell if the video was reversed.

How does that work with things like black holes? If you saw an apple spiral out of a black hole, wouldn't you suspect that you were watching a reversed video? Even if you take account the gravitational waves?

That's the question of why time only goes forwards. It seems to be that the universe started in an extremely low-entropy state. It will go towards high entropy. In a high entropy state (e.g. heat death, or a static black hole), there's no meaningful difference between going forwards or backwards in time - if you reverse all the velocities of the particles, they still just whizz around randomly (in the heat death case) or the black hole stays a black hole.

Classical gravity doesn't work like that. There, an apple in space does not spiral into a black hole; it's in an elliptical orbit. (A circle is a special ellipse.)

If you saw a comet coming from the sun, or a meteorite coming from the moon, etc. you would also find that suspicious.

Comets are in elliptical orbits around the sun so literally half the time they're traveling away from the sun.

Sometimes they don’t make it:

https://en.wikipedia.org/wiki/Sungrazing_comet

https://www.space.com/8478-comet-collision-sun-captured-3.ht...

Yeah, and those happen when forces other than gravity come into play: when matter starts colliding.

Every object has been subject to forces other than gravity at some point.

And the point is that sometimes comets do indeed fall into the Sun. If you object to people calling that a comet that’s fine - we can use whatever name you want.

The point is that the gravity interactions are time reversible. Not so with friction etc.

Sure, nothing in the laws of physics prevents a celestial body from distancing itself from the Sun or from the Moon. But it would look suspicious! Wouldn't you suspect that you were watching a reversed video?

You don't understand my point. If we watch purely gravity interactions, the video can be reversed and you wouldn't be able to detect it.

If a meteor crashes into the moon, there are effects other than gravity that make the video not reversible. Ie it's not only gravity.

That's the point.

In other words, if a comet approached the moon at high speed, missed, and slingshotted in another direction, it would be traveling away from the moon, but the video would be time reversible and you wouldn't be able to detect it. Gravity-only interaction.

I think you don’t understand my point.

Someone wrote “If you saw an apple spiral out of a black hole, wouldn't you suspect that you were watching a reversed video?”

I replied “If you saw a comet coming from the sun, or a meteorite coming from the moon, etc. you would also find that suspicious.”

I don’t know what part you object to (if any).

> If a meteor crashes into the moon, there's other effects than gravity that makes the video not reversible.

If a meteor (or an apple) is still in a crashing trajectory when you stop recording there are no effects other than gravity. The video is reversible - it just looks weird when you play it in reverse (because the meteor seems to be coming from the Moon and the apple seems to be coming from the black hole if you try to imagine where they were before).

Well yes, for example, if there was a cannon firing an apple from the moon, then it would travel the same trajectory, just in reverse.

But we know there are no apple firing cannons on the moon.

Ie if an object was coming from the moon and its past trajectory intersected with the moon's surface, you could say "this is reverse video".

That was exactly my point, that an apple coming from a black hole may seem problematic but so does an apple coming from the Sun. I don't see an essential difference regarding the "reversibility" of the physical process (but I could be wrong).

Maybe this thought experiment makes it clear. There's a cannon firing straight up from the lunar pole. One cannot observe the explosion charge. The ball goes straight up, then falls down and goes back into the barrel. The cannon is filmed from afar, and you can see the ball's whole travel.

Then you are shown two films, the normal and a reversed one. Would you be able to tell which one is which, and if so, how?

> Maybe this thought experiment makes it clear.

Everything was already clear, I think.

Think minimum description length. Low entropy states require fewer terms to fully describe than high entropy states. This is an objective property of the system.
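A rough sketch of what I mean in Python, using zlib compression as a crude stand-in for description length (the exact byte counts are incidental, only the ordering matters):

    import zlib
    import random

    random.seed(0)

    # "Low entropy" state: one repeated symbol, like a crystal lattice.
    ordered = b"A" * 10_000

    # "High entropy" state: independent random bytes, like a gas.
    disordered = bytes(random.randrange(256) for _ in range(10_000))

    # Compressed length as a rough proxy for minimum description length.
    print(len(zlib.compress(ordered)))      # tens of bytes: cheap to describe
    print(len(zlib.compress(disordered)))   # close to 10_000: essentially incompressible

The compressor is just a stand-in here; any reasonable fixed description scheme should rank the two states the same way.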

“Number of terms” is a human language construct.

No, it's a representation construct, i.e. how to describe some system in a given basis. The basis can be mathematical. Fourier coefficients for example.

Mathematics is a human language. It being a formal language doesn’t change that.

Further, it’s not objective: you’re choosing the basis which causes the complexity, but any particular structure can be made simple in some basis.

Mathematical notation is a human invention, but the structure that mathematics describes is objective. The choice of basis changes the absolute number of terms, but the relative magnitude of terms for a more or less disordered state is generally fixed outside of degenerate cases.

The structure that most words describe is objective, so you haven’t distinguished math as a language. (Nor is mathematics entirely “objective”, eg, axiom of choice.) And the number of terms in your chosen language with your chosen basis isn’t objective: that’s an intrinsic fact to your frame.

The complexity of terms is not fixed — that’s simply wrong mathematically. They’re dependent on our chosen basis. Your definition is circular, in that you’re implicitly defining “non-degenerate” as those which make your claim true.

You can’t make the whole class simplified at once, but for any state, there exists a basis in which it is simple.

This is getting tedious. The point about mathematics was simply that it carries an objectivity that natural language does not carry. But the point about natural language was always a red herring; not sure why you introduced it.

>You can’t make the whole class simplified at once

Yes, this is literally my point. The further point is that the relative complexities of two systems will not switch orders regardless of basis, except perhaps in degenerate cases. There is no "absolute" complexity, so your other points aren't relevant.

I didn’t introduce it, you did — by positing that formal language is more objective, as you’ve again done here. My original point was that mathematics is human language.

> The further point is that the relative complexities of two systems will not switch orders regardless of basis, except perhaps in degenerate cases.

Two normal bases: Fourier and wavelet; two normal signals: a wave and an impulse.

They’ll change complexity between the two bases despite everything being normal — the wave simple and impulse complex in Fourier terms; the wave complex and impulse simple in wavelet terms.

That our choice of basis makes a difference is literally why we invented wavelets.
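A minimal numpy sketch of that flip, substituting the plain sample basis for wavelets since it shows the same effect:

    import numpy as np

    n = 1024
    t = np.arange(n)

    wave = np.sin(2 * np.pi * 8 * t / n)   # smooth periodic signal: 8 full cycles
    impulse = np.zeros(n)
    impulse[n // 2] = 1.0                   # single spike

    def significant_terms(signal, basis):
        # Count coefficients above a tiny threshold in the given basis.
        coeffs = np.fft.fft(signal) if basis == "fourier" else signal
        return int(np.sum(np.abs(coeffs) > 1e-6))

    for name, sig in [("wave", wave), ("impulse", impulse)]:
        for basis in ("fourier", "sample"):
            print(name, basis, significant_terms(sig, basis))

    # The wave takes 2 significant Fourier terms but ~1000 sample terms;
    # the impulse takes 1 sample term but all 1024 Fourier terms.
    # "Number of terms" flips with the choice of basis.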

Yes, that is a degenerate case. We can always encode an arbitrary amount of data into the basis functions to get a maximally simple representation for some target signal. If the signal is simple (carries little information) or the basis functions are constructed from the target signal, you can get this kind of degeneracy. But degenerate cases do not invalidate the rule for the general case.

In a deterministic system you can just use the time as a way to describe a state, if you started from a known state.

You're thinking of information entropy, which is not the same concept as entropy in physics. An ice cube in a warm room can be described using a minimum description length as "ice cube in a warm room" (or a crystal structure inside a fluid space), but if you wait until the heat death of the universe, you just have "a warm room" (a smooth fluid space), which will have an even shorter mdl. Von Neumann should never have repurposed the term entropy from physics. Entropy confuses a lot of people, including me.

Maxwell's demon thought experiment implies they are the same concept. Given a complete knowledge of every particle of gas you can in principle create unphysical low entropy distributions of the particles. This[1] goes into more detail.

[1] https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_...

And somewhat surprisingly the heat death of the universe is the maximal entropy state.

Because there are an infinite number of microstates (all the particles are interchangeable) that lead to the same macrostate: nothing happening for ever!

You can safely replace the terms "order" and "disorder" with "unlikely" and "likely". Simply put, entropy is a measure of how closely a system resembles its "most likely configuration". Consider the discrete entropy of a series of coin flips. Three tosses could result in the following 8 states: HHH, HHT, HTH, HTT, THH, THT, TTH, and TTT. From that we can gather that there is a 1/8 chance of getting either zero or three heads and a 3/8 chance of getting one or two heads. The latter two cases are clearly more likely (and hence associated with a higher entropy). In physics, of course, entropy is generally the continuous kind, not a simple set of binary microstates. But the principle is essentially the same.
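A quick Python sketch of that counting, just enumerating the eight microstates:

    from itertools import product
    from collections import Counter

    # All 2**3 = 8 microstates of three coin flips.
    microstates = list(product("HT", repeat=3))

    # The macrostate is just the number of heads; count microstates per macrostate.
    macrostates = Counter(state.count("H") for state in microstates)

    for heads, count in sorted(macrostates.items()):
        p = count / len(microstates)
        print(f"{heads} heads: {count} microstates, probability {p:.3f}")

    # 0 and 3 heads each have 1 microstate (p = 0.125); 1 and 2 heads each
    # have 3 microstates (p = 0.375). The mixed macrostates are the likelier,
    # higher-entropy ones.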

I think the original post is confused exactly because of the “tangled cords” analogies. Something being “messy” in our daily lives can be a bit subjective, so using the same analogies for natural forces may actually seem a tad counterintuitive.

Maybe it would be more fitting to say that it just so happens that our human definition of “messy” aligns with entropy, and not that someone decided what messy atoms look like.

I’d say a bucket of water is more neat than a bucket of ice, macroscopically.

>Simple example: why ice melts in a warm room.

Ice melting is simply the water molecules gaining enough kinetic energy (from collisions with the surrounding air molecules) that they break the bonds that held them in the ice crystal lattice. But at the microscopic level it's still just water molecules acting according to Newton's laws of motion (forgetting about quantum effects of course).

Now, back on the topic of the article: consider a system of 2 particles separated by some distance. Do they experience gravity? Of course they do. They start falling towards the midpoint between them. But where is entropy in this picture? How do you even define entropy for a system of 2 particles?

> But where is entropy in this picture? How do you even define entropy for a system of 2 particles?

The answer is that this doesn't happen in a system with only 2 particles. The idea of gravity as an entropic phenomenon is that you introduce some other kind of particle that permeates spacetime, so there is no system that only contains 2 particles. You may use some idea like virtual particles from quantum field theory, or you may define "quanta of space time" as something that is not technically a particle but basically works like one in a handwavy sense.

But the basic point of these entropy-based theories is to explain gravity, and typically spacetime itself, as an emergent result of a collection of numerous objects of some kind. This necessarily means that they don't make sense if applied to idealized systems with very few objects - which is why they typically posit such isolated systems simply can't actually exist in reality.

Let me try to answer. Let's say the particles are experiencing gravity as a natural entropy phenomenon. They will attract until they become so close that they are now seen as a single particle. The new system has a lower entropy and a higher gravity than before.

Explanation seems very rudimentary but that is the gist of the theory.

From my point of view, I might add the layer of information density. Every quantum fluctuation is an event, and the more particles, the more information is produced in a defined volume of space. But there is no theory of information that is linked to the physics, so... let me leave it at that :).

Can you define quantum fluctuation?

It has been suggested that time too is derived from entropy. At least the single-directionality of it. That’d make entropy one of the most real phenomena in physics.

But "disordered" and "ordered" states are just what we define them to be: for example, cords are "tangled" only because we would prefer arrangements of cords with less knots, and knots form because someone didn't handle the cords carefully.

Physical processes are "real", but entropy is a figment.

I believe you are correct.

Entropy is not a physical quantity, it is a measure of how far a system is from equilibrium.

Lots of people talk about order/disorder or macro and micro states, not realizing these are things we've invented and aren't physical in nature.

> Entropy is not a physical quantity, it is a measure of how far a system is from equilibrium.

That’s funny because the original thermodynamic entropy is defined only for systems in equilibrium.

from who? Clausius?

It doesn't make a lot of sense to me, because a system at equilibrium cannot undergo any further diffusion, so there's no potential "entropy increase".

Maybe the issue is that, like an ideal gas, a perfect equilibrium just doesn't occur.

[deleted]

You need some additional assumptions. Only near equilibrium / in the thermodynamic limit is the system linear in entropy. What governs physical processes such as the ones you mention is conservation and dynamics pushing toward equipartition of energy - but outside that regime these are no longer "theorems".

[deleted]

> Physical entropy governs real physical processes

> the measure is not the same thing as the phenomenon it describes.

There is some tension between those claims.

The latter seems to support the parent comment’s remark questioning whether a “fundamental physical interaction could follow from entropy”.

It seems more appropriate to say that entropy follows from the physical interaction - not to be confused with the measure used to describe it.

One may say that pressure is an entropic force and physical entropy governs the real physical process of gas expanding within a piston.

However, one may also say that it’s the kinetic energy of the gas molecules that governs the physical process - which arguably is a more fundamental and satisfactory explanation.

Bekenstein-Hawking entropy goes up when an Event Horizon increases in radius. That means some mass "falling onto" an EH. So this implies that, if our universe is actually a 3D EH, both time and increasing entropy can be explained by one thing: the increasing size of our EH. That is, mass falling onto our EH from outside our universe. It also happens to elegantly replace the Big Bang nonsense theory with something that makes sense. It explains the universe's expansion as well. Assuming our universe is a 3D EH makes lots of things make sense that don't otherwise make sense.