I don't get it.

To me, entropy is not a physical thing, but a measure of our imperfect knowledge about a system. We can only measure the bulk properties of matter, so we've made up a number to quantify how imperfectly the bulk properties describe the true microscopic state of the system. But if we had the ability to zoom into the microscopic level, entropy would make no sense.

So I don't see how gravity or any other fundamental physical interaction could follow from entropy. It's a made-up thing by humans.

Your perspective is incorrect.

Physical entropy governs real physical processes. Simple example: why ice melts in a warm room. More subtle example: why cords get tangled up over time.

Our measures of entropy can be seen as a way of summarizing, at a macro level, the state of a system such as that warm room containing ice, or a tangle of cables, but the measure is not the same thing as the phenomenon it describes.

Boltzmann's approach to entropy makes the second law pretty intuitive: there are far more ways for a system to be disordered than ordered, so over time it tends towards higher entropy. That’s why ice melts in a warm room.

My take, for what it's worth:

Entropy isn’t always the driver of physical change, sometimes it’s just a map.

Sometimes that map is highly isomorphic to the physical process, like in gas diffusion or smoke dispersion. In those cases, entropy doesn't just describe what happened, it predicts it. The microstates and the probabilities align tightly with what’s physically unfolding. Entropy is the engine.

But other times, like when ice melts, entropy is a summary, not a cause. The real drivers are bond energies and phase thresholds. Entropy increases, yes, but only because the system overcame physical constraints that entropy alone can’t explain. In this case, entropy is the receipt, not the mechanism.

So the key idea is this: entropy’s usefulness depends on how well it “sees” the real degrees of freedom that matter. When it aligns closely with the substrate, it feels like a law. When it doesn't, it’s more like coarse bookkeeping after the fact.

The second law of thermodynamics is most “real” when entropy is the process. Otherwise, it’s a statistical summary of deeper physical causes.

What makes entropy interesting is that you can describe many physical processes through analysis of the system's degrees of freedom. This pattern repeats regularly despite the systems being radically different.

So you can interpret entropy as being about as real as potential energy or Newton's laws. Very useful for calculation, subject to evolution laws which are common across all systems - but potentially giving way as an approximation under a finer-grained view (although the finer-grained view is also subject to the same rules).

Just swap out "my take, for what it's worth" for "according to ChatGPT".

I don't like your comment; it's pretty low effort, and you're tearing down a lot of effort that I have put into learning about physics and entropy over the years.

I came up with the pseudonym Aeonik Chaos circa 2010 because my studies of Chaos Theory and Entropy left a very deep impact on me.

I've been thinking about entropy ever since my dad, a biochemist, embryologist, and doctor, told me he thought that entropy was behind the secret of life. That was over 30 years ago.

I've been thinking about this stuff very deeply for a very long time.

So no, it's not according to ChatGPT; it's just my current take. But I don't want to come out and say "this is how it is", because entropy has a way of being sneaky. I've been wrong about theories of entropy way too many times to pretend to be some authority on the matter.

Hence, this is my take, take it or leave it.

By the way, I love using ChatGPT to learn more about science and math, and have it act as a sounding board for ideas. It's great for learning and diving deeper into this stuff.

I know it seems low effort but I spent a good ten minutes reading through your other comments looking for an explanation for your post. It reads as so obviously written by an LLM.

I have no doubt you're a real person, and that all those things you claim are true. Almost all of your posts are clearly written by a thoughtful human.

But then I see that occasionally, you post something that looks nothing like your other posts, and exactly like it was written by an LLM. I love learning from ChatGPT too, but I'm curious about your mindset if and when you have it speak for you here. Is the goal to use ChatGPT to educate us? If so, why not give it credit?

Everyone here knows how to use Google and ChatGPT (and a majority of us could whip up a script to auto-post LLM responses). It seems like most of the internet is already "dead"... don't give it HN too, please; stick to posting thoughtful human stuff!

I use it to edit some of my posts, and clean up grammar, or to help articulate thoughts that I don't have time to give a proper third or fourth pass to.

I'm very busy these days, and will often use ChatGPT as an editor.

I'll often drop a rough draft into ChatGPT and have it clean it up, or ask it for other perspectives I might be missing.

For the comment in question, I had a spidey sense activated from the conversation. And ChatGPT helped me narrow my focus down to the different levels that entropy applies to, though I had to push back on it a bit before it got to the meat of my thoughts.

I think of it as a sort of translator: it helps communicate vibes that the striatum part of my brain is feeling/sensing, but that the verbal or symbolic part of my brain hasn't figured out how to say or express yet.

Normally this process can take a while, and ChatGPT remarkably speeds this cycle up, at least for me.

> there are far more ways for a system to be disordered than ordered

I'm a complete layman when it comes to physics, so forgive me if this is naive — but aren't "ordered" and "disordered" concepts tied to human perception or cognition? It always seemed to me that we call something "ordered" when we can find a pattern in it, and "disordered" when we can't. Different people or cultures might be able to recognize patterns in different states. So while I agree that "there are more ways for a system to be disordered than ordered," I would have thought that's a property of how humans perceive the world, not necessarily a fundamental truth about the universe.

You only hear these terms in layman explanations. Physics has precise definitions for these things. When we say "ordered", we mean that a particular macrostate has only a few possible microstates.

Check this Wikipedia article for a quick overview: https://en.wikipedia.org/wiki/Microstate_(statistical_mechan...

Details can be found in any textbook on statistical mechanics.

Exactly. The coin flipping example is a very nice way to put it. It works because the coins are interchangeable: you just count the number of heads or tails.

If the coins were of different color and you took that into account, then it wouldn't work.

It's not intuitive to me what gravity has to do with entropy though, as it's classically just a force and completely reversible (unlike entropy)? Ie if you saw a video of undisturbed objects only affected by gravity, you couldn't tell if the video was reversed.

> Ie if you saw a video of undisturbed objects only affected by gravity, you couldn't tell if the video was reversed.

How does that work with things like black holes? If you saw an apple spiral out of a black hole, wouldn't you suspect that you were watching a reversed video? Even if you take account the gravitational waves?

That's the question of why time only goes forwards. It seems to be that the universe started in an extremely low-entropy state. It will go towards high entropy. In a high entropy state (e.g. heat death, or a static black hole), there's no meaningful difference between going forwards or backwards in time - if you reverse all the velocities of the particles, they still just whizz around randomly (in the heat death case) or the black hole stays a black hole.

Classical gravity doesn't work like that. An apple in space does not spiral into a black hole there; it's in an elliptical orbit. (A circle is a special ellipse.)

If you saw a comet coming from the sun, or a meteorite coming from the moon, etc. you would also find that suspicious.

Comets are in elliptical orbits around the sun so literally half the time they're traveling away from the sun.

Sometimes they don’t make it:

https://en.wikipedia.org/wiki/Sungrazing_comet

https://www.space.com/8478-comet-collision-sun-captured-3.ht...

Yea and those happen when other forces than gravity come into play. When matter starts colliding.

Every object has been subject to forces other than gravity at some point.

And the point is that sometimes comets do indeed fall into the Sun. If you object to people calling that a comet that’s fine - we can use whatever name you want.

The point is that the gravity interactions are time reversible. Not so with friction etc.

Sure, nothing in the laws of physics prevents a celestial body from distancing itself from the Sun or from the Moon. But it would look suspicious! Wouldn't you suspect that you were watching a reversed video?

You don't understand my point. If we watch purely gravity interactions, the video can be reversed and you wouldn't be able to detect it.

If a meteor crashes into the moon, there are other effects than gravity that make the video not reversible. Ie it's not only gravity.

That's the point.

In other words, if a comet approached the moon at high speed, missed and slingshotted in another direction, it would be traveling away from the moon, but the video would be time reversible and you wouldn't be able to detect it. Gravity-only interaction.

I think you don’t understand my point.

Someone wrote “If you saw an apple spiral out of a black hole, wouldn't you suspect that you were watching a reversed video?”

I replied “If you saw a comet coming from the sun, or a meteorite coming from the moon, etc. you would also find that suspicious.”

I don't know what part you object to (if any).

> If a meteor crashes into the moon, there's other effects than gravity that makes the video not reversible.

If a meteor (or an apple) is still in a crashing trajectory when you stop recording there are no effects other than gravity. The video is reversible - it just looks weird when you play it in reverse (because the meteor seems to be coming from the Moon and the apple seems to be coming from the black hole if you try to imagine where they were before).

Well yes for example if there was a cannon firing an apple from the moon, then it would travel the same trajectory, just in reverse.

But we know there are no apple firing cannons on the moon.

Ie if an object was coming from the moon and its past trajectory intersected with the moon's surface, you could say "this is reverse video".

That was exactly my point, that an apple coming from a black hole may seem problematic but so does an apple coming from the Sun. I don't see an essential difference regarding the "reversibility" of the physical process (but I could be wrong).

Maybe this thought experiment makes it clear. There's a cannon firing straight up from the lunar pole. One cannot observe the explosion charge. The ball goes straight up, and then falls down and goes back to the barrel. The cannon is filmed from afar firing and you can see the whole ball travel.

Then you are shown two films, the normal and a reversed one. Would you be able to tell which one is which, and if so, how?

> Maybe this thought experiment makes it clear.

Everything was already clear, I think.

Think minimum description length. Low entropy states require fewer terms to fully describe than high entropy states. This is an objective property of the system.
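To make the description-length idea concrete, here's a rough Python sketch (my own toy illustration, using a general-purpose compressor as a crude stand-in for the minimal description; exact byte counts will vary):

    import os
    import zlib

    ordered = b"AB" * 50_000       # 100 kB of a highly patterned, "low entropy" string
    noise = os.urandom(100_000)    # 100 kB of random bytes, "high entropy"

    # The compressed length is a crude upper bound on the description length.
    print(len(zlib.compress(ordered)))  # a few hundred bytes
    print(len(zlib.compress(noise)))    # roughly 100 kB - the noise barely compresses at all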

“Number of terms” is a human language construct.

No, it's a representation construct, i.e. how to describe some system in a given basis. The basis can be mathematical. Fourier coefficients for example.

Mathematics is a human language. It being a formal language doesn’t change that.

Further, it’s not objective: you’re choosing the basis which causes the complexity, but any particular structure can be made simple in some basis.

Mathematical notation is a human invention, but the structure that mathematics describes is objective. The choice of basis changes the absolute number of terms, but the relative magnitude of terms for a more or less disordered state is generally fixed outside of degenerate cases.

The structure that most words describe is objective, so you haven't distinguished math as a language. (Nor is mathematics entirely "objective", e.g. the axiom of choice.) And the number of terms in your chosen language with your chosen basis isn't objective: that's a fact intrinsic to your frame.

The complexity of terms is not fixed — that’s simply wrong mathematically. They’re dependent on our chosen basis. Your definition is circular, in that you’re implicitly defining “non-degenerate” as those which make your claim true.

You can’t make the whole class simplified at once, but for any state, there exists a basis in which it is simple.

This is getting tedious. The point about mathematics was simply that it carries an objectivity that natural language does not carry. But the point about natural language was always a red herring; not sure why you introduced it.

>You can’t make the whole class simplified at once

Yes, this is literally my point. The further point is that the relative complexities of two systems will not switch orders regardless of basis, except perhaps in degenerate cases. There is no "absolute" complexity, so your other points aren't relevant.

I didn’t introduce it, you did — by positing that formal language is more objective, as you’ve again done here. My original point was that mathematics is human language.

> The further point is that the relative complexities of two systems will not switch orders regardless of basis, except perhaps in degenerate cases.

Two normal bases: Fourier and wavelet; two normal signals: a wave and an impulse.

They’ll change complexity between the two bases despite everything being normal — the wave simple and impulse complex in Fourier terms; the wave complex and impulse simple in wavelet terms.

That our choice of basis makes a difference is literally why we invented wavelets.
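A rough numpy sketch of that basis dependence (my own toy illustration; it uses the trivial sample-by-sample basis as a stand-in for a maximally localized wavelet basis):

    import numpy as np

    n = 256
    t = np.arange(n)
    wave = np.sin(2 * np.pi * 8 * t / n)   # a pure tone with 8 cycles
    impulse = np.zeros(n)
    impulse[100] = 1.0                      # a single spike

    def fourier_terms(x, rel_tol=1e-6):
        """Count Fourier coefficients that are non-negligible, as a crude term count."""
        c = np.abs(np.fft.rfft(x))
        return int(np.sum(c > rel_tol * c.max()))

    def sample_terms(x, tol=1e-12):
        """Same idea in the localized (sample-by-sample) basis."""
        return int(np.sum(np.abs(x) > tol))

    # Fourier basis: the wave needs ~1 term, the impulse needs every frequency.
    print(fourier_terms(wave), fourier_terms(impulse))
    # Localized basis: the impulse needs 1 term, the wave needs a term at nearly every sample.
    print(sample_terms(impulse), sample_terms(wave))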

Yes, that is a degenerate case. We can always encode an arbitrary amount of data into the basis functions to get a maximally simple representation for some target signal. If the signal is simple (carries little information) or the basis functions are constructed from the target signal, you can get this kind of degeneracy. But degenerate cases do not invalidate the rule for the general case.

In a deterministic system you can just use the time as a way to describe a state, if you started from a known state.

You're thinking of information entropy, which is not the same concept as entropy in physics. An ice cube in a warm room can be described using a minimum description length as "ice cube in a warm room" (or a crystal structure inside a fluid space), but if you wait until the heat death of the universe, you just have "a warm room" (a smooth fluid space), which will have an even shorter MDL. Von Neumann should never have repurposed the term entropy from physics. Entropy confuses a lot of people, including me.

Maxwell's demon thought experiment implies they are the same concept. Given a complete knowledge of every particle of gas you can in principle create unphysical low entropy distributions of the particles. This[1] goes into more detail.

[1] https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_...

And somewhat surprisingly the heat death of the universe is the maximal entropy state.

Because there are an infinite number of microstates (all the particles are interchangeable) that lead to the same macrostate: nothing happening forever!

You can safely replace the terms "order" and "disorder" with "unlikely" and "likely". Simply put, entropy is a measure of how closely a system resembles its "most likely configuration". Consider the discrete entropy of a series of coin flips. Three tosses could result in the following 8 states: HHH, HHT, HTH, HTT, THH, THT, TTH, and TTT. From that we can gather that there is a 1/8 chance of getting either zero or three heads and a 3/8 chance of getting one or two heads. The latter two cases are clearly more likely (and hence associated with a higher entropy). In physics of course entropy is generally the continuous kind, not a simple set of binary microstates. But the principle is essentially the same.
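To make the counting explicit, here's a quick Python sketch of the three-toss case (my own toy illustration):

    from collections import Counter
    from itertools import product
    from math import log2

    # All 2^3 equally likely microstates of three coin tosses,
    # grouped by macrostate (the number of heads).
    microstates = list(product("HT", repeat=3))
    macro = Counter(s.count("H") for s in microstates)   # {0: 1, 1: 3, 2: 3, 3: 1}

    for heads in sorted(macro):
        print(f"{heads} heads: {macro[heads]} microstates, p = {macro[heads] / 8:.3f}")

    # Shannon entropy of the macrostate distribution (in bits):
    S = -sum((c / 8) * log2(c / 8) for c in macro.values())
    print(f"entropy = {S:.3f} bits")   # ~1.81 bits, vs 3 bits for the uniform microstate distribution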

I think the original post is confused exactly because of "tangled cords" analogies. Something being "messy" in our daily lives can be a bit subjective, so using the same analogies for natural forces may seem a tad counterintuitive actually.

Maybe it would be more fitting to say that it just so happens that our human definition of “messy” aligns with entropy, and not that someone decided what messy atoms look like.

I’d say a bucket of water is more neat than a bucket of ice, macroscopically.

>Simple example: why ice melts in a warm room.

Ice melting is simply the water molecules gaining enough kinetic energy (from collisions with the surrounding air molecules) that they break the bonds that held them in the ice crystal lattice. But at the microscopic level it's still just water molecules acting according to Newton's laws of motion (forgetting about quantum effects of course).

Now, back on the topic of the article: consider a system of 2 particles separated by some distance. Do they experience gravity? Of course they do. They start falling towards the midpoint between them. But where is entropy in this picture? How do you even define entropy for a system of 2 particles?

> But where is entropy in this picture? How do you even define entropy for a system of 2 particles?

The answer is that this doesn't happen in a system with only 2 particles. The idea of gravity as an entropic phenomenon is that you introduce some other kind of particle that permeates spacetime, so there is no system that only contains 2 particles. You may use some idea like virtual particles from quantum field theory, or you may define "quanta of space time" as something that is not technically a particle but basically works like one in a handwavy sense.

But the basic point of these entropy based theories is to explain gravity, and typically spacetime itself, as an emergent result of a collection of numerous objects of some kind. This necessarily means that they don't make sense if applied to idealized systems with very few objects - which is why they typically posit such isolated systems simply can't actually exist in reality.

Let me try to answer. Let's say the particles are experiencing gravity as a natural entropic phenomenon. They will attract until they become so close that they are now seen as a single particle. The new system has a lower entropy and a higher gravity than before.

The explanation seems very rudimentary, but that is the gist of the theory.

From my point of view, I might add the layer of information density. Every quantum fluctuation is an event, and the more particles, the more information is produced in a defined space volume. But there is no theory of information that is linked to the physics, so... let me leave it at that :).

Can you define quantum fluctuation?

It has been suggested that time too is derived from entropy. At least the single-directionality of it. That’d make entropy one of the most real phenomena in physics.

But "disordered" and "ordered" states are just what we define them to be: for example, cords are "tangled" only because we would prefer arrangements of cords with less knots, and knots form because someone didn't handle the cords carefully.

Physical processes are "real", but entropy is a figment.

I believe you are correct.

Entropy is not a physical quantity, it is a measure of how far a system is from equilibrium.

Lots of people talk about order/disorder or macro and micro states, not realizing these are things we've invented and aren't physical in nature.

> Entropy is not a physical quantity, it is a measure of how far a system is from equilibrium.

That’s funny because the original thermodynamic entropy is defined only for systems in equilibrium.

From whom? Clausius?

It doesn't make a lot of sense to me because a system at equilibrium cannot undergo any further diffusion, so there's no potential "entropy increase".

Maybe the issue is that, like an ideal gas, a perfect equilibrium just doesn't occur.


You need some additional assumptions. Only near equilibrium / in the thermodynamic limit is the system linear in entropy. What governs physical processes such as the ones you mention is conservation and dynamics pushing toward equipartition of energy - but outside that regime these are no longer "theorems".


> Physical entropy governs real physical processes

> the measure is not the same thing as the phenomenon it describes.

There is some tension between those claims.

The latter seems to support the parent comment’s remark questioning whether a “fundamental physical interaction could follow from entropy”.

It seems more appropriate to say that entropy follows from the physical interaction - not to be confused with the measure used to describe it.

One may say that pressure is an entropic force and physical entropy governs the real physical process of gas expanding within a piston.

However, one may also say that it's the kinetic energy of the gas molecules that governs the physical process - which arguably is a more fundamental and satisfactory explanation.

Bekenstein-Hawking entropy goes up when an Event Horizon increases in radius. That means some mass "falling onto" an EH. So this implies, if our universe is actually a 3D EH, both time and increasing entropy can be explained by one thing: Increasing size of our EH. That is, mass falling onto our EH from outside our universe. It also happens to elegantly replace the Big Bang nonsense theory with something that makes sense. It explains the universe expansion as well. Assuming our universe is a 3D EH makes lots of things make sense that don't otherwise make sense.

Good question. You are absolutely right that entropy is always fundamentally a way to describe our lack of perfect knowledge of the system [0].

Nevertheless there is a distinct "reality" to entropic forces, in the sense that it is something that can actually be measured in the lab. If you are not convinced then you can look at:

https://en.wikipedia.org/wiki/Entropic_force

and in particular the example that is always used in a first class on this topic:

https://en.wikipedia.org/wiki/Ideal_chain

So when viewed in this way entropy is not just a "made-up thing", but an effective way to describe observed phenomena. That makes it useful for effective but not fundamental laws of physics. And indeed the wiki page says that entropic forces are an "emergent phenomenon".
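To see how an entropic force falls out of pure counting, here's a rough sketch of a 1D version of that ideal chain (my own toy illustration, not from the linked article): N links each pointing left or right, with Omega(x) the number of configurations whose end-to-end distance is x.

    from math import comb, log

    N = 100  # number of links, each of unit length, pointing left (-1) or right (+1)

    def omega(x):
        """Number of chain configurations with end-to-end distance x."""
        if (N + x) % 2 != 0 or not (-N <= x <= N):
            return 0
        return comb(N, (N + x) // 2)

    # Entropy S/k_B = ln(Omega) drops sharply as the chain is stretched:
    for x in (0, 20, 40, 60, 80, 100):
        print(x, round(log(omega(x)), 2))

    # There are vastly more coiled configurations than extended ones, so stretching
    # the chain lowers its entropy - the gradient dS/dx is the entropic restoring force.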

Therefore, any reasonable person believing in entropic gravity will automatically call gravity an emergent phenomenon. They must conclude that there is a new, fundamental theory of gravity to be found, and this theory will "restore" the probabilistic interpretation of entropy.

The reason entropic gravity is exciting and exotic is that many other searches for this fundamental theory start with a (more or less) direct quantization of gravity, much like one can quantize classical mechanics to arrive at quantum mechanics. Entropic gravity posits that this is the wrong approach, in the same way that one does not try to directly quantize the ideal gas law.

[0] Let me stress this: there is no entropy without probability distributions, even in physics. Anyone claiming otherwise is stuck in the nineteenth century, perhaps because they learned only thermodynamics but not statistical mechanics.

Sure, I'm not denying that entropy exists as a concept that can be used to explain things macroscopically. But like you said, its origins are statistical. To me, temperature is also a similar "made up" concept. We can only talk about temperature because a sufficiently large group of particles will converge to a single-parameter distribution with their velocities. A single particle in isolation doesn't have a temperature.

So if they say gravity might be an entropic effect, does that mean that they assume there's something more fundamental "underneath" spacetime that - in the statistical limit - produces the emergent phenomenon of gravity? So it isn't the entropy of matter that they talk about, but the entropy of something else, like the grains of spacetime or whatever.

When you go deeper into physics, mass and energy don't seem real either, in that, like entropy, they're emergent properties of a system rather than fixed, localized things.

I always thought of the energy of a system (kinetic + potential) as a useful mathematical invariant that helps us predict systems rather than a physical thing. If you put a cart at the top of a hill, then the cart has more potential energy (but only from certain reference frames). It doesn't feel like that potential energy is physical. It doesn't have a specific location; it's a property of the whole Earth-cart system. And yet, it's this total energy that gives rise to the physical properties we're familiar with. In fact, almost all of the mass in your body comes not from the mass of elementary particles, but from the potential energy in the bonds between quarks. Your mass is more than 99% from potential energy.

And then when you get into Quantum Field Theory, it turns out particles (like electrons) are no longer truly point particles but rather another emergent phenomenon from ripples in an underlying field. A particle is just a model that describes it well when looked at from a distance. (I hope I'm not butchering that, as I'm not a physicist.)

So mass, matter, energy, and entropy are all emergent properties of a system rather than being localized, "real" things in the way we'd intuitively think. And at that point, I'm not sure how we would define "real" or why it would be a useful distinction. Is there a useful insight to be gained by putting entropy in a different category of realness than mass?

Yes, exactly. The model is based on (in the first approach) a “lattice” of some type of undiscovered particle-like thing (what they refer to as “qubits” in the article, which is unfortunate because it is NOT the same “qubit” from quantum computing) permeating space time. Or maybe more aptly, it is that lattice from which spacetime emerges. And what we observe as the force of gravity emerges from the entropic forces happening in this lattice.

I'm an idiot, let's get that out of the way first. I think that your temperature analogy answered your own question.

I guess my question in turn is: if we imagine a universe at the end of time(?), one that is maybe dominated by a few black holes and not much else, would an observer experience gravity if placed sufficiently far away? Or even further, if nothing is left in the universe at all? Assuming that doesn't cause a big crunch, rip, or whatever...

> You are absolutely right that entropy is always fundamentally a way to describe our lack of perfect knowledge of the system [0].

> [0] Let me stress this: there is no entropy without probability distributions, even in physics.

The second item doesn't entail the first. Probabilities can be seen as a measure of lack of knowledge about a system, but it isn't necessarily so. A phenomenon can also be inherently/fundamentally probabilistic. For example, wave function collapse is, to the best of our knowledge, an inherently non-deterministic process. This is very relevant to questions about the nature of entropy - especially since we have yet to determine if it's even possible for a large system to be in a non-collapsed state.

If it turns out that there is some fundamental process that causes wave function collapse even in perfectly isolated quantum systems, then it would be quite likely that entropy is related to such a process, and that it may be more than a measure of our lack of knowledge about the internal state of a system, and instead a measurement of the objective "definiteness" of that state.

I am aware that objective collapse theories are both unpopular and have some significant hurdles to overcome - but I also think that from a practical perspective, the gap between the largest systems we have been able to observe in pure states versus the smallest systems we could consider measurement devices is still gigantic and leaves us quite a lot of room for speculation.

Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions. Quantum mechanics assumes, as the name implies, that reality at the smallest level can be quantised, so it's completely appropriate to apply entropy to describing things at the microscopic scale.

If we knew the exact state of all particles in an enclosed system, we could calculate future states exactly. No need to calculate possible states.

Quantum uncertainty actually says no to this. There is an ‘error’ in any propagating probability field.

Since that's not possible in any physical system of one or more particles, it's irrelevant.

> Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions.

There are no probability distributions over possible states when there is perfect knowledge of the state.

> Quantum mechanics

Entropy is also zero for a pure quantum state. You won’t have entropy without imperfect knowledge.

> There are no probability distributions over possible states when there is perfect knowledge of the state.

I know very little about physics but I thought that the leading interpretations of quantum physics say that the probability distribution is all we can know about a system. The entropy is not due to a lack of information about the quantum state, but because the outcomes are inherently stochastic?

Entropy is about the state - not about “outcomes”.

“All we can know” is the precise state - at least in principle - and entropy is zero in that case.

Just look at the definition of entropy. Knowledge about a system never enters the equation.

S := -k_B sum p_i ln (p_i)

As the other replier said, despite your dismissiveness, the knowledge about the system is in the probabilities, so it's right there in the equation.

Suppose you flip a coin. Before flipping the coin, your knowledge is "heads or tails". After flipping it, your knowledge becomes one of either heads or tails. The amount of information you gained by resolving your imperfect knowledge is the entropy of the distribution.

The same model works for physical entropy without much modification; the imperfect knowledge is the difference between knowing a macrostate versus the exact microstate.
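A minimal numeric sketch of that last point (my own illustration): with the Gibbs formula, a sharply known state has zero entropy, while spreading the probability over more microstates raises it.

    import numpy as np

    k_B = 1.380649e-23  # J/K

    def gibbs_entropy(p):
        """S = -k_B * sum_i p_i ln p_i over microstate probabilities p_i."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                      # terms with p_i = 0 contribute nothing
        return -k_B * float(np.sum(p * np.log(p)))

    # Perfect knowledge: the system is definitely in microstate 0 -> S = 0.
    print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))
    # Imperfect knowledge: four equally likely microstates -> S = k_B ln 4 (the maximum).
    print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]), k_B * np.log(4))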

You’re glossing over an important point: your knowledge of the future state of the system is “heads or tails”.

One of the things entropy tells us is how a system is likely to evolve in the future. But looking at this another way, entropy actually helps dictate how it will evolve in the future. And we can prove that mathematically.

What are you talking about?

Entropy doesn't tell us that at all. Entropy tells us how much data is encoded between two ways of describing a value. It has nothing to do with time at all.

You're confusing two different perspectives on entropy without understanding either properly. In physics, which is what's being discussed, entropy is certainly closely related to time.

p_i

Edit to add lots of words:

In the definition of entropy

S := -k_B sum p_i ln (p_i)

knowledge about the system enters the equation in the p_i terms.

The other term is a constant so it’s not like there are many other choices to link the entropy to the system!

Please communicate in full sentences with me.

I can only guess that your objection is something about probabilities. A microstate has a probability independent of my knowledge of the system just like the probability of having a royal flush doesn't change after drawing five cards. The probability of me ending the game with a royal flush might, but that is not what we mean by these probabilities.

The same microstate will have different probabilities depending on what are the constraints or measurements used in _your_ description of the system.

If you choose to describe the system using its microstate - and you know it - there are no probabilities anywhere.

You can of course know something and choose to ignore it - the entropy is still a reflection of the uncertainty (actual or for the sake of a lower-resolution model).

But the point is that, regardless of how you choose to describe or even measure the system, it will need exactly as much heat to raise its temperature by 1 degree (or it will need as much kinetic energy to increase the average velocity of the constituents by the same amount, in the microstate framework). So there is some objective nature to entropy, it's not merely a function of subjective knowledge of a system. Or, to put it another way, two observers with different amounts of information on the microstate of a system will still measure it as having the same entropy.

There is some objective nature to the operational definition of entropy based on an experimental setup where you fix the volume and measure the temperature or whatever.

And this is related to the statistical mechanical definition of entropy based on the value of the corresponding state variables.

But it’s not a property of the microstate - it’s a property of the macrostate which makes sense only in the context of the experimental constraints and measurements.

If we relate entropy to work that can be extracted, someone with a better understanding of the state of the system and operational access to additional degrees of freedom can extract additional work.

Thermodynamics assumes the state variables provide a complete description of the system. Statistical mechanics assumes the state variables provide an incomplete description of the system - and work out what that entails.

> But it’s not a property of the microstate - it’s a property of the macrostate which makes sense only in the context of the experimental constraints and measurements.

The same can be said about the wavefunction then, right? You can't directly observe it, you can only use it to predict the statistics of a particular experimental setup. So, at worst, entropy is as real as wavefunction amplitudes.

> If we relate entropy to work that can be extracted someone with a better understanding of the state of the system and operational access to additional degrees of freedom can extract additional work.

Is this actually true? Per my understanding, if I give you three containers, two of which are filled with some kind of gas that you know nothing about, and the third with a mix of those same gases, you can measure their entropy using thermodynamic experiments and tell which of the three is a mix of the other two because it will have a higher entropy. So, you can extract more work from one of the boxes despite not knowing anything more about it.

> Per my understanding

What’s the source of that understanding? You cannot measure the entropy, only changes of entropy - which will be the same (for an ideal gas).

Edit: we already had this discussion, by the way: https://news.ycombinator.com/item?id=42434862

> You cannot measure the entropy, only changes of entropy

You can measure the changes in entropy from a minimal state and integrate - and you'll get the "total" entropy.

And thanks for looking it up! I remembered a very similar conversation and was wondering if you were the same person, but was a bit lazy to search :)

> You can measure the changes in entropy from a minimal state and integrate - and you'll get the "total" entropy

That doesn’t help with the following (at least if you keep those kinds of gas in gas state):

> if I give you three containers […] you can measure their entropy using thermodynamic experiments and tell which of the three is a mix of the other two because it will have a higher entropy

But you can weigh them, it's much easier.

The way we use the word 'entropy' in computer science is different from how it's used in physics. Here is a really good explanation in a great talk! https://youtu.be/Kr_S-vXdu_I?si=1uNF2g9OhtlMAS-G&t=2213

Entropy is certainly a physical “thing”, in the sense that it affects the development of the system. You can equally well apply your argument that it isn’t a physical thing because it doesn’t exist on a microscopic scale to temperature. Temperature doesn’t exist when you zoom in on single particles either.

There’s no reason to involve our knowledge of the system. Entropy is a measure of the number of possible micro states for a given system, and that number exists independently of us.

Exactly! Temperature isn't fundamental. It's a statistical measure made up by humans. It's certainly a very useful abstraction but it isn't a fundamental property. It describes an emergent pattern observed in larger systems. Same for entropy (and also AFAIK angular momentum).

It's entirely possible I'm wrong about any of the above but if so I've yet to encounter a convincing line of reasoning.

With the same line of reasoning you can argue (and some have) that numbers aren’t fundamental and are only made up by humans. In my view, the number of swans in the pond is three whether humans exist or not.

I think that's conflating distinct concepts. Numbers aren't a physical property. Whether or not they're a fundamental concept is largely a philosophical question.

Meanwhile temperature (for example), while physical, is a statistical property of a macroscopic system. It isn't fundamental - rather it's an abstraction over a (very) large set - but it is nonetheless a quantification of physical characteristics. Whereas a number, for example 2 rocks, is not quantifying any physical property of the individual rocks themselves.

The existence of a rock is definitely a physical property, and thus the number of rocks in a collection is a physical property, at least as much as temperature is for a gas.

Numbers aren't "existence" they're a property of an abstract set. The observer draws a fairly arbitrary mental line around a cluster of objects. Whether the set numbers two, three, or four cannot be determined by observing one of the rocks in isolation.

Fundamental physical properties don't depend on other members of an arbitrary set as determined by an observer.

> at least as much as temperature is

But here I've been explicitly claiming that temperature isn't. It is a statistical abstraction over a physical property but it is not itself a physical property in any inherent sense. That's ... kind of the entire point I've been trying to make about an entire class of concepts.

Why would you need to observe anything in isolation for it to be a physical property? This is the proposition that I find absurd. It’s like saying that a hydrogen atom isn’t really a hydrogen atom, but just an arbitrary set of nucleons.

> Entropy is a measure of the number of possible micro states for a given system, and that number exists independently of us.

That number also exists independently of the system! I can imagine any system and calculate the corresponding number.

(And for an even more philosophical question, does the “system” really exist independently of us? What separates the “system” from anything else? Is every subset of the universe a “system”?)

Something to consider is that entropy has units of measure. Why would a purely philosophical concept be given units of Joules per Kelvin?

I'm only reminded about this because, though I'm a physicist, I've been out of school for more than 3 decades, and decided that I owed myself a refresher on thermodynamics. This coincided with someone on HN recommending David Tong's textbook-quality lecture notes:

https://www.damtp.cam.ac.uk/user/tong/statphys.html

I think at least the first few pages are readable to a layperson, and address the issue of our imperfect knowledge of the precise configuration of a system containing, say, 1e23 particles.

But if we knew all of those relationships, the system would still have entropy.

This comment thread is exhibit N-thousand that "nobody really understands entropy". My basic understanding goes like this:

In thermodynamics, you describe a system with a massive number of microstates/dynamical variables according to 2-3 measurable macrostate variables. (E.g. `N, V, E` for an ideal gas.)

If you work out the dynamics of those macrostate variables, you will find that (to first order, i.e. in the thermodynamic limit) they depend only on the form of the entropy function of the system `S(E, N, V)`, e.g. Maxwell relations.

If you measured a few more macrostate variables, e.g. the variance in energy `sigma^2(E)` and the center of mass `m`, or anything else, you would be able to write new dynamical relations that depend on a new "entropy" `S(E, N, V, sigma^2(E), m)`. You could add 1000 more variables, or a million—e.g. every pixel of an image—basically up until the point where the thermodynamic limit assumptions cease to hold.

The `S` function you'd get will capture the contribution of every-variable-you're-marginalizing-over to the relationships between the remaining variables. This is the sense in which it represents "imperfect knowledge". Entropy dependence arises mathematically in the relationships between macrostate variables—they can only couple to each other by way of this function which summarizes all the variables you don't know/aren't measuring/aren't specifying.

That this works is rather surprising! It depends on some assumptions which I cannot remember (on convexity and factorizability and things like that), but which apply to most or maybe all equilibrium thermodynamic-scale systems.

For the ideal gas, say, the classical-mechanics, classical-probability, and quantum-mechanical descriptions of the system all reduce to the same `S(N, V, E)` function under this enormous marginalization—the most "zoomed-out" view of their underlying manifold structures turns out to be identical, which is why they all describe the same thing. (It is surprising that seemingly obvious things like the size of the particles would not matter. It turns out that the asymptotic dynamics depend only on the information theory of the available "slots" that energy can go into.)
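As a concrete instance of such an `S(N, V, E)`, here's a sketch from memory using the Sackur-Tetrode expression for a monatomic ideal gas (the argon numbers below are just illustrative):

    import numpy as np

    k_B = 1.380649e-23   # J/K
    h = 6.62607015e-34   # J*s

    def sackur_tetrode(N, V, E, m):
        """S(N, V, E) of a monatomic ideal gas of N particles of mass m in volume V with energy E."""
        return N * k_B * (np.log((V / N) * (4 * np.pi * m * E / (3 * N * h**2)) ** 1.5) + 2.5)

    # Roughly one mole of argon (m ~ 6.63e-26 kg) in 22.4 L at 273 K, with E = (3/2) N k_B T:
    N, T = 6.022e23, 273.15
    S = sackur_tetrode(N, V=22.4e-3, E=1.5 * N * k_B * T, m=6.63e-26)
    print(S / (N * k_B))   # ~18.4 (dimensionless entropy per particle)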

All of this appears as an artifact of the limiting procedure in the thermodynamic limit, but it may be the case that it's more "real" than this—some hard-to-characterize quantum decoherence may lead to this being not only true in an extraordinarily sharp first-order limit, but actually physically true. I haven't kept up with the field.

No idea how to apply this to gravity though.

Entropy can be defined as the logarithm of the number of microstates in a macrostate. Since transitions between microstates are reversible, and therefore one-to-one (they can't converge on any particular microstate, can't go in cycles, and have to be something like a random walk), we're more likely to end up in a macrostate that holds a larger number of microstates.

For example, there are many more ways your headphone cord can be tangled than untangled, so when you pull it out of your pocket, and it's in a random state, then it's very likely to be tangled.

If entropy causes gravity, that means there are somehow more microstates with all the mass in the universe smooshed together than microstates with all the mass in the universe spread apart.

Even if we take that view, gravity is still basically a similar case. What we call "gravity" is really an apparent force that isn't a force at all when seen from a full 4D POV.

Imagine sitting outside the whole universe, from t=0 to t=end, and observing one whole block. Then the trajectories of matter, unaffected by any force at all, are those we call gravitational.

From this pov, it makes a lot more sense to connect gravity with some orderly or disorderly features of these trajectories.

Inertia, on this view, is just a kind of hysteresis the matter distribution of the universe has -- i.e., a kind of remembered deformation that persists as the universe evolves.

> From this pov, it makes a lot more sense to connect gravity with some orderly or disorderly features of these trajectories.

On the contrary, entropic gravity works pretty well for the Newtonian view of gravity as a force, and not the GR view of gravity as a deformation of space time and analogous to acceleration. Acceleration is a very elementary concept, one you find even in microscopic descriptions. Gravity being essentially the same thing makes it far more elementary than a concept like entropy, which only applies to large groups of particles.

So, if the GR picture is the right one, if gravity and acceleration are essentially the same thing, it's very hard to see how that aligns with gravity being an emergent phenomenon that only happens at large scales. However, if gravity is just a tendency for massive objects to come together, as in the Newtonian picture, that is perfectly easy to imagine as an entropic effect.

If you want to only have one possible past (i.e. can't destroy information) then when you end up in one branch of quantum state you need to "store" enough information to separate you from other branches, and you really do need to have multiple possible microstates to differentiate them. If you look post-factum obviously you did end up in a specific state, but statistics do their work otherwise.

For years I thought the same about entropy. But now I believe it is fundamentally impossible to know each microstate, irrespective of our tools and methods. And this happens in the manner of, and due to, Heisenberg's uncertainty principle.

So all events are irreversible and entropy is always increasing. Perfection is only theoretical.

It sounds like you're talking about information entropy, which to my understanding is analogous to but not the same as entropy in physics?

It pretty much is the same, except that entropy in physics usually has a constant in front of it.


Entropy is the opposite of potential

Entropy is complicated beyond just a Rankine or Carnot cycle.

Biology thrives at the ebbs, flows, and eddies of entropy. Predation. Biochemical flux. There are arrows flowing every which way, and systems that keep it finely tuned.

This theory, based on my surface-level reading and understanding, is that the aggregate particle-level entropy within sub-light-speed systems creates gravity.

> It's a made-up thing by humans.

All of physics is made up by humans.