Just look at the definition of entropy. Knowledge about a system never enters the equation.

S := -k_B sum p_i ln (p_i)

As the other replier said, despite your dismissiveness, the knowledge about the system is in the probabilities, so it's right there in the equation.

Suppose you flip a coin. Before flipping it, your knowledge is "heads or tails". After flipping it and looking, your knowledge becomes either heads or tails. The amount of information you gained by resolving your imperfect knowledge is the entropy of the distribution.
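
For anyone who wants the numbers, here's a tiny Python sketch of that coin example (values purely illustrative): the entropy of the before-flip distribution is exactly the information you gain when the outcome is revealed.

    import math

    # Our knowledge before the flip: a fair coin, heads or tails.
    p = [0.5, 0.5]

    # Shannon entropy in bits: the information gained when the outcome is revealed.
    H_bits = -sum(pi * math.log2(pi) for pi in p)   # 1.0 bit

    # The same quantity in thermodynamic units, S = -k_B sum p_i ln(p_i).
    k_B = 1.380649e-23  # J/K
    S = -k_B * sum(pi * math.log(pi) for pi in p)   # ~9.57e-24 J/K, i.e. k_B ln 2

    print(H_bits, S)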

The same model works for physical entropy without much modification; the imperfect knowledge is the difference between knowing a macrostate versus the exact microstate.

You’re glossing over an important point: your knowledge of the future state of the system is “heads or tails”.

One of the things entropy tells us is how a system is likely to evolve in the future. But looking at this another way, entropy actually helps dictate how it will evolve in the future. And we can prove that mathematically.

What are you talking about?

Entropy doesn't tell us that at all. Entropy tells us how much data it takes to encode a value when translating between two ways of describing it. It has nothing to do with time at all.

You're confusing two different perspectives on entropy without understanding either properly. In physics, which is what's being discussed, entropy is certainly closely related to time.

p_i

Edit to add lots of words:

In the definition of entropy

S := -k_B sum p_i ln (p_i)

knowledge about the system enters the equation in the p_i terms.

The other term is a constant so it’s not like there are many other choices to link the entropy to the system!
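
To make that concrete, here's a toy sketch (nothing beyond the formula above; the 8-microstate system is made up): the same physical system gets a different S depending on how much of your knowledge goes into the p_i.

    import math

    k_B = 1.380649e-23  # J/K

    def gibbs_entropy(probs):
        # S = -k_B sum p_i ln(p_i), with the usual convention 0 * ln(0) = 0
        return -k_B * sum(p * math.log(p) for p in probs if p > 0)

    # Coarse knowledge: 8 microstates treated as equally likely.
    coarse = [1/8] * 8                       # S = k_B ln 8

    # Sharper knowledge (say, after a measurement): only 2 microstates remain possible.
    sharper = [1/2, 1/2, 0, 0, 0, 0, 0, 0]   # S = k_B ln 2

    # Exact knowledge of the microstate: S = 0.
    exact = [1, 0, 0, 0, 0, 0, 0, 0]

    print(gibbs_entropy(coarse), gibbs_entropy(sharper), gibbs_entropy(exact))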

Please communicate in full sentences with me.

I can only guess that your objection is something about probabilities. A microstate has a probability independent of my knowledge of the system, just like the probability of having a royal flush doesn't change after drawing five cards. The probability of me ending the game with a royal flush might change, but that is not what we mean by these probabilities.

The same microstate will have different probabilities depending on the constraints or measurements used in _your_ description of the system.

If you choose to describe the system using its microstate - and you know it - there are no probabilities anywhere.

You can of course know something and choose to ignore it - the entropy is still a reflection of the uncertainty (actual, or assumed for the sake of a lower-resolution model).

But the point is that, regardless of how you choose to describe or even measure the system, it will need exactly the same amount of heat to raise its temperature by 1 degree (or, in the microstate framework, the same amount of kinetic energy to increase the average velocity of its constituents by the same amount). So there is some objective nature to entropy; it's not merely a function of subjective knowledge of a system. Or, to put it another way, two observers with different amounts of information about the microstate of a system will still measure it as having the same entropy.
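
A rough numerical illustration of that claim, using one mole of a monatomic ideal gas as a stand-in (all values illustrative): the heat needed for a 1 K rise, and the corresponding entropy change, come out the same no matter who does the bookkeeping.

    import math

    R = 8.314           # J/(mol K)
    n = 1.0             # one mole of a monatomic ideal gas (illustrative choice)
    C_v = 1.5 * n * R   # heat capacity at constant volume

    T1, T2 = 300.0, 301.0

    # Heat needed to raise the temperature by 1 K at constant volume:
    Q = C_v * (T2 - T1)            # ~12.5 J, independent of what any observer knows

    # The associated entropy change, dS = integral of C_v/T dT:
    dS = C_v * math.log(T2 / T1)   # ~0.0415 J/K

    print(Q, dS)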

There is some objective nature to the operational definition of entropy based on an experimental setup where you fix the volume and measure the temperature or whatever.

And this is related to the statistical mechanical definition of entropy based on the value of the corresponding state variables.

But it’s not a property of the microstate - it’s a property of the macrostate which makes sense only in the context of the experimental constraints and measurements.

If we relate entropy to the work that can be extracted, then someone with a better understanding of the state of the system and operational access to additional degrees of freedom can extract additional work.

Thermodynamics assumes the state variables provide a complete description of the system. Statistical mechanics assumes the state variables provide an incomplete description of the system - and works out what that entails.

> But it’s not a property of the microstate - it’s a property of the macrostate which makes sense only in the context of the experimental constraints and measurements.

The same can be said about the wavefunction then, right? You can't directly observe it; you can only use it to predict the statistics of a particular experimental setup. So, at worst, entropy is as real as wavefunction amplitudes.

> If we relate entropy to the work that can be extracted, then someone with a better understanding of the state of the system and operational access to additional degrees of freedom can extract additional work.

Is this actually true? Per my understanding, if I give you three containers, two of which are filled with two different gases that you know nothing about, and the third with a mix of those same gases, you can measure their entropy using thermodynamic experiments and tell which of the three is a mix of the other two because it will have a higher entropy. So you can extract more work from one of the boxes despite not knowing anything more about it.
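
For reference, the textbook ideal-gas entropy of mixing this claim implicitly appeals to (a sketch for one mole total, equal parts of two distinct gases; whether a thermodynamic measurement can actually get at this number is a separate question):

    import math

    R = 8.314          # J/(mol K)
    n = 1.0            # total moles in the mixed container (illustrative)
    x1, x2 = 0.5, 0.5  # equal mole fractions of the two distinct gases

    # Ideal entropy of mixing, relative to the two pure gases:
    dS_mix = -n * R * (x1 * math.log(x1) + x2 * math.log(x2))   # ~5.76 J/K, i.e. n R ln 2

    print(dS_mix)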

> Per my understanding

What’s the source of that understanding? You cannot measure the entropy, only changes of entropy - which will be the same (for an ideal gas).

Edit: we already had this discussion, by the way: https://news.ycombinator.com/item?id=42434862

> You cannot measure the entropy, only changes of entropy

You can measure the changes in entropy from a minimal state and integrate - and you'll get the "total" entropy.
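
Roughly what that integration looks like in practice, as a sketch (the C_p table here is made up, and real determinations also add latent-heat terms at each phase transition):

    # Hypothetical heat-capacity data C_p(T) in J/(mol K); real tables come from calorimetry.
    temps = [10, 50, 100, 150, 200, 250, 300]         # K
    c_p   = [0.5, 8.0, 16.0, 20.0, 23.0, 25.0, 26.0]  # made-up values

    # Third-law-style "total" entropy: S(T) ~ integral of C_p/T dT from near 0 K up to T,
    # approximated here with the trapezoidal rule.
    S = 0.0
    for (T1, C1), (T2, C2) in zip(zip(temps, c_p), zip(temps[1:], c_p[1:])):
        S += 0.5 * (C1 / T1 + C2 / T2) * (T2 - T1)

    print(S)   # J/(mol K): an estimate of the entropy at 300 K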

And thanks for looking it up! I remembered a very similar conversation and was wondering if you were the same person, but was a bit lazy to search :)

> You can measure the changes in entropy from a minimal state and integrate - and you'll get the "total" entropy

That doesn’t help with the following (at least as long as you keep those gases in the gas phase):

> if I give you three containers […] you can measure their entropy using thermodynamic experiments and tell which of the three is a mix of the other two because it will have a higher entropy

But you can weigh them, which is much easier.