As the other replier said, despite your dismissiveness, the knowledge about the system is in the probabilities, so it's right there in the equation.
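
To spell that out (assuming the discrete Shannon form, since no equation was actually written upthread):

    H(X) = -\sum_i p_i \log_2 p_i

The probabilities p_i are the description of your knowledge, and the entropy is a function of nothing else.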

Suppose you flip a coin. Before the flip, your knowledge of the outcome is "heads or tails". After the flip, that uncertainty resolves to either heads or tails. The amount of information you gained by resolving your imperfect knowledge is the entropy of the distribution.
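
A quick numerical check of that claim (a minimal sketch in Python; entropy_bits is just my name for it):

    import math

    def entropy_bits(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)), skipping impossible outcomes
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy_bits([0.5, 0.5]))  # fair coin: exactly 1 bit gained per flip
    print(entropy_bits([0.9, 0.1]))  # biased coin: ~0.47 bits; the outcome was less uncertain

The more lopsided your prior knowledge, the less information the flip's result gives you.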

The same model works for physical entropy without much modification: the imperfect knowledge is the gap between knowing the macrostate and knowing the exact microstate.
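
A toy version of that gap, assuming 10 coins and the macrostate "exactly 5 heads" (the numbers are illustrative only):

    import math

    omega = math.comb(10, 5)   # 252 microstates are compatible with that macrostate
    print(math.log2(omega))    # ~7.98 bits of microstate detail you still don't know

Boltzmann's S = k_B ln(Omega) is the same count of compatible microstates, just expressed in thermodynamic units.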

You’re glossing over an important point: your knowledge of the future state of the system is “heads or tails”.

One of the things entropy tells us is how a system is likely to evolve in the future. But looking at this another way, entropy actually helps dictate how it will evolve. And we can prove that mathematically.
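
To sketch the standard statistical argument behind that (my code, not a rigorous proof): count what fraction of equally likely microstates belong to the high-entropy macrostate. For N coins:

    import math

    def frac_near_half(n, tol=0.05):
        # Fraction of the 2^n equally likely microstates whose head-count is
        # within tol of 50%, i.e. that belong to the high-entropy macrostate.
        total = sum(math.comb(n, k) for k in range(n + 1) if abs(k / n - 0.5) <= tol)
        return total / 2**n

    for n in (10, 100, 1000, 10000):
        print(n, frac_near_half(n))  # ~0.25, ~0.73, ~0.998, ~1.0

As N grows, essentially all microstates sit in the maximum-entropy macrostate, so a system wandering among its microstates overwhelmingly ends up there. That is the statistical core of the second law.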

What are you talking about?

Entropy doesn't tell us that at all. Entropy tells us how much data it takes to encode a value, i.e. the information gap between two ways of describing the same thing. It has nothing to do with time.
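
For the record, the coding fact being appealed to here is Shannon's source coding theorem: for an optimal prefix code, the expected codeword length L satisfies

    H(X) \le \mathbb{E}[L] < H(X) + 1

Nothing in that statement involves time, which is the point being made.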

You're confusing two different perspectives on entropy without understanding either properly. In physics, which is what's being discussed, entropy is certainly closely related to time.
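
Concretely, the connection is the second law: for an isolated system, entropy can only increase or stay constant,

    \frac{dS}{dt} \ge 0

which is precisely a statement about how entropy constrains evolution in time, and it is where the thermodynamic arrow of time comes from.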