"Complexity" is a hugely problematic term when used in this way - remember that entropy and complexity are related, but they are not interchangeable. A complex system can have lower entropy than a simpler system, and conversely, a system can have high entropy but be relatively simple. By mingling these terms without specifying objective reference points, it all just comes out as word salad.

This paper just reads like an attempt at sounding smart while actually saying little.

> a system can have high entropy but be relatively simple.

A good example is anything that Kolmogorov-compresses well. For example, by almost any measure the output of a pseudorandom number generator has high entropy. Yet it has low algorithmic information content (low Kolmogorov complexity), since the program that generates the sequence, plus its state, is really small. A quick sketch below makes this concrete.
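Minimal Python sketch, standard library only (the seed and sizes are arbitrary choices for illustration):

```python
import zlib
import random

# A tiny generator: fixed seed, then a megabyte of pseudorandom bytes.
# The "program plus state" is just this script and one integer seed.
random.seed(42)
data = bytes(random.getrandbits(8) for _ in range(1_000_000))

# A general-purpose compressor finds essentially no redundancy:
# the "compressed" output is typically a hair LARGER than the input.
compressed = zlib.compress(data, 9)
print(f"original:   {len(data):>9} bytes")
print(f"compressed: {len(compressed):>9} bytes")

# By statistical measures the data looks nearly maximally random (high
# entropy), yet its Kolmogorov complexity is bounded by this script plus
# the generator's internal state (a few kB), however long the output.
```

The compressor here stands in for any empirical entropy estimate: it sees incompressible data, while the shortest description of that data never grows past the script itself.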

I think a better example is just hot gas. Heat up a tube of gas, and its entropy will increase, with no effect on its complexity. Still not terribly compressible either though.
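To put a rough number on that (textbook ideal-gas result, assuming a fixed amount of gas heated at constant volume with molar heat capacity C_V):

    ΔS = n C_V ln(T2 / T1)

which is positive whenever T2 > T1, while the macroscopic description ("hot gas in a tube") stays exactly as short as before.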

Yes indeed. As I understand it, entropy is about which states are more likely: high-entropy states are the ones that can be realized in the most ways.
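Concretely, Boltzmann's formula ties a macrostate's entropy to the number Ω of microstates that realize it:

    S = k_B ln Ω

so high-entropy macrostates are precisely the ones compatible with the most microstates, which is why a system is overwhelmingly likely to be found in them.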

I wonder if it always increases, though? Will there eventually be enough entropy that any change causes it to decrease or oscillate? (At universe / reachable-universe scale.)

> I wonder if it always increases though?

It always increases in an isolated system. That caveat is almost always missing from pop-sci discussions of entropy, but it is crucial.

> Eventually there will be enough entropy that any change may cause it to reduce or oscillate?

Assuming that the universe is actually an isolated system, entropy will reach a maximum (it cannot oscillate). It is interesting to speculate, and of course our theories are imperfect and we are certainly missing something. In particular, the relationship between time and entropy is not straightforward. Very roughly: is entropy a function of time, where time could be defined independently, or is time itself a consequence of entropy changes?

In the first case, we can suppose that if the universe reached an entropy maximum, we would be far enough outside the conditions under which our theories work that entropy could simply decrease with time (i.e., the rule that entropy increases with time is only valid close to our usual conditions).

But in the second case, it would mean that the universe had reached the end of time. It could evolve in any conceivable way (in terms of the fundamental laws of physics), and the arrow of time would always point to the same moment. "What comes after?" would be a question just as meaningless as "what came before the Big Bang?"

In any case, there are a lot of assumptions and uncertainty. The story does not do the subject any justice.

Yes, we call that state "heat death". Note that the second law is actually that entropy never decreases; it's allowed to stay constant for certain interactions (for instance I'm pretty sure an elastic collision preserves entropy).
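For reference, the quantitative statement is the Clausius inequality,

    dS ≥ δQ / T

with equality exactly for reversible processes. For an isolated system δQ = 0, so dS ≥ 0: entropy never decreases, but a perfectly reversible interaction (like an idealized elastic collision) leaves it constant.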

That is why the complex is distinct from the complicated.