My interpretation of entropy is that if you have a large number of equally probable states, but many of them are indistinguishable at some coarser level of description, then the system will likely end up in whichever group of indistinguishable states is most numerous.
For example, if you flip N coins, there are 2^N possible outcomes once the flip is done, each with probability 1/2^N. There's only one outcome where every coin shows heads. Likewise, there's only one outcome where coins 1 through N/2 are heads and coins N/2+1 through N are tails, so that particular sequence is exactly as improbable. But if all we care about is the macroscopic question "how many heads did we get", the number of sequences with exactly k heads is the binomial coefficient C(N, k), which peaks sharply at k = N/2. So we'll almost certainly see "roughly" N/2 heads, especially as N gets larger.
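A minimal sketch of that counting argument in Python (the variable names and the choice of N = 100 are mine, just for illustration): any single specific sequence is astronomically unlikely, but the macrostate "around half heads" dominates because it's backed by vastly more sequences.

```python
from math import comb

n_coins = 100

# Probability of any one specific sequence (e.g. all heads).
p_single = 1 / 2**n_coins

# Macrostate "exactly 50 heads": C(100, 50) sequences, each with
# probability 1/2^100.
p_half = comb(n_coins, n_coins // 2) * p_single

# Macrostate "between 40 and 60 heads": sum the counts over that band.
p_near_half = sum(comb(n_coins, k) for k in range(40, 61)) * p_single

print(f"any single sequence: {p_single:.3e}")    # ~7.9e-31
print(f"exactly 50 heads:    {p_half:.4f}")      # ~0.0796
print(f"40-60 heads:         {p_near_half:.4f}") # ~0.9648
```

So even though "all heads" and any particular half-and-half sequence are equally probable as microstates, about 96% of all 2^100 sequences land within ten coins of an even split.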
The second law is simply saying there's a tendency toward these macroscopically most likely groups of states; entropy measures how many microstates back a given macrostate.
A great video and practical demonstration of the concept for anyone interested: https://youtu.be/VCXqELB3UPg?si=lwRxmXLctY1ENu0U