You can safely replace the terms "order" and "disorder" with "unlikely" and "likely". Simply put, entropy is a measure of how closely a system resembles its "most likely configuration". Consider the discrete entropy of a series of coin flips. Three tosses can result in one of 8 equally likely states: HHH, HHT, HTH, HTT, THH, THT, TTH, and TTT. From that we can see that there is a 1/8 chance of getting zero heads (and likewise for three heads), but a 3/8 chance of getting exactly one head (and likewise for exactly two). Those latter two cases are clearly more likely, because more microstates produce them, and hence they are associated with a higher entropy. In physics, of course, entropy is generally the continuous kind rather than a simple set of binary microstates, but the principle is essentially the same.
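
To make the counting concrete, here is a minimal Python sketch (an illustration, not part of the original argument) that enumerates the 8 microstates, groups them by the "number of heads" macrostate, and reports a Boltzmann-style entropy as the base-2 log of each macrostate's microstate count:

```python
from itertools import product
from collections import Counter
from math import log2

# Enumerate all 2^3 equally likely microstates of three coin flips.
outcomes = list(product("HT", repeat=3))

# Group microstates by macrostate: the total number of heads.
macrostates = Counter(seq.count("H") for seq in outcomes)

for heads, count in sorted(macrostates.items()):
    # Probability of the macrostate is (microstates in it) / (total microstates).
    p = count / len(outcomes)
    # Boltzmann-style entropy (in bits) of the macrostate: log2 of its microstate count.
    print(f"{heads} heads: {count}/8 microstates, p = {p:.3f}, S = {log2(count):.3f} bits")
```

Running it shows the one-head and two-head macrostates each cover 3 of the 8 microstates, so they are both the most probable and the highest-entropy configurations, which is the point of the example.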