Entropy

In information theory, entropy is a measure of the uncertainty in a random variable. (via Wikipedia)
It almost sounds like a circular definition, but it makes sense.
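To make the definition concrete, here is a small sketch of Shannon entropy, H = -Σ p·log₂(p), in bits (the function name `shannon_entropy` is my own, not from the source):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit.
print(shannon_entropy([0.5, 0.5]))  # → 1.0

# A certain outcome carries no uncertainty: 0 bits.
print(shannon_entropy([1.0]))
```

The "uncertainty" reading falls out directly: a uniform distribution maximizes entropy, while a distribution concentrated on one outcome drives it to zero.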