Entropy

In information theory, entropy is a measure of the uncertainty in a random variable. (via Wikipedia)
Almost sounds like a circular definition, but it makes sense.
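To make the definition concrete, here is a minimal sketch of Shannon entropy in Python: for a random variable with outcome probabilities p_i, H = -Σ p_i · log2(p_i), measured in bits. The function name `shannon_entropy` and the empirical-frequency estimate from a list of samples are my own illustration, not part of the quoted definition.

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Estimate Shannon entropy (in bits) from a list of observed outcomes.

    Probabilities are taken as empirical frequencies: p = count / total.
    H = -sum(p * log2(p)) over all distinct outcomes.
    """
    counts = Counter(samples)
    total = len(samples)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# A fair coin (two equally likely outcomes) carries 1 bit of uncertainty.
print(shannon_entropy(["H", "T"]))  # 1.0

# A certain outcome carries no uncertainty at all.
print(shannon_entropy(["H", "H"]))  # 0.0
```

The "measure of uncertainty" reading shows up directly: the more evenly spread the outcomes, the higher the value, and a variable whose outcome is already known scores zero.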