
Entropy

Entropy (or Shannon entropy) quantifies the uncertainty or randomness within a probability distribution. It measures how unpredictable or dispersed the outcomes of a system are: higher entropy means more unpredictability, while lower entropy indicates more certainty. In essence, entropy captures the average amount of "surprise" one might expect from each possible outcome. For instance, a fair coin toss, with equal chances of heads or tails, has higher entropy than a biased coin. Entropy is central to information theory, where quantifying uncertainty helps optimize decisions, minimize errors, and manage information efficiently.
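
The following is a minimal sketch (not part of any particular API) showing how Shannon entropy can be computed for the fair and biased coins mentioned above; the function name and probabilities are illustrative only.

```python
import numpy as np

def shannon_entropy(probs, base=2.0):
    """Shannon entropy of a discrete distribution, in bits by default."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]  # terms with probability 0 contribute nothing
    return float(-np.sum(p * np.log(p) / np.log(base)))

print(shannon_entropy([0.5, 0.5]))  # fair coin        -> 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin      -> ~0.469 bits
print(shannon_entropy([1.0, 0.0]))  # certain outcome  -> 0.0
```

Note how the biased coin has lower entropy than the fair coin, and a certain outcome has an entropy of zero.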

info

A value of zero indicates an outcome that is certain, for example a distribution with evidence set on its variable(s).

The expression H(X) denotes the entropy of a variable X, where X can also represent a group of variables.

The expression H(X|Z) denotes the conditional entropy of X given Z. Again, X and Z can each be groups of variables.
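
For reference, in the discrete case this notation corresponds to the standard textbook definitions (not quoted from this page):

```latex
H(X) = -\sum_{x} P(x)\,\log P(x)
\qquad
H(X \mid Z) = -\sum_{x,z} P(x,z)\,\log P(x \mid z)
```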

Support

| Variable types | Multi-variate | Conditional | Notes |
| --- | --- | --- | --- |
| Discrete | Yes | Yes | Multiple & conditional since 7.12 |
| Continuous | Yes | Yes | Since 7.12 |
| Hybrid | Yes | Yes | Approximate, since 7.16 |