The entropy (or Shannon entropy) of a distribution is a measure of its uncertainty.


A value of zero indicates that the outcome is certain, for example a distribution whose variable(s) have evidence set.

The expression H(X) denotes the entropy of a variable X, where X can also represent a group of variables.

The expression H(X|Z) denotes the conditional entropy of X given Z. Again, X and Z can each be groups of variables.
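The definitions above can be illustrated with a minimal sketch for discrete distributions (this is an illustrative example, not part of any particular API). Entropy is computed as H(X) = -Σ p(x) log₂ p(x), and conditional entropy via the identity H(X|Z) = H(X,Z) - H(Z):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits of a discrete distribution.

    Zero-probability outcomes contribute nothing (0 * log 0 is taken as 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """Conditional entropy H(X|Z) in bits from a joint table.

    joint[z][x] holds P(Z=z, X=x). Uses H(X|Z) = H(X,Z) - H(Z).
    """
    h_joint = entropy([p for row in joint for p in row])
    h_z = entropy([sum(row) for row in joint])
    return h_joint - h_z

# A certain outcome has zero entropy; a fair coin has 1 bit.
print(entropy([1.0, 0.0]))
print(entropy([0.5, 0.5]))  # 1.0

# If X is an exact copy of Z, knowing Z removes all uncertainty: H(X|Z) = 0.
print(conditional_entropy([[0.5, 0.0], [0.0, 0.5]]))
```

Note that when evidence is set (the first print, a probability-1 outcome), the entropy is zero, matching the statement above.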


| Variable types | Multi-variate | Conditional | Notes |
| --- | --- | --- | --- |
| Discrete | Yes | Yes | Multiple & conditional since 7.12 |
| Continuous | Yes | Yes | Since 7.12 |
| Hybrid | Yes | Yes | Approximate, since 7.16 |