The Entropy (or Shannon entropy) of a distribution is a measure of its uncertainty.
A value of zero indicates an outcome that is certain, for example a distribution with evidence set on its variable(s).
The expression H(X) is used to denote the entropy of a variable X. Note that X can also represent a group of variables.
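For a discrete variable X with probability distribution p(x), entropy has the standard definition below; the base of the logarithm only determines the units (base 2 for bits, the natural logarithm for nats).

```latex
H(X) = -\sum_{x} p(x)\,\log p(x)
```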
The expression H(X|Z) is used to denote the conditional entropy of X given Z. Again, X or Z can represent groups of variables.
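For discrete variables, conditional entropy can be written in terms of the joint and conditional distributions, or equivalently as a difference of entropies:

```latex
H(X \mid Z) = -\sum_{x,z} p(x,z)\,\log p(x \mid z) = H(X,Z) - H(Z)
```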
| Distribution | Entropy | Conditional entropy | Notes |
|---|---|---|---|
| Discrete | Yes | Yes | Multiple & conditional since 7.12 |
| Hybrid | Yes | Yes | Approximate, since 7.16 |
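As an illustration only, the discrete definitions above can be computed directly from probability tables. The sketch below does not use any particular product API; the helper names `entropy` and `conditional_entropy` are hypothetical, and logarithms are taken in base 2 (bits) by assumption.

```python
import numpy as np

def entropy(p, base=2.0):
    """Shannon entropy H(X) of a discrete distribution p (array of probabilities)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # treat 0 * log(0) as 0
    return -np.sum(p * np.log(p)) / np.log(base)

def conditional_entropy(joint, base=2.0):
    """Conditional entropy H(X|Z) from a joint table joint[x, z] = p(x, z)."""
    joint = np.asarray(joint, dtype=float)
    p_z = joint.sum(axis=0)               # marginal p(z)
    h_xz = entropy(joint.ravel(), base)   # H(X, Z)
    h_z = entropy(p_z, base)              # H(Z)
    return h_xz - h_z                     # H(X|Z) = H(X,Z) - H(Z)

# A certain outcome has zero entropy, a uniform one has maximum entropy.
print(entropy([1.0, 0.0]))   # 0.0
print(entropy([0.5, 0.5]))   # 1.0 bit

# If X is completely determined by Z, the conditional entropy is zero.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(conditional_entropy(joint))  # 0.0
```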