# Mutual information

The mutual information (MI) between two variables is a measure of the dependence between them. It quantifies the amount of information obtained about one variable by observing the other.

The expression I(X;Y) is used to denote the mutual information between variables X and Y. X and/or Y can also represent groups of variables.

The expression I(X;Y|Z) is used to denote the conditional mutual information between X and Y given Z. Again, X, Y, or Z can be groups of variables.

A value of zero indicates that X and Y are independent; larger values indicate stronger dependence.
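For discrete variables, I(X;Y) can be computed directly from a joint probability table. The sketch below is a minimal illustration of the definition, not the calculator's implementation; the function name and table layout are assumptions for the example.

```python
import math

def mutual_information(joint, base=2.0):
    """I(X;Y) from a joint probability table.

    joint: 2-D list where joint[i][j] = P(X=i, Y=j).
    base: 2.0 for bits, math.e for nats.
    """
    # Marginal distributions P(X) and P(Y).
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0.0:  # terms with zero probability contribute nothing
                mi += pxy * math.log(pxy / (px[i] * py[j]), base)
    return mi

# Independent binary variables: I(X;Y) = 0
independent = [[0.25, 0.25], [0.25, 0.25]]

# Perfectly correlated binary variables: I(X;Y) = 1 bit
correlated = [[0.5, 0.0], [0.0, 0.5]]
```

Note that each term uses the joint probability P(x, y) against the product of the marginals P(x)P(y), so the sum is zero exactly when the joint factorizes, i.e. when X and Y are independent.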

## Mutual information calculator

Since version 7.13

The mutual information calculator is available from the Analysis tab in the user interface. It can be used to calculate the mutual information between variables X and Y, or between groups of variables, and can optionally be conditioned on a variable or variables Z.

Mutual information can be calculated in bits (base 2) or nats (base e).

Mutual information involving continuous variables is typically reported in nats, since the natural logarithm arises when working with Gaussian distributions, which belong to the exponential family.
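As a worked example of why nats are natural for Gaussians: for a bivariate Gaussian with correlation coefficient ρ, the mutual information has the closed form I(X;Y) = -½ ln(1 - ρ²), which is already in nats. Converting to bits just divides by ln 2. The helper names below are assumptions for illustration.

```python
import math

def gaussian_mi_nats(rho):
    """I(X;Y) in nats for a bivariate Gaussian with correlation rho."""
    return -0.5 * math.log(1.0 - rho * rho)

def nats_to_bits(nats):
    """Convert a quantity of information from nats to bits."""
    return nats / math.log(2.0)

mi_nats = gaussian_mi_nats(0.8)   # ~0.5108 nats
mi_bits = nats_to_bits(mi_nats)   # ~0.7370 bits
```

When ρ = 0 the variables are independent and the mutual information is zero, consistent with the interpretation above; as |ρ| approaches 1 the mutual information grows without bound.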