Calculates entropy, joint entropy, or conditional entropy, which quantify the uncertainty in the states of a discrete distribution.
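Since the summary does not spell out the computation, here is a minimal numpy sketch of all three quantities; the `entropy` helper and the joint table `pxy` are illustrative names, not this library's interface:

```python
import numpy as np

def entropy(p, base=2):
    """Shannon entropy of a discrete distribution (0 * log 0 treated as 0)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability states
    return -np.sum(p * np.log(p)) / np.log(base)

# Joint distribution of two variables as a table: pxy[i, j] = P(X=i, Y=j).
pxy = np.array([[0.25, 0.25],
                [0.40, 0.10]])

h_xy = entropy(pxy.ravel())           # joint entropy H(X, Y)
h_x = entropy(pxy.sum(axis=1))        # marginal entropy H(X)
h_y_given_x = h_xy - h_x              # chain rule: H(Y|X) = H(X, Y) - H(X)
print(h_x, h_xy, h_y_given_x)
```

The chain rule H(Y|X) = H(X, Y) - H(X) is what lets the conditional entropy fall out of the other two.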
Methods for computing the Jensen–Shannon divergence, which measures the similarity between two probability distributions.
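The Jensen–Shannon divergence is a symmetrized, always-finite relative of the Kullback–Leibler divergence; it can be written as the entropy of the midpoint distribution M = (P + Q) / 2 minus the average entropy of P and Q. A small sketch under that identity (again illustrative, not this library's API):

```python
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def jensen_shannon(p, q):
    """JSD(P, Q) = H(M) - (H(P) + H(Q)) / 2, with M = (P + Q) / 2."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return entropy_bits(m) - 0.5 * (entropy_bits(p) + entropy_bits(q))

p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
print(jensen_shannon(p, q), jensen_shannon(q, p))  # symmetric; at most 1 bit
```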
Calculates the Kullback–Leibler divergence, D(P||Q), between two distributions defined over the same variables.
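Unlike the Jensen–Shannon divergence, D(P||Q) is asymmetric and becomes infinite when Q assigns zero probability to an outcome that P allows. A hedged numpy sketch of the standard formula, not a definitive implementation:

```python
import numpy as np

def kl_divergence(p, q):
    """D(P||Q) = sum_x p(x) * log2(p(x) / q(x)), in bits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                 # terms with p(x) = 0 contribute nothing
    if np.any(q[mask] == 0):
        return np.inf            # Q forbids an outcome that P allows
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])
print(kl_divergence(p, q))       # note: generally != kl_divergence(q, p)
```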
Calculates mutual information or conditional mutual information, which measure the dependence between two variables, optionally conditioned on a third.
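Mutual information can be computed directly from a joint probability table via I(X;Y) = sum over x, y of p(x,y) log2( p(x,y) / (p(x) p(y)) ); the conditional form follows from the entropy identity I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z) using a three-way table. A sketch of the two-variable case, with illustrative names:

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), in bits."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal P(X)
    py = pxy.sum(axis=0, keepdims=True)   # marginal P(Y)
    mask = pxy > 0                        # skip zero-probability cells
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px * py)[mask]))

# Perfectly correlated bits: knowing X determines Y, so I(X;Y) = 1 bit.
pxy = np.array([[0.5, 0.0],
                [0.0, 0.5]])
print(mutual_information(pxy))
```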
Determines the base of the logarithm used in calculations such as entropy and mutual information; base 2 gives results in bits, base e in nats.
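Changing the base only rescales the result, since H_b(X) = H_e(X) / ln(b); a quick demonstration of that scaling, using the same illustrative `entropy` helper as above:

```python
import numpy as np

def entropy(p, base):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

p = np.array([0.7, 0.2, 0.1])
print(entropy(p, 2))                  # ~1.157 bits
print(entropy(p, np.e))               # ~0.802 nats
print(entropy(p, np.e) / np.log(2))   # matches the base-2 value
```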