BayesServer.Statistics Namespace 
Class  Description  

Entropy 
Calculates entropy, joint entropy, or conditional entropy, which quantify the uncertainty in the states of a discrete distribution.
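As an illustration of the quantity this class computes (not the Bayes Server API itself), a minimal NumPy sketch of Shannon entropy for a discrete distribution, with a `base` parameter mirroring the logarithm-base choice mentioned below:

```python
import numpy as np

def entropy(p, base=2.0):
    """Shannon entropy of a discrete distribution p (bits when base=2)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log(0) = 0
    return -np.sum(p * np.log(p)) / np.log(base)

# A fair coin has maximum uncertainty over two states.
print(entropy([0.5, 0.5]))  # 1.0 (bit)
```

Joint entropy is the same formula applied to the flattened joint table, and conditional entropy H(X|Y) = H(X,Y) - H(Y).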
 
JensenShannon 
Methods for computing the Jensen–Shannon divergence, which measures the similarity between two probability distributions.
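For intuition (again a generic sketch, not the library's API), the Jensen–Shannon divergence can be written in terms of entropies of the mixture M = (P + Q) / 2; with base-2 logarithms it lies in [0, 1]:

```python
import numpy as np

def jensen_shannon(p, q, base=2.0):
    """JS divergence via entropies of the mixture M = (P + Q) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    def h(d):
        d = d[d > 0]
        return -np.sum(d * np.log(d)) / np.log(base)
    return h(m) - 0.5 * (h(p) + h(q))

# Identical distributions give 0; disjoint distributions give 1 (base 2).
print(jensen_shannon([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

Unlike the Kullback–Leibler divergence, this quantity is symmetric and always finite.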
 
KullbackLeibler 
Calculates the Kullback–Leibler divergence, D(P||Q), between two distributions defined over the same variables.
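The formula being computed, shown as an illustrative NumPy sketch (not the Bayes Server API): D(P||Q) = Σ p(x) log(p(x) / q(x)), which requires q(x) > 0 wherever p(x) > 0:

```python
import numpy as np

def kullback_leibler(p, q):
    """D(P||Q) in bits; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Note the asymmetry: D(P||Q) generally differs from D(Q||P).
print(kullback_leibler([0.5, 0.5], [0.9, 0.1]))
print(kullback_leibler([0.9, 0.1], [0.5, 0.5]))
```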
 
MutualInformation 
Calculates mutual information or conditional mutual information, which measure the statistical dependence between two variables (optionally conditioned on others).
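To make the definition concrete (a generic sketch under the assumption of a known joint probability table, not the library's API), mutual information can be computed from a joint distribution P(X, Y) and its marginals:

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table P(X, Y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)  # marginal P(X)
    py = joint.sum(axis=0, keepdims=True)  # marginal P(Y)
    mask = joint > 0
    return np.sum(joint[mask] * np.log2((joint / (px * py))[mask]))

# Independent variables (joint = outer product of marginals) give I = 0.
print(mutual_information(np.outer([0.3, 0.7], [0.4, 0.6])))  # 0.0
```

Equivalently, I(X;Y) = H(X) + H(Y) - H(X,Y); conditional mutual information applies the same idea after conditioning on a third variable.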

Enumeration  Description  

LogarithmBase 
Determines the base of the logarithm used during calculations such as mutual information.
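The choice of base only rescales results by a constant factor, since H_b = H_nats / ln(b); base 2 yields bits, base e nats, and base 10 hartleys. A small sketch of the conversion (generic math, independent of the enumeration's actual members):

```python
import math

# Entropy of a fair coin expressed in different logarithm bases.
h_nats = math.log(2)               # natural log -> nats
h_bits = h_nats / math.log(2)      # base 2  -> 1.0 bit
h_dits = h_nats / math.log(10)     # base 10 -> ~0.301 hartleys
print(h_bits, h_dits)
```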
