Namespace BayesServer.Statistics
Classes
Entropy
Calculates entropy, joint entropy, or conditional entropy, which quantify the uncertainty in the states of a discrete distribution.
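As an illustration of the underlying computation (plain Python, not the Bayes Server API; function names here are hypothetical), Shannon entropy of a discrete distribution is H(P) = -Σ pᵢ log pᵢ, and conditional entropy follows from the chain rule H(X|Y) = H(X,Y) - H(Y):

```python
import math

def entropy(p, base=2.0):
    """Shannon entropy -sum(p_i * log(p_i)) of a discrete distribution.

    Terms with zero probability contribute nothing (0 * log 0 := 0).
    """
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0.0)

def conditional_entropy(joint, base=2.0):
    """H(X|Y) computed from a joint table joint[x][y] via H(X,Y) - H(Y)."""
    flat = [pxy for row in joint for pxy in row]
    p_y = [sum(col) for col in zip(*joint)]  # marginal over Y
    return entropy(flat, base) - entropy(p_y, base)

# A fair coin has 1 bit of entropy; a certain outcome has 0.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([1.0]))       # 0.0
```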
JensenShannon
Methods for computing the Jensen–Shannon divergence, which measures the similarity between probability distributions.
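The Jensen–Shannon divergence is the symmetrized, smoothed form of Kullback–Leibler divergence: JS(P, Q) = ½ D(P‖M) + ½ D(Q‖M) with M = ½(P + Q). A minimal sketch in plain Python (illustrative only, not the library's API):

```python
import math

def kl_divergence(p, q, base=2.0):
    """D(P||Q) = sum p_i * log(p_i / q_i); q_i must be > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi, base) for pi, qi in zip(p, q) if pi > 0.0)

def js_divergence(p, q, base=2.0):
    """Jensen-Shannon divergence: symmetric, and bounded by 1 when base=2."""
    m = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m, base) + 0.5 * kl_divergence(q, m, base)

# Identical distributions diverge by 0; disjoint ones reach the maximum of 1 bit.
print(js_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(js_divergence([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

Unlike KL divergence, JS is always finite because the mixture M assigns positive probability wherever either input does.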
KullbackLeibler
Calculates the Kullback–Leibler divergence between two distributions over the same variables, D(P||Q).
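The computation is D(P‖Q) = Σ pᵢ log(pᵢ/qᵢ), which is zero exactly when P = Q and is asymmetric in its arguments. A standalone sketch (not the Bayes Server API):

```python
import math

def kl_divergence(p, q, base=2.0):
    """Kullback-Leibler divergence D(P||Q) over matching discrete states.

    Convention: terms with p_i == 0 contribute 0; the result is infinite
    if q_i == 0 at a state where p_i > 0 (absolute continuity is required).
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            if qi == 0.0:
                return math.inf
            total += pi * math.log(pi / qi, base)
    return total

print(kl_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # positive, measured in bits
```

Note the asymmetry: D(P‖Q) generally differs from D(Q‖P), which is why it is written with the ordered notation D(P||Q).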
MutualInformation
Calculates mutual information or conditional mutual information, which measures the dependence between two variables.
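Mutual information can be read off a joint probability table as I(X;Y) = Σ p(x,y) log[p(x,y) / (p(x)p(y))]; it is zero iff X and Y are independent. An illustrative plain-Python version (not the library's API):

```python
import math

def mutual_information(joint, base=2.0):
    """I(X;Y) from a joint probability table joint[x][y]."""
    p_x = [sum(row) for row in joint]          # marginal over X
    p_y = [sum(col) for col in zip(*joint)]    # marginal over Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0.0:
                mi += pxy * math.log(pxy / (p_x[i] * p_y[j]), base)
    return mi

# Independent variables carry no information about each other;
# perfectly correlated binary variables share 1 bit.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```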
Enums
LogarithmBase
Determines the base of the logarithm to use during calculations such as mutual information.
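The base only rescales the result: base 2 yields bits, the natural logarithm yields nats, and the two differ by a constant factor of ln 2. A quick check (plain Python, illustrative only):

```python
import math

def entropy(p, base):
    """Shannon entropy of a discrete distribution in the given log base."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0.0)

p = [0.5, 0.25, 0.25]
bits = entropy(p, 2.0)       # 1.5 bits
nats = entropy(p, math.e)    # the same uncertainty expressed in nats

# Converting between bases is multiplication by a constant: nats = bits * ln 2.
print(math.isclose(nats, bits * math.log(2.0)))  # True
```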