# Log likelihood

## Introduction

When evidence is entered into a Bayesian network or Dynamic Bayesian network, the probability (likelihood) of that evidence, denoted $P\left(e\right)$, can be calculated.

The probability of evidence $P\left(e\right)$ indicates how likely it is that the network could have generated that data; the lower the value, the less likely.

Note: The log-likelihood $\log\left(P\left(e\right)\right)$ is also reported, because $P\left(e\right)$ itself can often evaluate to zero due to underflow, caused by the repeated multiplication of small probability values in floating point arithmetic.
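The underflow problem can be demonstrated with a short sketch (the per-observation likelihood values below are hypothetical, chosen only to illustrate the effect):

```python
import math

# Hypothetical per-observation likelihoods; each value is small,
# as is common when many variables are observed.
likelihoods = [1e-5] * 80

# Naive repeated multiplication underflows to exactly 0.0 in double
# precision (the product 1e-400 is below the smallest positive double).
p = 1.0
for v in likelihoods:
    p *= v
print(p)  # 0.0 due to floating point underflow

# Summing logs instead avoids underflow entirely.
log_p = sum(math.log(v) for v in likelihoods)
print(log_p)  # -921.03... (i.e. 80 * log(1e-5))
```

This is why the log-likelihood remains informative even when the raw likelihood is reported as zero.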

*(Figure: an example of zero likelihood.)*

Note: Log-likelihood values are often used to detect unusual data, a technique known as anomaly detection.

## Range of values

In networks containing only discrete nodes, the likelihood $P\left(e\right)$ lies in the range [0, 1], so the log-likelihood lies in the range [-Infinity, 0]. In networks containing one or more continuous nodes (with or without discrete nodes), the likelihood is a probability density (pdf) value, which lies in the range [0, +Infinity], so the log-likelihood lies in the range [-Infinity, +Infinity].
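A density value greater than 1 (and hence a positive log-likelihood) is easy to produce with a narrow Gaussian; the mean and standard deviation below are illustrative values, not taken from any particular model:

```python
import math

# Gaussian pdf evaluated at its mean: 1 / (sigma * sqrt(2 * pi)).
# With a small sigma, the density at the mean exceeds 1.
mu, sigma = 0.0, 0.01
pdf_at_mean = 1.0 / (sigma * math.sqrt(2.0 * math.pi))

print(pdf_at_mean)            # ~39.89, a density greater than 1
print(math.log(pdf_at_mean))  # ~3.69, a positive log-likelihood
```

This is why the log-likelihood of evidence involving continuous nodes can legitimately be positive, unlike the purely discrete case.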

## Log-likelihood -> Probability

While log-likelihood values from the same model can easily be compared, the absolute value of a log-likelihood is somewhat arbitrary and model dependent. The `HistogramDensity` class in the API can be used to build a distribution of log-likelihood values for a model, which in turn can be used to convert a log-likelihood into a value in the range [0, 1].

This technique is often used in anomaly detection applications when we wish to report the health of a system as a single meaningful value.
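The idea can be sketched without the `HistogramDensity` class itself, using an empirical distribution of baseline log-likelihoods. Everything below is a hypothetical, library-agnostic illustration: the function name `to_health_score` and the baseline values are invented for this sketch, not part of the API.

```python
import bisect

def to_health_score(log_likelihood, baseline_lls):
    """Map a log-likelihood to [0, 1] via the empirical CDF of baseline
    (normal-operation) log-likelihoods.

    baseline_lls must be sorted ascending. Scores near 0 indicate data
    the model considers unusual (potential anomalies)."""
    rank = bisect.bisect_right(baseline_lls, log_likelihood)
    return rank / len(baseline_lls)

# Hypothetical baseline, collected by scoring known-good data with the model.
baseline = sorted([-12.4, -10.1, -9.8, -9.5, -9.1, -8.7, -8.2, -7.9])

print(to_health_score(-9.0, baseline))   # typical value -> mid-range score
print(to_health_score(-50.0, baseline))  # far below baseline -> 0.0
```

Reporting the empirical-CDF score rather than the raw log-likelihood gives a single health value in [0, 1] that is comparable across models.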