This page summarizes the main features of Bayes Server™. For more information please visit the Documentation Center.

A user interface for graphically building Bayesian networks & Dynamic Bayesian networks, which easily connects to a variety of data sources for setting evidence, parameter learning, charting, data sampling, and performing batch queries.

The user interface is Windows only; however, the Bayes Server APIs are cross platform, so any models created can be deployed to Windows, Linux, or Mac OS X.

The Bayes Server™ .NET library (API) can easily be called from languages such as C#, F#, VB.NET, and C++.NET, as well as many other languages that can interface with .NET libraries. It is a pure cross-platform .NET library that has been tested on Windows, Linux, and Mac OS X.

For more information, see the .NET API section in the Code center.

The Bayes Server™ Java library (API) is a pure Java library for Java 6 or later, and has been tested on Windows, Linux, and Mac OS X. It has also been tested with Scala.

For more information, see the Java API section in the Code center.

The Bayes Server .NET and Java APIs allow Bayes Server to be used from R, Python, Excel functions, Matlab & Apache Spark. For more information and sample code see the Code center.

Decision graphs (influence diagrams / LIMIDs) for making decisions under uncertainty.

Dynamic Bayesian networks (DBN) for modelling time series or sequential data.

Bayes Server supports distributed processing on platforms such as Apache Spark and Apache Hadoop, including support for distributed learning of time series models.

Parameter learning, which supports multiple threads, both discrete and continuous variables, Dynamic Bayesian networks (e.g. Time Series), and learning with missing data.

Bayes Server™ supports structural learning (determining the links in a network from data). The algorithm supports discrete, continuous and hybrid networks, and dynamic Bayesian networks for time series and sequence models.

Bayes Server™ supports online learning (adaptation). This can be used to incrementally adjust the parameters in a network, for example from a stream of data. The algorithm supports discrete, latent, and noisy nodes.

Automatically extract insight from data. Use Auto Insight to automatically detect significant or anomalous patterns from a model.

Bayes Server™ supports both discrete & continuous latent variables, as well as latent variables in dynamic Bayesian networks for time series and sequence models.

Bayes Server™ also supports missing data in general.

Missing data support allows evidence on some variables to be omitted. As well as both discrete & continuous nodes/variables in standard Bayesian networks, support is included for Dynamic Bayesian networks, nodes with multiple variables, and parameter learning.

Bayes Server™ also supports latent variables (variables that are never observed in the data).

Parameter tuning is used to find the value/range of a parameter value which results in a certain probability (or range) for a variable of interest (the hypothesis variable).

Sensitivity to parameters is used to determine how the probability of a variable of interest (the hypothesis variable) is affected when the value of one or more parameters in the network are changed.

Data Explorer allows evidence to be loaded from a data source such as a spreadsheet or database, and transferred to a network, or charted. Supports discrete and continuous variables as well as Dynamic Bayesian networks (time series).

Query Explorer for displaying probability queries, such as the probability of variables, the history of variables and time series queries. Queries can be saved and restored at a later date.

- Calculate the probability of individual variables given evidence (e.g. P(A) and P(B) given the evidence).
- Calculate the joint probability over multiple variables given evidence (e.g. P(A,B) given the evidence).
- Calculate the probability of time series variables given evidence (e.g. P(A_{t=2}) and P(B_{t=5}) given the evidence).
- Calculate the joint probability of time series variables given evidence (e.g. P(A_{t=2}, A_{t=5}) given the evidence).
- Calculate a range of time series queries given evidence (e.g. P(A_{t=1..25}) given the evidence).
- Calculate the log likelihood of evidence.
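As a sketch of what such queries compute, here is a brute-force enumeration on a tiny two-node discrete network. This is illustrative only: it is not the Bayes Server API, and the node names and probabilities are made up.

```python
# Illustrative two-node discrete network A -> B, with queries computed
# by brute-force enumeration (not the Bayes Server API).

# P(A) and P(B | A) as plain dictionaries.
p_a = {"a0": 0.3, "a1": 0.7}
p_b_given_a = {
    ("a0", "b0"): 0.9, ("a0", "b1"): 0.1,
    ("a1", "b0"): 0.2, ("a1", "b1"): 0.8,
}

def joint(a, b):
    """P(A=a, B=b) via the chain rule."""
    return p_a[a] * p_b_given_a[(a, b)]

# Query P(B=b1) with no evidence: marginalize A out of the joint.
p_b1 = sum(joint(a, "b1") for a in p_a)

# Query P(A=a1 | B=b1): condition on the evidence via Bayes' theorem.
p_a1_given_b1 = joint("a1", "b1") / p_b1
```

A real inference engine avoids full enumeration (exact algorithms such as the junction tree scale far better), but the quantities being computed are the same.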

Batch queries for running predictions against multiple cases in a data source such as a database or spreadsheet. Predictions can then be compared and charted.

Calculate the likelihood or log-likelihood of evidence set on a network. Both discrete and continuous variables are supported as well as Dynamic Bayesian networks (time series).

Log likelihood values are often used to detect anomalous data.
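As a minimal sketch of the idea, assume the "model" is just a single Gaussian fitted to some training data; evidence whose log likelihood falls far below typical values is flagged as anomalous. The data and threshold below are invented for illustration (in Bayes Server the same idea applies to the log likelihood of evidence under a full Bayesian network):

```python
import math

# Fit a Gaussian to training data (our stand-in for a learned model).
train = [9.8, 10.1, 10.0, 9.9, 10.2]
mu = sum(train) / len(train)
var = sum((x - mu) ** 2 for x in train) / len(train)

def log_likelihood(x):
    """Log density of x under Normal(mu, var)."""
    return -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)

threshold = -5.0  # chosen for illustration

is_anomalous = log_likelihood(25.0) < threshold  # far from the training data
is_normal = log_likelihood(10.0) >= threshold    # near the training data
```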

As well as hard evidence, support is included for soft (virtual) evidence on both standard Bayesian networks and Dynamic Bayesian networks. Evidence can be entered via the **Network viewer**, via the Evidence window (which allows copying and pasting from spreadsheets), or by using Data Explorer.

As well as discrete variables, Bayes Server™ supports Continuous variables using Conditional Gaussian distributions. Support for continuous variables is also included for Dynamic Bayesian networks (time series).
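A Conditional Gaussian distribution can be sketched as follows: a continuous node whose mean depends linearly on a continuous parent, with a separate set of parameters for each state of a discrete parent. All names and numbers below are illustrative, not the Bayes Server API.

```python
import math

# Continuous node Y with discrete parent A and continuous parent X.
# Given A=a, Y ~ Normal(intercept[a] + slope[a] * x, variance[a]).
params = {
    "a0": {"intercept": 1.0, "slope": 0.5, "variance": 1.0},
    "a1": {"intercept": -2.0, "slope": 1.5, "variance": 0.5},
}

def density_y(y, a, x):
    """Density of Y=y given discrete parent state a and continuous parent x."""
    p = params[a]
    mean = p["intercept"] + p["slope"] * x
    var = p["variance"]
    return math.exp(-(y - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
```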

Data Sampling feature, allowing the generation of sample data to help visualize
networks, and generate test data. Supports both discrete and continuous variables,
Dynamic Bayesian networks (e.g. Time Series), and missing data.

Support for multiple variables per node. This allows, for example, the direct specification of mixtures of full covariance Gaussians.

Support for noisy nodes, including efficient inference and parameter learning.

Bayes Server™ has a number of algorithms for both exact and approximate inference.

Comparison queries are useful for comparing one set of probabilities against another. This is often used to automatically derive insight from a network.

Value of Information (VOI) is a tool for determining which variables are most likely to reduce the uncertainty in a variable of interest.

A tree query determines the resources required to calculate queries on a Bayesian network or dynamic Bayesian network given the current evidence scenario.

Relevance optimization ensures only distributions relevant to a query are used.

Evidence propagation ensures implicit evidence is inferred from any explicit evidence. (e.g. Male=>Pregnant=False)

Support is included for disconnected networks.

Even though Bayes Server™ supports both discrete and continuous variables, sometimes it can be useful to discretize continuous data, generating a discrete variable, where each state represents a continuous interval.
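The simplest form of discretization is equal-width binning, sketched below with made-up values (Bayes Server's discretization options may differ; this only illustrates the mapping from continuous values to interval states):

```python
# Map continuous values to discrete states, where each state
# represents a continuous interval (equal-width binning).

def equal_width_bins(values, n_bins):
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    return [(lo + i * width, lo + (i + 1) * width) for i in range(n_bins)]

def discretize(x, bins):
    for i, (lo, hi) in enumerate(bins):
        if x < hi or i == len(bins) - 1:  # last bin is closed on the right
            return i

values = [1.0, 2.5, 3.0, 7.5, 9.0, 10.0]
bins = equal_width_bins(values, 3)           # [1,4), [4,7), [7,10]
states = [discretize(v, bins) for v in values]
```

Other schemes (equal-frequency binning, supervised discretization) choose the interval boundaries differently but produce the same kind of discrete variable.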

Mesh queries allow visualization of predictions, by generating a 2-D surface plot.

A confusion matrix measures the performance of a Bayesian network when it is used for classification.
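A confusion matrix simply counts, for each actual class, how often each class was predicted. A sketch with invented labels:

```python
from collections import Counter

# Actual vs predicted class labels for six test cases (made up).
actual    = ["pos", "pos", "neg", "neg", "pos", "neg"]
predicted = ["pos", "neg", "neg", "neg", "pos", "pos"]

# Count each (actual, predicted) pair.
confusion = Counter(zip(actual, predicted))

tp = confusion[("pos", "pos")]  # true positives
fn = confusion[("pos", "neg")]  # false negatives
fp = confusion[("neg", "pos")]  # false positives
tn = confusion[("neg", "neg")]  # true negatives

accuracy = (tp + tn) / len(actual)
```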

A lift chart measures the performance of a Bayesian network when it is used for classification, by comparing the model's ranking of cases against what would be expected from random selection.

Discrete and continuous histograms can be generated based on data loaded in data explorer, or generated by data sampling or batch queries.

Bayes Server supports queries which calculate the most probable configuration (explanation) of the nodes/variables that do not have evidence. For example, most probable explanation queries can calculate the most probable sequence in a time series model, using a generalized version of the Viterbi algorithm.
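The classic special case is the Viterbi algorithm on a hidden Markov model, sketched below. The states, probabilities, and observations are invented for illustration; this is not the Bayes Server API.

```python
# Viterbi: find the most probable hidden state sequence given observations.
states = ["rain", "sun"]
start = {"rain": 0.6, "sun": 0.4}
trans = {"rain": {"rain": 0.7, "sun": 0.3},
         "sun":  {"rain": 0.4, "sun": 0.6}}
emit = {"rain": {"walk": 0.1, "umbrella": 0.9},
        "sun":  {"walk": 0.8, "umbrella": 0.2}}

def viterbi(observations):
    # delta[s] = probability of the best path ending in state s.
    delta = {s: start[s] * emit[s][observations[0]] for s in states}
    paths = {s: [s] for s in states}
    for obs in observations[1:]:
        new_delta, new_paths = {}, {}
        for s in states:
            prev = max(states, key=lambda p: delta[p] * trans[p][s])
            new_delta[s] = delta[prev] * trans[prev][s] * emit[s][obs]
            new_paths[s] = paths[prev] + [s]
        delta, paths = new_delta, new_paths
    return paths[max(states, key=lambda s: delta[s])]

best_path = viterbi(["umbrella", "umbrella", "walk"])
```

A general most probable explanation query extends this idea beyond chains to arbitrary (dynamic) Bayesian network structures.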

Conflict is a measure that detects evidence that is conflicting or rare. The greater the conflict value above zero, the more likely the evidence is in conflict, or rare.
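One common definition matching this description is Jensen's conflict measure, conf = log(∏ᵢ P(eᵢ) / P(e)): individually likely evidence items that are jointly unlikely yield a positive value. Assuming that definition, a sketch with invented numbers (not the Bayes Server API):

```python
import math

def conflict(individual_probs, joint_prob):
    """Jensen's conflict measure: log(prod_i P(e_i) / P(e))."""
    return math.log(math.prod(individual_probs) / joint_prob)

# Evidence items that are individually likely but jointly rare: conflict.
high_conflict = conflict([0.8, 0.7], 0.1)   # positive => likely conflict/rare

# Evidence items that reinforce each other: no conflict.
low_conflict = conflict([0.5, 0.5], 0.4)    # negative => consistent
```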

Nodes can be added to a network based on values in a data source such as a database or spreadsheet.

Retract evidence is an option that allows predictions to be made on variables that have evidence assigned.

The direction of links can be reversed. This will maintain the overall probability distribution of the model, and may induce new links in order to do so.
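For the simplest two-node case, reversing a link is just an application of Bayes' theorem: the reversed parameterization encodes exactly the same joint distribution. A sketch with made-up numbers (larger networks may additionally require new links between the nodes' other parents):

```python
# Reverse the link A -> B on two binary nodes using Bayes' theorem.
p_a = {0: 0.4, 1: 0.6}
p_b_given_a = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.7}

# Joint distribution before reversal.
joint = {(a, b): p_a[a] * p_b_given_a[(a, b)]
         for a in (0, 1) for b in (0, 1)}

# Reversed parameterization: P(B) and P(A | B).
p_b = {b: joint[(0, b)] + joint[(1, b)] for b in (0, 1)}
p_a_given_b = {(b, a): joint[(a, b)] / p_b[b]
               for a in (0, 1) for b in (0, 1)}

# The reversed network B -> A reproduces the original joint distribution.
joint_reversed = {(a, b): p_b[b] * p_a_given_b[(b, a)]
                  for a in (0, 1) for b in (0, 1)}
```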

An optional weight column can be included in a data source, allowing a support/probability (or any positive value) to be associated with a case. This is often used when a dataset contains large numbers of duplicate rows, or can be used to associate a probability to certain cases.