We have built working networks with over ten thousand nodes, and have even tested a simple network with a million nodes; however, we cannot provide any guarantees on network size, for the reasons listed below.
Instead, we recommend that you test your particular scenario with our evaluation edition. If you do run into performance or memory problems, there are many techniques that can be used to refactor your network, or you can of course use approximate inference.
The rules governing inference cost are complex, but as a general rule of thumb, tree structures are efficient, while more densely connected graphs are more resource intensive.
Techniques such as Noisy nodes, Latent Variables, or Divorcing can be used to reduce this complexity.
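As a minimal sketch of why divorcing helps, consider the size of a conditional probability table (CPT), which grows multiplicatively with the number of parent configurations. The node names and cardinalities below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical illustration: CPT size before and after "divorcing" parents.
def parent_configurations(parent_cards):
    """Number of parent configurations a CPT must cover
    (the product of the parent cardinalities)."""
    n = 1
    for card in parent_cards:
        n *= card
    return n

# Direct structure: one binary node C with 6 binary parents.
direct = parent_configurations([2] * 6)  # 2**6 = 64 configurations

# Divorced structure: two intermediate binary nodes M1 and M2 each
# summarise 3 of the parents, and C now has only M1 and M2 as parents.
divorced = (parent_configurations([2] * 3) * 2   # M1 and M2: 8 + 8
            + parent_configurations([2, 2]))     # C: 4

print(direct, divorced)  # prints "64 20"
```

The total number of parameters drops from 64 parent configurations to 20, at the cost of two extra (usually deterministic or near-deterministic) intermediate nodes.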
Similarly, when more variables are instantiated (set as evidence), distributions can be simplified, and hence inference is more efficient.
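To see why evidence simplifies inference, note that instantiating a variable restricts any factor mentioning it to the consistent rows only. The following is a toy sketch with a hypothetical factor over three binary variables:

```python
from itertools import product

# Hypothetical sketch: a factor over binary variables (A, B, C),
# represented as a dict from assignments to unnormalised values.
factor = {assign: 1.0 for assign in product([0, 1], repeat=3)}  # 8 entries

# Instantiating A = 1 as evidence keeps only the consistent rows,
# halving the number of entries inference has to manipulate.
evidence_a = 1
reduced = {assign: v for assign, v in factor.items()
           if assign[0] == evidence_a}

print(len(factor), len(reduced))  # prints "8 4"
```

Each piece of evidence removes a dimension from every factor it touches, so the saving compounds across the network.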
Often an application does not need to query every variable in the network; in some instances inference can be simplified by querying only the variables and distributions you actually need.
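As a rough illustration of the saving, querying a single marginal lets the engine sum out the variables you do not care about, rather than materialising a distribution over all of them. The joint below is a hypothetical uniform distribution over three binary variables:

```python
from itertools import product

# Hypothetical sketch: a full joint over 3 binary variables (A, B, C)
# versus the marginal of just the one variable we actually need.
joint = {assign: 0.125 for assign in product([0, 1], repeat=3)}  # 8 entries

# Query only variable A: sum out B and C.
marginal = {0: 0.0, 1: 0.0}
for (a, b, c), p in joint.items():
    marginal[a] += p

print(len(joint), marginal)  # prints "8 {0: 0.5, 1: 0.5}"
```

The full joint grows exponentially with the number of variables, whereas each requested marginal stays the size of that variable's state space.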