Some other features in Bayesian inference
Lazy Propagation uses a secondary structure, called the "junction tree", to perform inference.
In [1]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
bn=gum.loadBN("res/alarm.dsl")
gnb.showJunctionTreeMap(bn);
This junction tree can also be exploited to answer various kinds of probabilistic queries.
In [2]:
bn=gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie=gum.LazyPropagation(bn)
bn
Out[2]:
Evidence impact
Evidence impact allows the user to analyze the influence of a set of (potential) evidence variables on a target variable: `evidenceImpact(target, evs)` computes P(target | evs) for every instantiation of the evidence.
In [3]:
ie.evidenceImpact("B",["A","H"])
Out[3]:
| A | H | B=0 | B=1 |
|---|---|--------|--------|
| 0 | 0 | 0.4631 | 0.5369 |
| 0 | 1 | 0.5761 | 0.4239 |
| 1 | 0 | 0.3879 | 0.6121 |
| 1 | 1 | 0.4996 | 0.5004 |
Using d-separation, evidence impact automatically reduces the evidence set to the minimal subset of variables that effectively conditions the analyzed variable.
In [4]:
ie.evidenceImpact("E",["A","F","B","D"]) # {A,D,B} d-separates E and F
Out[4]:
| A | B | D | E=0 | E=1 |
|---|---|---|--------|--------|
| 0 | 0 | 0 | 0.1907 | 0.8093 |
| 0 | 0 | 1 | 0.3157 | 0.6843 |
| 0 | 1 | 0 | 0.1025 | 0.8975 |
| 0 | 1 | 1 | 0.4230 | 0.5770 |
| 1 | 0 | 0 | 0.2897 | 0.7103 |
| 1 | 0 | 1 | 0.4440 | 0.5560 |
| 1 | 1 | 0 | 0.1651 | 0.8349 |
| 1 | 1 | 1 | 0.5592 | 0.4408 |
In [5]:
ie.evidenceImpact("E",["A","B","C","D","F"]) # {A,C,D} d-separates E and {B,F}
Out[5]:
| A | C | D | E=0 | E=1 |
|---|---|---|--------|--------|
| 0 | 0 | 0 | 0.3251 | 0.6749 |
| 0 | 0 | 1 | 0.0133 | 0.9867 |
| 0 | 1 | 0 | 0.4546 | 0.5454 |
| 0 | 1 | 1 | 0.0229 | 0.9771 |
| 1 | 0 | 0 | 0.0633 | 0.9367 |
| 1 | 0 | 1 | 0.4591 | 0.5409 |
| 1 | 1 | 0 | 0.1047 | 0.8953 |
| 1 | 1 | 1 | 0.5950 | 0.4050 |
Evidence joint impact
Similarly, `evidenceJointImpact(targets, evs)` computes P(targets | evs), the impact of the evidence on the joint distribution of a set of target variables.
In [6]:
ie.evidenceJointImpact(["A","F"],["B","C","D","E","H"]) # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
| B | E | F | A=0 | A=1 |
|---|---|---|--------|--------|
| 0 | 0 | 0 | 0.0977 | 0.3931 |
| 0 | 0 | 1 | 0.0170 | 0.4922 |
| 0 | 1 | 0 | 0.0173 | 0.5420 |
| 0 | 1 | 1 | 0.0696 | 0.3711 |
| 1 | 0 | 0 | 0.1561 | 0.3627 |
| 1 | 0 | 1 | 0.0272 | 0.4541 |
| 1 | 1 | 0 | 0.0282 | 0.5096 |
| 1 | 1 | 1 | 0.1133 | 0.3489 |