I modelled a Bayesian network using pgmpy and wrote it to my local disk in both BIF and XMLBIF formats.
I would like to reduce the time complexity of my queries. To test, I loaded the same BN model from the two different file formats. The two loaded models give different query responses for the same evidence.
I also compared against another library (the Bayes Server API): the model loaded from the pgmpy-written XMLBIF file and the Bayes Server model give the same response to the query with the same evidence, while the model loaded from the pgmpy-written BIF file gives a different one.
Could someone help me understand where the difference between these formats comes from?
Which one is accurate? Why does parsing take so long for the BIF format, and is the BIF format suitable for real-time inference?
Update: I am using pgmpy.inference.ExactInference.VariableElimination for my queries, and I write the same model to both file formats when saving locally.
Thank you very much in advance.
Vamsi
Your environment
pgmpy version: 0.1.20
Python version: 3.9.16
Operating System: Windows 10
Steps to reproduce
Tell us how to reproduce this issue. Please provide a minimal reproducible code of
the issue you are facing if possible.
Expected behaviour
Tell us what should happen
Actual behaviour
Tell us what happens instead
nvk0015 changed the title from "Different Query results on loading same BN model written to different formats." to "Different Query results on loading same BN model written to different formats and loading model written to different formats querying the same evidence." on Dec 12, 2023
nvk0015 changed the title to "Different Query results with same evidence after writing same model to different formats, loading different formats (same model was written)" on Dec 12, 2023
@nvk0015 The inference results should be the same no matter which file format you wrote the network to. Would it be possible for you to share the saved network files (and the query and evidence variables)? I can then try to reproduce the issue and figure out what the problem is.
Reading the BIF file format is slow because it supports Unicode characters in variable and state names, and the large number of possible characters makes parsing slow. A possible option to speed it up would be to add an argument to the BIFReader class for specifying the encoding scheme to use when parsing files. To give an idea of the runtime improvement: with ASCII encoding, the reader is around 10x faster.