How is this done, and what does "theories shared across examples" refer to?
This makes GFlowNets amortized probabilistic inference machines that can be used both to sample latent variables (as in [14]) and to sample parameters and theories shared across examples (as in [5]).
TO_BE_ANSWERED.
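A rough sketch of the "amortized" part (my own illustration, not taken from [14] or [5]): once the GFlowNet's forward policy has been trained so that complete objects are produced with probability proportional to a reward, for instance an unnormalized posterior over a latent variable or over a theory shared by all examples, drawing a sample is just a sequential rollout of that policy, with no MCMC or per-query optimization at run-time. The `policy`, `initial_state`, `step_fn` and `is_terminal` arguments below are hypothetical stand-ins for a trained model and its environment.

```python
import torch

def sample_object(policy, initial_state, step_fn, is_terminal, max_steps=100):
    """Amortized sampling: roll out a trained GFlowNet forward policy.

    `policy` maps a (partial) state to action logits; `step_fn` applies a
    constructive action; `is_terminal` says when the object is complete.
    All three are hypothetical placeholders for a trained model/environment.
    """
    state = initial_state
    for _ in range(max_steps):
        if is_terminal(state):
            break
        logits = policy(state)  # one cheap forward pass per construction step
        action = torch.distributions.Categorical(logits=logits).sample()
        state = step_fn(state, int(action))  # grow the object by one action
    return state  # sampled with probability proportional to the reward (if training succeeded)
```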
How is approximate and amortized marginalization done with a GFlowNet?
They can also be used to perform approximate and amortized marginalization (without the need for sampling at run-time).
TO_BE_ANSWERED.
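A minimal toy sketch of this point (my own example, not from the post): with the trajectory-balance training objective, the GFlowNet learns a scalar log Z alongside its policy, and at convergence Z approximates the sum of rewards over the whole space, here all 2^N binary strings. The marginalization is thus amortized into training, and at run-time it is read off from one learned parameter, with no sampling. The environment, reward and network names below are illustrative.

```python
import itertools

import torch
import torch.nn as nn

N = 8  # objects are binary strings of length N, built one bit at a time

def reward(x):
    """Arbitrary positive reward over complete strings (x: tensor of N bits)."""
    return torch.exp(2.0 * x.float().sum() / N)

class TinyPolicy(nn.Module):
    """Forward policy P_F(next bit | partial string), on a one-hot prefix encoding."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * N, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, prefix_onehot):
        return torch.log_softmax(self.net(prefix_onehot), dim=-1)

def encode(bits):
    """One-hot encode a (possibly partial) bit string, zero-padded to length N."""
    v = torch.zeros(2 * N)
    for i, b in enumerate(bits):
        v[2 * i + int(b)] = 1.0
    return v

policy = TinyPolicy()
log_Z = nn.Parameter(torch.zeros(()))  # learned log-partition function
opt = torch.optim.Adam(list(policy.parameters()) + [log_Z], lr=1e-2)

for step in range(2000):
    # Sample one trajectory from the current forward policy (on-policy training).
    bits, log_pf = [], torch.zeros(())
    for _ in range(N):
        logits = policy(encode(bits))
        a = torch.distributions.Categorical(logits=logits).sample()
        log_pf = log_pf + logits[a]
        bits.append(int(a))
    x = torch.tensor(bits)
    # Trajectory balance: log Z + log P_F(tau) should equal log R(x).
    # (Each string has a single construction order, so the backward term is zero.)
    loss = (log_Z + log_pf - torch.log(reward(x))) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()

# Brute-force check, only possible because this toy space is tiny.
exact = sum(reward(torch.tensor(b)).item() for b in itertools.product([0, 1], repeat=N))
print(f"learned Z ~ {log_Z.exp().item():.1f}   exact sum of rewards = {exact:.1f}")
```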
Find real-life examples of GFlowNets being used to answer probabilistic questions.
This is useful for training AI systems that can answer probabilistic questions.
TO_BE_ANSWERED.