Generative flow nets
GFlowNets
November 11, 2021 — February 13, 2023
Placeholder.
A concept that Yoshua Bengio is excited about. It combines keywords I am familiar with (generative, flow), yet I believe it is distinct from the mere concatenation of those terms.
From the abstract of Y. Bengio et al. (2022), "GFlowNet Foundations":
Generative Flow Networks (GFlowNets) have been introduced as a method to sample a diverse set of candidates in an active learning context, with a training objective that makes them approximately sample in proportion to a given reward function. In this paper, we show a number of additional theoretical properties of GFlowNets. They can be used to estimate joint probability distributions and the corresponding marginal distributions where some variables are unspecified and, of particular interest, can represent distributions over composite objects like sets and graphs. GFlowNets amortize the work typically done by computationally expensive MCMC methods in a single but trained generative pass. They could also be used to estimate partition functions and free energies, conditional probabilities of supersets (supergraphs) given a subset (subgraph), as well as marginal distributions over all supersets (supergraphs) of a given set (graph). We introduce variations enabling the estimation of entropy and mutual information, sampling from a Pareto frontier, connections to reward-maximizing policies, and extensions to stochastic environments, continuous actions and modular energy functions.
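To make the "sample in proportion to a given reward" idea concrete: a GFlowNet learns a forward policy P_F over constructive actions such that, at convergence, a complete object x is generated with probability R(x)/Z. Below is a minimal sketch, in PyTorch, of one popular training objective, trajectory balance (Malkin et al. 2022). The abstract above does not commit to this particular objective, and the toy environment (building binary strings bit by bit), the reward, and the network sizes are all illustrative assumptions on my part.

```python
import torch
import torch.nn as nn

L = 4  # objects are binary strings of length L (an assumed toy setting)

def reward(x):
    # Arbitrary positive reward over complete objects: strings with more
    # 1-bits are more rewarding. Purely illustrative.
    return 0.1 + sum(x)

def encode(x):
    # Pad the partial string to length L; +1/-1 for chosen bits, 0 for
    # "not yet chosen", so distinct prefixes get distinct encodings.
    s = torch.zeros(L)
    for i, b in enumerate(x):
        s[i] = 1.0 if b else -1.0
    return s

# Forward policy P_F(a | s): a tiny MLP choosing the next bit.
policy = nn.Sequential(nn.Linear(L, 32), nn.ReLU(), nn.Linear(32, 2))
log_Z = nn.Parameter(torch.zeros(()))  # learned log partition function
opt = torch.optim.Adam(list(policy.parameters()) + [log_Z], lr=1e-2)

for step in range(2000):
    # Sample one trajectory s_0 -> ... -> s_L by appending bits.
    x, log_pf = [], torch.zeros(())
    for t in range(L):
        dist = torch.distributions.Categorical(logits=policy(encode(x)))
        a = dist.sample()
        log_pf = log_pf + dist.log_prob(a)
        x.append(int(a))
    # Trajectory balance: log Z + log P_F(tau) = log R(x) + log P_B(tau | x).
    # In this append-only DAG every state has a unique parent, so log P_B = 0.
    loss = (log_Z + log_pf - torch.log(torch.tensor(float(reward(x))))) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()

# At convergence the sampler draws x with probability approximately R(x) / Z,
# and exp(log_Z) estimates the partition function Z = sum_x R(x).
```

Note that the learned log_Z doubles as the partition-function estimate the abstract mentions, which is part of the appeal of the method.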
- Yoshua Bengio, Generative Flow Networks
- The GFlowNet Tutorial
- GFlowNet Tutorial
- Matt Biggs, The What, Why and How of Generative Flow Networks
- E. Bengio et al. (2021), Flow Network based Generative Models for Non-Iterative Diverse Candidate Generation