Friday, October 22, 2021 1pm to 2pm
About this Event
During Fall Semester 2021, the IACS seminar series will be held virtually. Seminars are free and open to the public but registration is required.
ABSTRACT: Generative Flow Networks (GFlowNets) were introduced as a method to sample a diverse set of candidates in an active-learning context, with a training objective that makes them approximately sample in proportion to a given reward function. We show a number of additional theoretical properties of GFlowNets. They can be used to estimate joint probability distributions and the corresponding marginal distributions (when some variables are unspecified), and are particularly well suited to representing distributions over composite objects such as sets and graphs. They amortize, in a single trained generative pass, the work typically done by computationally expensive MCMC methods. They can be used to estimate partition functions and free energies, conditional probabilities of supersets (or supergraphs) given a subset (or subgraph), and marginal distributions over all supersets of a set or supergraphs of a graph. The talk will highlight the relations to, and differences from, standard approaches in generative modeling and reinforcement learning, and summarize early experimental results obtained in exploring the space of molecules to discover ones with properties of interest.
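To make the core sampling property concrete, here is a minimal, self-contained sketch. It is not from the talk: the toy state space and reward are illustrative assumptions, and the flows are computed exactly by dynamic programming on a tiny DAG, whereas a real GFlowNet learns approximate flows with a neural network. The sketch shows the property the training objective targets: a policy that follows the flows samples each terminal object x with probability R(x)/Z, and the flow at the root recovers the partition function Z.

```python
from itertools import product

# Toy domain: build a binary string of length 3, one bit at a time.
# States are prefixes; terminal states are the 8 complete strings.
L = 3

def reward(x):
    # Hypothetical reward: favor strings with more ones.
    return 1.0 + sum(int(b) for b in x)

# Backward pass: the flow through a state equals its reward at terminals,
# and the sum of its children's flows otherwise. flow[""] is then Z.
flow = {}
for n in range(L, -1, -1):
    for s in (''.join(p) for p in product('01', repeat=n)):
        if n == L:
            flow[s] = reward(s)
        else:
            flow[s] = flow[s + '0'] + flow[s + '1']

Z = flow['']  # partition function

# Forward sampling policy: P(child | s) = flow[child] / flow[s].
# Following it draws terminal x with probability reward(x) / Z,
# which is exactly what a trained GFlowNet approximates.
def terminal_prob(x):
    p, s = 1.0, ''
    for b in x:
        p *= flow[s + b] / flow[s]
        s += b
    return p

for x in ('000', '111'):
    print(x, terminal_prob(x), reward(x) / Z)
```

Because the flows here are exact, `terminal_prob(x)` matches `reward(x) / Z` to machine precision; in a trained GFlowNet the match holds only approximately, up to the training error of the flow-matching objective.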