Abstract
Laurent Dinh - Google
Normalizing flows are a flexible family of probability distributions that can serve as generative models for a variety of data modalities. Because flows are expressed as compositions of expressive, invertible functions, they have successfully harnessed recent advances in deep learning. An ongoing challenge in developing these methods is the design of building blocks that are expressive yet tractable. In this talk, I will introduce the fundamentals and describe recent work (including my own) on this topic.
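As an illustration of one such building block, below is a minimal sketch of an affine coupling layer in the style of NICE / Real NVP (Dinh et al.): the transformation is invertible in closed form and its log-determinant is cheap to compute, regardless of how expressive the conditioning networks are. The toy networks `s` and `t` and the even-dimensional split are hypothetical choices for this example, not the speaker's exact formulation.

```python
import numpy as np

def s(x_a):
    # Toy "scale" network; any function of x_a works, since
    # invertibility of the layer does not require s to be invertible.
    return np.tanh(x_a)

def t(x_a):
    # Toy "translation" network.
    return x_a ** 2

def coupling_forward(x):
    # Split the input in two; transform one half conditioned on the other.
    x_a, x_b = np.split(x, 2)
    y_b = x_b * np.exp(s(x_a)) + t(x_a)
    # The Jacobian is triangular, so log|det J| is just the sum of the
    # log-scales: tractable by construction.
    log_det = np.sum(s(x_a))
    return np.concatenate([x_a, y_b]), log_det

def coupling_inverse(y):
    # Exact inverse: undo the scale and shift on the transformed half.
    y_a, y_b = np.split(y, 2)
    x_b = (y_b - t(y_a)) * np.exp(-s(y_a))
    return np.concatenate([y_a, x_b])

x = np.random.randn(4)
y, log_det = coupling_forward(x)
assert np.allclose(coupling_inverse(y), x)  # invertible by design
```

Stacking many such layers, with the roles of the two halves alternating, yields a deep flow whose density can still be evaluated exactly via the change-of-variables formula.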