Lipschitz regularized gradient flows and latent generative particles

Presenter
May 12, 2023
Abstract
Lipschitz-regularized f-divergences interpolate between the Wasserstein metric and f-divergences and provide a flexible family of loss functions for non-absolutely continuous distributions (e.g., empirical distributions), possibly with heavy tails. We construct gradient flows based on these divergences, taking advantage of neural-network spectral normalization (a closely related form of Lipschitz regularization). The Lipschitz-regularized gradient flows induce a transport/discriminator particle algorithm in which generative particles are moved along a vector field given by the gradient of the discriminator, the latter computed as in generative adversarial networks (GANs). The particle system generates approximate samples from typically high-dimensional distributions known only through data. Examples of such gradient flows are the Lipschitz-regularized Fokker-Planck and porous medium equations for the Kullback-Leibler and alpha-divergences, respectively. This PDE perspective allows us to analyze the stability and convergence of the algorithm, for instance through an empirical, Lipschitz-regularized version of the Fisher information that tracks the convergence of the particle system.
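
To make the transport/discriminator particle scheme concrete, the following is a minimal illustrative sketch in PyTorch for the Kullback-Leibler case, not the presenter's implementation. It assumes a spectrally normalized multilayer-perceptron discriminator trained on a standard Legendre-type variational objective, E_P[phi] - E_Q[exp(phi - 1)], with P the data and Q the current particles, followed by an explicit Euler step that moves each particle along the gradient of the trained discriminator. Network width, step sizes, and iteration counts are assumptions chosen only for illustration.

# Illustrative sketch (assumed settings, not the presenter's code): a
# Lipschitz-regularized KL gradient flow as a discriminator/transport
# particle algorithm with a spectrally normalized discriminator.
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

def make_discriminator(dim, width=64):
    # Spectral normalization on each linear layer enforces an approximate
    # Lipschitz bound on the discriminator, as described in the abstract.
    return nn.Sequential(
        spectral_norm(nn.Linear(dim, width)), nn.ReLU(),
        spectral_norm(nn.Linear(width, width)), nn.ReLU(),
        spectral_norm(nn.Linear(width, 1)),
    )

def particle_flow(data, particles, n_outer=200, n_inner=10, dt=0.1, lr=1e-3):
    dim = data.shape[1]
    disc = make_discriminator(dim)
    opt = torch.optim.Adam(disc.parameters(), lr=lr)
    for _ in range(n_outer):
        # 1) Discriminator step: maximize a variational lower bound of the
        #    KL divergence, E_P[phi] - E_Q[exp(phi - 1)] (Legendre form of
        #    f(x) = x log x), with P = data and Q = current particles.
        for _ in range(n_inner):
            loss = -(disc(data).mean() - torch.exp(disc(particles) - 1.0).mean())
            opt.zero_grad()
            loss.backward()
            opt.step()
        # 2) Transport step: move each particle along the gradient of the
        #    trained discriminator (an explicit Euler step of the flow).
        x = particles.clone().requires_grad_(True)
        grad = torch.autograd.grad(disc(x).sum(), x)[0]
        particles = (particles + dt * grad).detach()
    return particles

# Toy usage: transport Gaussian noise toward a shifted Gaussian "dataset".
data = torch.randn(512, 2) + torch.tensor([3.0, 0.0])
particles = torch.randn(512, 2)
particles = particle_flow(data, particles)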