Function-space regularized divergences for machine learning applications

May 11, 2023
Abstract
Divergences such as the Kullback-Leibler, Rényi, and f-divergences play an increasingly important role in probabilistic machine learning, offering a notion of distance between probability distributions. Recently, divergence estimation has been developed on the basis of variational formulas and function parametrization via neural networks. Despite these successes, the statistical estimation of a divergence is still considered a very challenging problem, mainly due to the high variance of neural-based estimators. Particularly hard cases include high-dimensional data, large divergence values, and the Rényi divergence when its order is larger than one. Our recent work focuses on reducing this variance by regularizing the function space of the variational formulas. We will present novel families of divergences that enjoy enhanced statistical estimation properties, and we will discuss their mathematical properties. These function-space regularized divergences have been tested on a series of ML applications including generative adversarial networks, mutual information estimation, and rare sub-population detection.
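
For context, a standard variational formula of the kind referenced above is the Donsker-Varadhan representation of the KL divergence, and the regularized variant simply restricts the space of test functions. The display below is a sketch of that general idea; the particular choice of function space $\Gamma$ (e.g., a ball of Lipschitz functions) is an illustrative assumption, not necessarily the talk's exact construction:

$$
D_{\mathrm{KL}}(P\,\|\,Q) \;=\; \sup_{\phi}\Big\{\mathbb{E}_P[\phi] - \log \mathbb{E}_Q\big[e^{\phi}\big]\Big\},
\qquad
D^{\Gamma}(P\,\|\,Q) \;=\; \sup_{\phi\in\Gamma}\Big\{\mathbb{E}_P[\phi] - \log \mathbb{E}_Q\big[e^{\phi}\big]\Big\},
$$

so shrinking $\Gamma$ trades some tightness of the variational bound for lower variance of the sample-based estimator.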
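The following is a minimal PyTorch sketch of a neural estimator of this kind, not the authors' implementation: the spectral-norm constraint stands in for the function-space regularization, and the names (TestFunction, dv_bound) are hypothetical placeholders.

import math
import torch
import torch.nn as nn

class TestFunction(nn.Module):
    """MLP test function phi; spectral normalization caps each layer's
    Lipschitz constant, a stand-in for the regularized function space."""
    def __init__(self, dim, width=64):
        super().__init__()
        sn = nn.utils.parametrizations.spectral_norm
        self.net = nn.Sequential(
            sn(nn.Linear(dim, width)),
            nn.ReLU(),
            sn(nn.Linear(width, 1)),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def dv_bound(phi, x_p, x_q):
    """Donsker-Varadhan objective E_P[phi] - log E_Q[exp(phi)];
    maximizing it over the constrained phi estimates the regularized
    divergence, which lower-bounds KL(P||Q)."""
    log_mean_exp_q = torch.logsumexp(phi(x_q), dim=0) - math.log(len(x_q))
    return phi(x_p).mean() - log_mean_exp_q

# Toy usage: samples from P = N(1, I) and Q = N(0, I) in 2 dimensions,
# where the true KL(P||Q) = 1.0; the constrained estimate lower-bounds it.
torch.manual_seed(0)
dim = 2
phi = TestFunction(dim)
opt = torch.optim.Adam(phi.parameters(), lr=1e-3)
for _ in range(2000):
    x_p = torch.randn(512, dim) + 1.0
    x_q = torch.randn(512, dim)
    loss = -dv_bound(phi, x_p, x_q)  # ascend the variational bound
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"regularized KL estimate: {dv_bound(phi, x_p, x_q).item():.3f}")

In this sketch the constraint is what tames the variance: without it, the log E_Q[exp(phi)] term is dominated by a few extreme samples, which is the high-variance failure mode the abstract describes.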