
From Dynamics to Algorithms for Optimization and Sampling

Presenter
June 24, 2024
Abstract
Optimization and sampling are fundamental algorithmic tasks in machine learning and in many other applications. In this talk, we survey recent results in the design and analysis of algorithms for optimization and sampling from a continuous-time perspective, and from the perspective of sampling as optimization in the space of distributions. Many discrete-time algorithms can be derived as discretizations of continuous-time dynamics; for example, gradient descent is a discretization of gradient flow, and the momentum method is a discretization of accelerated gradient flow. This perspective provides a systematic way to design algorithms by discretizing continuous-time dynamics, and to derive convergence guarantees by translating continuous-time properties to discrete time. In sampling, many random-walk or Markov chain methods can be viewed as optimization algorithms in the space of probability distributions; for example, Langevin dynamics implements the gradient flow for minimizing relative entropy (KL divergence) with respect to the target distribution. This perspective provides a systematic way to analyze the mixing times of Markov chains by translating optimization techniques, and to design new sampling algorithms by translating optimization principles to the space of distributions.
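
To make the discretization viewpoint concrete, here is a minimal sketch (not from the talk) showing that one explicit-Euler step of the gradient flow dx/dt = -∇f(x) with step size h is exactly a gradient descent update; the quadratic objective f(x) = ½‖x‖² and the step size are illustrative assumptions.

```python
import numpy as np

def grad_f(x):
    # Gradient of the illustrative objective f(x) = 0.5 * ||x||^2.
    return x

def gradient_descent_step(x, h):
    # One explicit-Euler step of the gradient flow dx/dt = -grad f(x):
    #   x_{k+1} = x_k - h * grad f(x_k),
    # which is precisely a gradient descent update with step size h.
    return x - h * grad_f(x)

x = np.array([2.0, -1.0])
for _ in range(100):
    x = gradient_descent_step(x, h=0.1)
print(x)  # converges toward the minimizer at the origin
```

Smaller step sizes track the continuous flow more closely; convergence rates for the discrete method are then obtained by showing the continuous-time decay survives the discretization error.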
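The sampling-as-optimization viewpoint can be illustrated the same way. The sketch below (again assumed, not from the talk) runs the unadjusted Langevin algorithm, the Euler–Maruyama discretization of the Langevin dynamics dX = -∇f(X) dt + √2 dW, whose stationary distribution is ν ∝ e^(-f); in the space of distributions this dynamics follows the gradient flow of KL(· ‖ ν). The standard Gaussian target is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_f(x):
    # Gradient of f(x) = 0.5 * ||x||^2, so the target nu ∝ exp(-f)
    # is a standard Gaussian.
    return x

def ula_step(x, h):
    # One Euler-Maruyama step of dX = -grad f(X) dt + sqrt(2) dW.
    # In distribution space, this tracks the gradient flow of KL(. || nu).
    return x - h * grad_f(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)

x = np.zeros(2)
samples = []
for _ in range(10_000):
    x = ula_step(x, h=0.01)
    samples.append(x.copy())

# Empirical moments approach those of the Gaussian target (mean 0, variance 1),
# up to a discretization bias that shrinks with the step size h.
print(np.mean(samples, axis=0), np.var(samples, axis=0))
```

Viewed this way, mixing-time analysis becomes a convergence analysis of an optimization scheme on relative entropy, which is how optimization techniques transfer to sampling.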