
Stochastic Optimization Algorithms with Heavy-Tailed Input

Presenter
September 13, 2024
Abstract
We first explain why statistical analysis of stochastic optimization algorithms with heavy-tailed input arises naturally in applications. In fact, we will argue that models assuming infinite-variance gradient estimators in stochastic gradient descent are appropriate depending on easy-to-monitor features of historical data and on the spatial and temporal scales over which the algorithm will be deployed (even if the models have finite variance in theory). We will then discuss inference tools, based on several asymptotic statistics, that can be used to monitor the convergence of stochastic optimization algorithms. The results we present include the first weak convergence analysis of stochastic gradient descent with infinite variance, extending results that assume finite variance or homogeneous, additive gradient noise. Based on joint work with Aleks Mijatovic, Wenhao Yang, Joost Jorritsma, and Bert Zwart.
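
To make the infinite-variance setting concrete, the following is a minimal Python sketch (illustrative only, not code from the talk) of stochastic gradient descent on a quadratic objective with symmetric Pareto-tailed gradient noise, which has infinite variance when the tail index alpha < 2. The objective, the noise distribution, and the 1/(k+1) step-size schedule are all assumptions chosen for illustration.

import numpy as np

rng = np.random.default_rng(0)

def heavy_tailed_noise(alpha: float, size: int) -> np.ndarray:
    # Symmetric Pareto-tailed noise: variance is infinite when alpha < 2.
    signs = rng.choice([-1.0, 1.0], size=size)
    return signs * (rng.pareto(alpha, size=size) + 1.0)

def sgd_heavy_tailed(x0, steps=10_000, alpha=1.5, gamma0=0.1):
    # SGD on f(x) = 0.5 * ||x||^2, whose minimizer is the origin.
    # The gradient estimate is unbiased but has infinite variance,
    # so classical finite-variance convergence analyses do not apply.
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        grad = x + heavy_tailed_noise(alpha, x.size)
        x = x - gamma0 / (k + 1) * grad  # illustrative decaying step size
    return x

print(sgd_heavy_tailed(np.array([5.0, -3.0])))

Running the sketch with alpha closer to 1 makes the occasional extreme noise realizations more pronounced, which is the regime the talk's weak convergence analysis addresses.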