Recursive estimation for mixtures, or why zooming out is a good idea
Presenter
January 14, 2026
Abstract
Bayesian nonparametric mixture models provide a flexible framework for data analysis but are often hindered by the computational expense of traditional inference methods such as MCMC. A fast, recursive algorithm proposed by Newton (2002) offers a practical alternative, yet its formal connection to Bayesian inference and its theoretical properties remain only partially understood. This paper reveals a new geometric interpretation of this classic method. We demonstrate that Newton's recursion is a discrete-time approximation of a gradient flow on the space of probability measures, governed by the Hellinger geometry. This perspective not only provides a principled theoretical foundation for the algorithm but also allows us to generalize it. By framing estimation as the minimization of an energy functional on a statistical manifold, we derive a new family of algorithms by modifying the underlying geometry and the discretization. Applications include bootstrapping, as well as dependent and repulsive mixtures.
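To make the recursion concrete, the following is a minimal sketch of Newton's predictive recursion for estimating a mixing distribution on a discrete grid. The grid, the Gaussian kernel, and the weight schedule alpha_n = 1/(n+1) are illustrative assumptions; none of these choices is fixed by the abstract, and the function names are hypothetical.

```python
import numpy as np

def predictive_recursion(data, theta_grid, kernel):
    """Sketch of Newton's predictive recursion: one pass over the data,
    updating a discretised mixing distribution f on theta_grid via the
    convex update f <- (1 - alpha_n) f + alpha_n * posterior(f, x_n)."""
    f = np.ones(len(theta_grid)) / len(theta_grid)  # uniform initial guess
    for n, x in enumerate(data, start=1):
        alpha = 1.0 / (n + 1)                # illustrative weight schedule
        lik = kernel(x, theta_grid)          # k(x | theta) on the grid
        post = lik * f
        post /= post.sum()                   # normalised posterior weights
        f = (1 - alpha) * f + alpha * post   # recursive convex update
    return f

# Usage: a Gaussian kernel and data from a two-component normal mixture.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])
grid = np.linspace(-6, 6, 121)
gauss = lambda x, t: np.exp(-0.5 * (x - t) ** 2) / np.sqrt(2 * np.pi)
f_hat = predictive_recursion(data, grid, gauss)
```

Because each update is a convex combination of two probability vectors, the estimate stays a valid distribution after every observation, which is what makes the single-pass scheme so cheap compared with MCMC.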