Fast non-reversible samplers for Bayesian mixture models
Presenter
January 14, 2026
Abstract
Finite and infinite mixtures are a cornerstone of Bayesian modelling, and it is well known that sampling from the resulting posterior distribution can be a challenging task. In particular, popular reversible Markov chain Monte Carlo schemes are often slow to converge when the number of observations is large. In this paper we introduce a novel and simple non-reversible sampling scheme for Bayesian mixture models, which is shown to drastically outperform classical samplers in many scenarios of interest, especially during the convergence phase and when the mixture components have non-negligible overlap.
At the theoretical level, we show that, in terms of asymptotic variance, the proposed non-reversible scheme cannot perform worse than the standard one by more than a constant factor, and we provide a scaling limit analysis suggesting that the non-reversible sampler can reduce the convergence time by an order of magnitude. We also discuss why the statistical features of mixture models make them an ideal setting for non-reversible discrete samplers.
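The abstract does not describe the scheme itself, so purely as an illustrative sketch of the kind of non-reversible ("lifted") discrete update alluded to above, the snippet below equips each allocation variable of a toy Gaussian mixture with a persistent direction that is reversed only on rejection. The model, variable names, and the specific lifting construction are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

# Toy data from a fixed K-component univariate Gaussian mixture
# (weights and component parameters held fixed purely for illustration).
rng = np.random.default_rng(0)
K = 5
weights = np.full(K, 1.0 / K)
means = np.linspace(-4.0, 4.0, K)
sigmas = np.full(K, 1.5)            # deliberate overlap between components
y = np.concatenate([rng.normal(m, 1.5, size=40) for m in means])
n = y.size

def log_cond(i, k):
    """Log full conditional p(z_i = k | y_i) up to an additive constant."""
    return np.log(weights[k]) - 0.5 * ((y[i] - means[k]) / sigmas[k]) ** 2

# Lifted (non-reversible) single-site update: each allocation z_i carries a
# direction d_i in {-1, +1}; we propose the neighbouring label in that
# direction and flip the direction on rejection (or at a boundary).
z = rng.integers(0, K, size=n)      # initial allocations
d = rng.choice([-1, 1], size=n)     # lifting / direction variables

def lifted_sweep(z, d):
    for i in range(n):
        k_prop = z[i] + d[i]
        if 0 <= k_prop < K:
            log_acc = log_cond(i, k_prop) - log_cond(i, z[i])
            if np.log(rng.uniform()) < log_acc:
                z[i] = k_prop        # accepted: keep moving in the same direction
                continue
        d[i] = -d[i]                 # rejected or out of range: reverse direction

for sweep in range(200):
    lifted_sweep(z, d)

print("label counts after 200 sweeps:", np.bincount(z, minlength=K))
```

Each single-site kernel above satisfies skew detailed balance with respect to the conditional of z_i, so the sweep leaves the allocation posterior (with parameters fixed) invariant while suppressing the diffusive back-and-forth behaviour of a reversible update.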