Anytime-valid and asymptotically optimal statistical inference driven by predictive recursion
Presenter
January 15, 2026
Abstract
Distinguishing two classes of candidate models is a fundamental and practically important problem in statistical inference. Error rate control is central to this logic, but in complex nonparametric settings such guarantees can be difficult to achieve, especially when the stopping rule governing the data collection process is unknown. This talk concerns the construction of e-processes in Bayesian and quasi-Bayesian settings. In particular, we propose a novel e-process construction that leverages the so-called predictive recursion (PR) algorithm. The proposal is based on constructing a marginal likelihood by mixing over a specified class of distributions. Such a likelihood could be obtained in a fully Bayesian way by introducing a prior on the class of distributions in the alternative and computing the corresponding Bayesian marginal likelihood, but a purely Bayesian strategy that accounts for the nonparametric aspects of applications can be computationally demanding. The PR algorithm arises as an approximation to the posterior mean of the mixing distribution under a Dirichlet process prior, and hence can rapidly and recursively fit nonparametric mixture models. The resulting PRe-process affords anytime-valid inference uniformly over stopping rules and is shown to be efficient in the sense that it achieves the maximal growth rate under the alternative, relative to the mixture model being fit by PR.
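As a concrete illustration of the recursive fit the abstract describes, the following is a minimal sketch of the PR update on a one-dimensional parameter grid. The function name, the decaying weight schedule, and the grid quadrature are illustrative choices, not the presenter's implementation; the update rule itself follows the standard form f_i(t) = (1 - w_i) f_{i-1}(t) + w_i k(x_i | t) f_{i-1}(t) / m_{i-1}(x_i).

```python
import numpy as np

def predictive_recursion(x, theta_grid, kernel, weights=None, f0=None):
    """One pass of the predictive recursion (PR) algorithm on a grid.

    Recursively updates an estimate f of the mixing density via
        f_i(t) = (1 - w_i) f_{i-1}(t) + w_i k(x_i | t) f_{i-1}(t) / m_{i-1}(x_i),
    where m_{i-1}(x) = integral of k(x | t) f_{i-1}(t) dt is the current
    marginal.  Also accumulates sum_i log m_{i-1}(x_i), the PR marginal
    log-likelihood that a PRe-process-style comparison would use.
    """
    d = theta_grid[1] - theta_grid[0]              # uniform grid spacing
    f = np.ones_like(theta_grid) if f0 is None else f0.astype(float)
    f = f / (f.sum() * d)                          # normalize the initial guess
    if weights is None:                            # illustrative decaying weights
        weights = (np.arange(len(x)) + 2.0) ** (-2.0 / 3.0)
    log_marginal = 0.0
    for w, xi in zip(weights, x):
        k = kernel(xi, theta_grid)                 # k(x_i | t) on the grid
        m = (k * f).sum() * d                      # m_{i-1}(x_i) by quadrature
        log_marginal += np.log(m)
        f = (1.0 - w) * f + w * k * f / m          # the PR update
    return f, log_marginal
```

Note that each update is a convex combination of the current estimate and a reweighted version of it, so a single pass over the data suffices; this one-sweep recursion is what makes PR fast relative to posterior sampling for the mixing distribution.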