Truncated Posterior Inference for Bayesian Nonparametrics
Presenter
January 12, 2026
Abstract
Bayesian nonparametric (BNP) models involve an infinite collection of latent parameters, which allows observations to reflect a growing number of parameters as more data are collected. While these infinitely many parameters provide significant flexibility, they make posterior inference computationally challenging. One common approach is to approximate the nonparametric model with a parametric one (a truncation) and then apply a standard inference algorithm. While this approach is practical, truncation introduces a posterior approximation error that is typically unknown. In this talk, I will introduce a new technique for posterior inference with general truncated completely random measure priors that includes estimates of the posterior truncation error. Applications include models where feature assignment variables are observed (e.g., edge-exchangeable network models) and unobserved (e.g., latent feature assignment models).
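To make the truncation idea concrete, here is a minimal, illustrative Python sketch; it is not the estimator presented in the talk. It takes the beta process as the completely random measure, simulates its feature weights via the stick-breaking construction of Teh, Görür, and Ghahramani (2007), and uses Monte Carlo to estimate a simple coupling-based bound on the truncation error: the probability that any of N observations activates a feature beyond the truncation level K. All function names and parameter settings are hypothetical, and the simulated "infinite" tail is itself cut off at a large finite depth.

```python
import numpy as np

def beta_process_sticks(alpha, depth, rng):
    """Stick-breaking weights for a beta process (the IBP prior),
    following Teh, Gorur & Ghahramani (2007):
    pi_k = prod_{j<=k} v_j with v_j ~ Beta(alpha, 1)."""
    v = rng.beta(alpha, 1.0, size=depth)
    return np.cumprod(v)

def truncation_error_estimate(alpha, K, N, depth=1000,
                              num_samples=2000, seed=0):
    """Monte Carlo estimate of a coupling-based upper bound on the
    truncation error at level K: the probability that any of N
    observations activates a feature with index > K. This bounds the
    total-variation distance between the marginal data distributions
    under the truncated and the full priors. The infinite tail is
    approximated by simulating `depth` sticks."""
    rng = np.random.default_rng(seed)
    survive = np.empty(num_samples)
    for s in range(num_samples):
        pi = beta_process_sticks(alpha, depth, rng)
        tail = pi[K:]  # feature probabilities beyond the truncation
        # P(all N observations avoid every tail feature | weights),
        # computed in log space for numerical stability
        survive[s] = np.exp(N * np.log1p(-tail).sum())
    return 1.0 - survive.mean()

if __name__ == "__main__":
    # Error bound shrinks as the truncation level K grows
    for K in (5, 10, 20, 40):
        err = truncation_error_estimate(alpha=2.0, K=K, N=100)
        print(f"K = {K:3d}  estimated error bound <= {err:.4f}")
```

Under a bound of this form, one can pick the smallest K whose estimated error falls below a target tolerance, which is the basic way a truncation-error estimate turns an uncontrolled approximation into a controlled one.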