Building faster and more expressive BART models
Presenter
January 14, 2026
Abstract
Bayesian Additive Regression Trees (BART) is a highly effective nonparametric regression model that approximates an unknown function with a sum of axis-aligned binary regression trees (i.e., piecewise-constant step functions), handling categorical predictors via one-hot encoding. As a consequence, the original BART model is fundamentally limited in its ability to (i) "borrow strength" across the levels of a categorical predictor and (ii) exploit structural relationships between categorical predictors (e.g., adjacency and nesting). I will introduce new decision rule priors that overcome these limitations and open the door to fitting non-linear multilevel models with regression tree ensembles. I will also describe a new software package that unifies several existing BART extensions and allows users to fit increasingly expressive BART models without implementing bespoke samplers.
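For readers unfamiliar with the sum-of-trees representation mentioned above, a conventional sketch of the Gaussian BART model is the following (the notation is the standard one from the BART literature, not taken from this abstract: T_m denotes the m-th tree's structure, mu_m its leaf values, and g the piecewise-constant step function induced by the tree):

```latex
y_i = \sum_{m=1}^{M} g(x_i;\, T_m, \mu_m) + \varepsilon_i,
\qquad \varepsilon_i \sim \mathcal{N}(0, \sigma^2),
```

so the regression function is approximated by an ensemble of M shallow trees, each contributing a small piecewise-constant adjustment.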