Generalized Bayesian Additive Regression Trees: Theory and Software
arXiv:2304.12505v2 Announce Type: replace-cross
Abstract: Bayesian Additive Regression Trees (BART) are a powerful ensemble learning technique for modeling nonlinear regression functions. Although BART was initially proposed for predicting only continuous and binary response variables, multiple extensions have emerged over the years that accommodate a wider class of response variables (e.g., categorical and count data) in a multitude of application areas. In this paper we describe a generalized framework for Bayesian trees and their additive ensembles in which the response variable comes from an exponential family distribution, thereby encompassing many prominent variants of BART. We derive sufficient conditions on the response distribution under which the posterior concentrates at a minimax rate, up to a logarithmic factor. In this regard, our results provide theoretical justification for the empirical success of BART and its variants. To support practitioners, we develop a Python package, also accessible in R via reticulate, that implements GBART for a range of exponential family response variables, including the Poisson, inverse Gaussian, and Gamma distributions, alongside the standard continuous regression and binary classification settings. The package provides a user-friendly interface, enabling straightforward implementation of BART models across a broad class of response distributions.
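The exponential family assumption underlying the framework can be sketched in the standard natural-parameter form used in generalized linear models (the notation here is conventional GLM notation, not necessarily that of the paper):

```latex
% Exponential (dispersion) family density for response y with natural
% parameter \theta, dispersion \phi, cumulant function b, and carrier c:
f(y \mid \theta, \phi)
  = \exp\!\left\{ \frac{y\,\theta - b(\theta)}{\phi} + c(y, \phi) \right\}.
% Familiar members covered by the package include, for example:
%   Gaussian:  \theta = \mu,        b(\theta) = \theta^2 / 2
%   Poisson:   \theta = \log \mu,   b(\theta) = e^{\theta}, \phi = 1
%   Gamma:     \theta = -1/\mu,     b(\theta) = -\log(-\theta)
```

Under this form, a BART-style model places the sum-of-trees prior on a transformation of the natural parameter, which is what allows a single framework to cover continuous, binary, and count responses.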