Approximate Bayesian inference for cumulative probit regression models

arXiv:2511.06967v2 Announce Type: replace-cross
Abstract: Ordinal categorical data are routinely encountered in many practical applications. When the primary goal is to construct a regression model for ordinal outcomes, cumulative link models represent one of the most popular choices: they link the cumulative probabilities of the response to a set of covariates through a parsimonious linear predictor, shared across response categories. As the number of observations grows, standard sampling algorithms for Bayesian inference scale poorly, making posterior computation increasingly challenging for large datasets. In this article, we propose three scalable algorithms for approximating the posterior distribution of the regression coefficients in cumulative probit models, relying on Variational Bayes and Expectation Propagation. We compare the proposed approaches with inference based on Markov chain Monte Carlo, demonstrating superior computational performance and remarkable accuracy. Finally, we illustrate the utility of the proposed algorithms on a challenging case study investigating the structure of a criminal network.
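As a minimal sketch of the model class the abstract describes (not the paper's code), the cumulative probit model sets P(Y <= k | x) = Phi(tau_k - x'beta), with ordered cutpoints tau_1 < ... < tau_{K-1} and a single coefficient vector beta shared across the K categories; individual category probabilities follow by differencing. The cutpoints and coefficients below are illustrative values, not estimates from the paper.

```python
import math

def phi_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def category_probs(x, beta, tau):
    """P(Y = k | x), k = 1..K, for a cumulative probit model.

    P(Y <= k | x) = Phi(tau_k - x'beta); differencing the cumulative
    probabilities (with P(Y <= 0) = 0 and P(Y <= K) = 1) yields the
    per-category probabilities.
    """
    eta = sum(xi * bi for xi, bi in zip(x, beta))  # shared linear predictor
    cum = [0.0] + [phi_cdf(t - eta) for t in tau] + [1.0]
    return [cum[k + 1] - cum[k] for k in range(len(cum) - 1)]

# Illustrative (made-up) parameters: K = 4 categories, 2 covariates.
x = [0.5, -1.0]
beta = [1.2, 0.4]
tau = [-1.0, 0.0, 1.5]  # ordered cutpoints
probs = category_probs(x, beta, tau)
```

Because the cutpoints are ordered, the cumulative probabilities are nondecreasing and the differences are valid (nonnegative, summing to one) category probabilities.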
