When Is Generalized Bayes Bayesian? A Decision-Theoretic Characterization of Loss-Based Updating

arXiv:2602.01573v1 Announce Type: cross
Abstract: Loss-based updating, including generalized Bayes, Gibbs, and quasi-posteriors, replaces the likelihood with a user-chosen loss and produces a posterior-like distribution via an exponential tilt. We give a decision-theoretic characterization that separates belief posteriors (conditional beliefs justified by the foundations of Savage and Anscombe-Aumann under a joint probability model) from decision posteriors (randomized decision rules justified by preferences over decision rules). We make explicit that a loss-based posterior coincides with ordinary Bayes if and only if the loss is, up to scale and a data-only term, a negative log-likelihood. We then show that the generalized marginal likelihood is not evidence for decision posteriors, and that Bayes factors are not well-defined without additional structure. In the decision-posterior regime, non-degenerate posteriors require nonlinear preferences over decision rules. Under sequential coherence and separability, these lead to an entropy-penalized variational representation that yields generalized Bayes as the optimal rule.
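The exponential-tilt construction described in the abstract can be sketched numerically. The following is a minimal illustration (not from the paper; the Gaussian model, grid, and tempering parameter eta are chosen for illustration only): a Gibbs/generalized-Bayes posterior on a parameter grid, pi_eta(theta | x) proportional to prior(theta) * exp(-eta * loss(theta, x)). With the loss set to a negative log-likelihood and eta = 1 it reduces to ordinary Bayes, matching the characterization stated above, while a different loss (here, absolute error) gives a quasi-posterior that is not a Bayes posterior for any likelihood in this model.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=1.0, size=20)       # observed data (illustrative)
thetas = np.linspace(-3.0, 5.0, 401)              # parameter grid
prior = np.exp(-0.5 * thetas**2)                  # N(0,1) prior, unnormalized

def gibbs_posterior(loss_vals, eta):
    """Exponential tilt of the prior by a loss, normalized on the grid."""
    w = prior * np.exp(-eta * (loss_vals - loss_vals.min()))  # shift for stability
    return w / w.sum()

# Negative log-likelihood loss for a N(theta, 1) model, up to a data-only term.
nll = 0.5 * ((x[None, :] - thetas[:, None]) ** 2).sum(axis=1)

bayes = gibbs_posterior(nll, eta=1.0)             # ordinary Bayes posterior
tempered = gibbs_posterior(nll, eta=0.5)          # generalized Bayes, eta != 1

# A different loss (absolute error) produces a quasi-posterior instead.
abs_loss = np.abs(x[None, :] - thetas[:, None]).sum(axis=1)
quasi = gibbs_posterior(abs_loss, eta=1.0)
```

Tempering with eta < 1 flattens the posterior relative to ordinary Bayes, which is one concrete way the tilt parameter changes the resulting distribution without changing the loss.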
