Varying-Coefficient Mixture of Experts Model
arXiv:2601.01699v1 Announce Type: cross Abstract: Mixture-of-Experts (MoE) is a flexible framework that combines multiple specialized submodels (“experts”) by assigning covariate-dependent weights (“gating functions”) to each expert, and it has been widely used for analyzing heterogeneous data. Existing statistical MoE formulations typically assume constant coefficients for covariate effects within the expert and gating models, which can be inadequate for longitudinal, spatial, or other dynamic settings where covariate influences and the latent subpopulation structure evolve across a known dimension. We propose a […]
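For context, a minimal sketch of the standard constant-coefficient MoE formulation the abstract refers to (this is the generic textbook form, not the paper's proposed varying-coefficient model, whose details are truncated above; the notation $\pi_k$, $\beta_k$, $\gamma_k$ and the softmax gating form are illustrative assumptions):

\[
p(y \mid x) \;=\; \sum_{k=1}^{K} \pi_k(x; \gamma_k)\, f_k(y \mid x; \beta_k),
\qquad
\pi_k(x; \gamma) \;=\; \frac{\exp(\gamma_k^\top x)}{\sum_{j=1}^{K} \exp(\gamma_j^\top x)},
\]

where $f_k$ is the $k$-th expert's conditional density with coefficients $\beta_k$ and the gating functions $\pi_k$ are the covariate-dependent mixing weights. A varying-coefficient extension of the kind described in the abstract would, roughly, let the coefficients depend on a known index $t$ (e.g., time or spatial location), replacing $\beta_k$ and $\gamma_k$ with smooth functions $\beta_k(t)$ and $\gamma_k(t)$.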