Model Selection and Parameter Estimation of One-Dimensional Gaussian Mixture Models
arXiv:2404.12613v3 Announce Type: replace
Abstract: In this paper, we study the problem of learning one-dimensional Gaussian mixture models (GMMs), focusing on estimating both the model order and the mixing distribution from independent and identically distributed (i.i.d.) samples. We establish the optimal sample complexity of model order estimation in one-dimensional GMMs: we prove a fundamental lower bound on the number of samples required to correctly identify the number of components with high probability, and show that this limit depends critically on the separation between component means and on the total number of components.
We then propose a Fourier-based approach to estimate both the model order and the mixing distribution. Our algorithm operates on Fourier measurements constructed from the samples, and our analysis shows that its sample complexity matches the established lower bound, thereby confirming its optimality. Numerical experiments further show that our method outperforms conventional techniques in both efficiency and accuracy.
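As a rough illustration of the Fourier-measurement idea (a generic matrix-pencil/Prony-type heuristic, not the paper's algorithm), the sketch below draws synthetic GMM samples, evaluates the empirical characteristic function on a grid, deconvolves the known unit-variance Gaussian factor, and reads the number of components off the numerical rank of a Hankel matrix of the measurements. The mixture parameters, grid, threshold, and known-variance assumption are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D GMM: two equally weighted components with means -2 and 2
# and (assumed known) unit variance.
means = np.array([-2.0, 2.0])
samples = rng.normal(loc=rng.choice(means, size=100_000), scale=1.0)

# Fourier measurements: the empirical characteristic function on a grid,
# divided by the Gaussian factor exp(-t^2/2) so that what remains is
# (approximately) the Fourier transform of the discrete mixing measure.
t = 0.25 * np.arange(9)                        # grid t_k = k * h, h = 0.25
ecf = np.exp(1j * np.outer(t, samples)).mean(axis=1)
f = ecf / np.exp(-t**2 / 2.0)

# Hankel matrix of the deconvolved measurements: its numerical rank equals
# the number of mixture components (matrix-pencil heuristic).
H = np.array([[f[i + j] for j in range(5)] for i in range(5)])
sv = np.linalg.svd(H, compute_uv=False)
order = int(np.sum(sv > 0.1 * sv[0]))          # crude singular-value threshold
print("estimated model order:", order)
```

In practice the grid width and threshold must be tuned to the mean separation and the sample size, since deconvolving the Gaussian factor amplifies noise at large frequencies; this trade-off is exactly what a sample-complexity analysis quantifies.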