Minimax learning rates for estimating binary classifiers under margin conditions

arXiv:2505.10628v2 Announce Type: replace
Abstract: We study classification problems using binary estimators where the decision boundary is described by horizon functions and where the data distribution satisfies a geometric margin condition. A key novelty of our work is the derivation of lower bounds for the worst-case learning rates over broad classes of functions under a geometric margin condition, a setting that is almost universally satisfied in practice but remains theoretically challenging. Moreover, we work in the noiseless setting, where lower bounds are particularly hard to establish. Our general results cover, in particular, classification problems with decision boundaries belonging to several classes of functions: for Barron-regular functions, H\"older-continuous functions, and convex-Lipschitz functions with strong margins, we identify optimal rates close to the fast learning rates of $\mathcal{O}(n^{-1})$ for $n \in \mathbb{N}$ samples.
