Separating Oblivious and Adaptive Models of Variable Selection

arXiv:2602.16568v1 Announce Type: cross
Abstract: Sparse recovery is among the most well-studied problems in learning theory and high-dimensional statistics. In this work, we investigate the statistical and computational landscapes of sparse recovery with $\ell_\infty$ error guarantees. This variant of the problem is motivated by \emph{variable selection} tasks, where the goal is to estimate the support of a $k$-sparse signal in $\mathbb{R}^d$. Our main contribution is a provable separation between the \emph{oblivious} ("for each") and \emph{adaptive} ("for all") models of $\ell_\infty$ sparse recovery. We show that under an oblivious model, the optimal $\ell_\infty$ error is attainable in near-linear time with $\approx k\log d$ samples, whereas in an adaptive model, $\gtrsim k^2$ samples are necessary for any algorithm to achieve this bound. This establishes a surprising contrast with the standard $\ell_2$ setting, where $\approx k \log d$ samples suffice even for adaptive sparse recovery. We conclude with a preliminary examination of a \emph{partially-adaptive} model, where we show nontrivial variable selection guarantees are possible with $\approx k\log d$ measurements.
