A Regularization-Sharpness Tradeoff for Linear Interpolators
arXiv:2602.12680v1 Announce Type: new Abstract: The rule of thumb relating the bias-variance tradeoff to model size plays a key role in classical machine learning, but is now well known to break down in the overparameterized setting, as illustrated by the double descent curve. In particular, minimum-norm interpolating estimators can perform well, suggesting the need for a new tradeoff in these settings. Accordingly, we propose a regularization-sharpness tradeoff for overparameterized linear regression with an $\ell^p$ penalty. Inspired by the […]
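As a minimal sketch of the minimum-norm interpolators the abstract refers to (this is illustrative only, not the paper's construction): in the overparameterized regime, with more features than samples, the linear system has infinitely many exact solutions, and the minimum $\ell^2$-norm one is given by the Moore-Penrose pseudoinverse.

```python
import numpy as np

# Sketch: minimum l2-norm interpolation in overparameterized linear
# regression (d features > n samples). Among the infinitely many weight
# vectors fitting the data exactly, the pseudoinverse picks the one of
# smallest Euclidean norm. All names and sizes here are illustrative.

rng = np.random.default_rng(0)
n, d = 20, 100                      # overparameterized: d > n
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d) / np.sqrt(d)
y = X @ w_true                      # noiseless targets for simplicity

# Minimum-norm interpolator: w_hat = X^+ y
w_hat = np.linalg.pinv(X) @ y

# It interpolates the training data: residual is numerically zero.
train_residual = np.linalg.norm(X @ w_hat - y)
print(train_residual < 1e-8)
```

Note that the pseudoinverse yields the $\ell^2$-minimal solution specifically; the abstract's $\ell^p$-penalized estimators for general $p$ would require a different solver.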