Refining Covariance Matrix Estimation in Stochastic Gradient Descent Through Bias Reduction
arXiv:2604.21203v1 Announce Type: new
Abstract: We study online inference and asymptotic covariance estimation for the stochastic gradient descent (SGD) algorithm. While classical methods (such as plug-in and batch-means estimators) are available, they either require inaccessible second-order (Hessian) information or suffer from slow convergence. To address these challenges, we propose a novel, fully online de-biased covariance estimator that eliminates the need for second-order derivatives while significantly improving estimation accuracy. Our method employs a bias-reduction technique to achieve a convergence rate of $n^{(\alpha-1)/2}\sqrt{\log n}$, outperforming existing Hessian-free alternatives.
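The abstract does not spell out the de-biased estimator itself, but the Hessian-free baseline it improves on, the batch-means estimator, can be sketched. The snippet below is a minimal illustration under assumed choices (linear-regression stream, step size $\gamma_t = 0.5\,t^{-0.501}$, batch size 200, all hypothetical): it runs SGD, averages the iterates, and estimates the asymptotic covariance of the averaged iterate from non-overlapping batches, using no second-order information. For clarity it stores the iterates and computes the estimate at the end; the same quantities can be accumulated fully online.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (not from the paper): dimension, samples, batch size.
d, n, b = 3, 20000, 200
theta_star = np.array([1.0, -0.5, 2.0])   # hypothetical true parameter

theta = np.zeros(d)                        # current SGD iterate
iterates = np.empty((n, d))

for t in range(1, n + 1):
    x = rng.standard_normal(d)
    y = x @ theta_star + rng.standard_normal()
    grad = (x @ theta - y) * x             # gradient of 0.5 * (x @ theta - y)**2
    theta -= 0.5 * t ** (-0.501) * grad    # step size gamma_t = 0.5 * t^(-alpha)
    iterates[t - 1] = theta

theta_bar = iterates.mean(axis=0)          # Polyak-Ruppert averaged iterate

# Batch-means estimate of Sigma in sqrt(n) * (theta_bar - theta*) ~ N(0, Sigma):
# average the iterates within each non-overlapping batch, then take the
# (scaled) sample covariance of the batch means around the overall average.
batch_means = iterates.reshape(n // b, b, d).mean(axis=1)
dev = batch_means - theta_bar
Sigma_hat = b * (dev.T @ dev) / (n // b)

print(np.round(Sigma_hat, 2))
```

The estimate `Sigma_hat` is symmetric positive semidefinite by construction; its slow convergence in $n$ is exactly the drawback the paper's bias-reduction technique targets.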