DS FedProxGrad: Asymptotic Stationarity Without Noise Floor in Fair Federated Learning
arXiv:2512.08671v4 Announce Type: replace-cross
Abstract: Recent work \cite{arifgroup} introduced Federated Proximal Gradient \textbf{(\texttt{FedProxGrad})} for solving non-convex composite optimization problems in group-fair federated learning. However, the original analysis established convergence only to a \textit{noise-dominated neighborhood of stationarity}, with explicit dependence on a variance-induced noise floor. In this work, we provide an improved asymptotic convergence analysis for a generalized \texttt{FedProxGrad}-type analytical framework with inexact local proximal solutions and explicit fairness regularization. We call this extended framework \textbf{DS \texttt{FedProxGrad}} (Decay Step Size \texttt{FedProxGrad}). Under a Robbins-Monro step-size schedule \cite{robbins1951stochastic} and a mild decay condition on the local inexactness, we prove that $\liminf_{r\to\infty} \mathbb{E}[\|\nabla F(\mathbf{x}^r)\|^2] = 0$, i.e., the algorithm is asymptotically stationary and its convergence is not limited by a variance-induced noise floor.
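To make the step-size and inexactness conditions concrete, below is a minimal sketch of a DS \texttt{FedProxGrad}-style loop. It is not the paper's algorithm: the quadratic client losses, the $\ell_1$ term standing in for the fairness regularizer, the schedule $\eta_r = \eta_0/(r+1)$ (which satisfies the Robbins-Monro conditions $\sum_r \eta_r = \infty$, $\sum_r \eta_r^2 < \infty$), and the $O(1/r^2)$ inexactness budget are all illustrative assumptions.

\begin{verbatim}
import numpy as np

# Hypothetical setup: quadratic client losses f_k(x) = 0.5*||A_k x - b_k||^2
# and an L1 regularizer as a stand-in for the fairness penalty. All names
# here (A, b, lam, eta0) are illustrative, not from the paper.
rng = np.random.default_rng(0)
d, K = 20, 5                      # dimension, number of clients
A = [rng.standard_normal((30, d)) for _ in range(K)]
b = [rng.standard_normal(30) for _ in range(K)]
lam = 0.1                         # regularization weight

def soft_threshold(v, t):
    # Exact prox of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_grad(k, x):
    # Mini-batch gradient of client k's smooth loss (source of variance).
    idx = rng.choice(A[k].shape[0], size=10, replace=False)
    Ak, bk = A[k][idx], b[k][idx]
    return Ak.T @ (Ak @ x - bk) / len(idx)

x = np.zeros(d)
eta0 = 0.5
for r in range(2000):
    eta = eta0 / (r + 1)          # Robbins-Monro: sum eta = inf, sum eta^2 < inf
    eps = 1.0 / (r + 1) ** 2      # decaying inexactness budget for the prox
    # Server averages one stochastic gradient per client per round.
    g = np.mean([stochastic_grad(k, x) for k in range(K)], axis=0)
    # Inexact proximal step: the exact prox plus an eps-bounded perturbation,
    # modeling clients that solve their local subproblems only approximately.
    x_exact = soft_threshold(x - eta * g, eta * lam)
    x = x_exact + eps * rng.standard_normal(d) / np.sqrt(d)
\end{verbatim}

The point of the schedule is that $\sum_r \eta_r^2 < \infty$ tames the accumulation of gradient noise while $\sum_r \eta_r = \infty$ supplies enough total movement to reach stationarity, and the summably decaying prox inexactness keeps the approximate local solves from introducing a floor of their own.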