Geometric Physics-Aware Gradient Descent (GPAGD): A Lightweight Optimizer Integrating Manifold Geometry, PDE Residuals and Local Uncertainty

Standard gradient descent optimizers treat the parameter space as Euclidean and ignore the underlying geometry of the data and the physical constraints that solutions must satisfy. This paper introduces Geometric Physics-Aware Gradient Descent (GPAGD), a novel first-order optimizer that incorporates three inductive biases: (i) a manifold projection that aligns gradients with the tangent space of the input coordinates, (ii) a physics gate that scales the step size by the exponential of the PDE residual, and (iii) an uncertainty gate that uses local entropy to dampen steps in noisy regions. A capacity scaler automatically adjusts the learning rate to the dataset size and global noise level. GPAGD is a drop-in replacement for Adam in physics-informed neural networks (PINNs) and adds negligible computational overhead. We evaluate GPAGD on four PDE benchmarks (Poisson 1D, Burgers 1D, Darcy 2D, and Taylor-Green 2D) using 3,000 epochs and 3 random seeds on a Colab T4 GPU. On the elliptic Darcy 2D problem, GPAGD achieves a relative L2 error of 1.0002 ± 0.0002, outperforming Adam (2.988 ± 0.198) and L-BFGS (36.57 ± 16.65), with a paired t-test p-value of 0.0049. On Poisson 1D, GPAGD reduces the error from 19.95 (Adam) to 10.25 (p = 0.0515, marginal). On Burgers and Taylor-Green, GPAGD performs comparably to Adam (errors within 12%). An earlier ablation study (omitted here due to compute limits) confirmed that the physics gate is critical for the improvement on Darcy. The code, convergence plots, and results are publicly available. GPAGD offers a principled, lightweight way to inject geometry, physics, and uncertainty into gradient descent, with significant gains on elliptic PDEs.
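To make the three inductive biases concrete, the following is a minimal sketch of one GPAGD-style update in NumPy. It is not the authors' implementation: the exact functional forms of the gates and the capacity scaler are assumptions (in particular, the sign convention exp(-|r|) for the physics gate and the specific damping and scaling formulas), and the names `gpagd_step`, `tangent_basis`, `pde_residual`, and `local_entropy` are illustrative.

```python
import numpy as np

def gpagd_step(params, grad, tangent_basis, pde_residual, local_entropy,
               lr=1e-3, n_data=1000, noise_level=0.1):
    """One illustrative GPAGD-style update (a sketch, not the paper's code).

    tangent_basis: (d, k) matrix with orthonormal columns spanning the
    tangent space of the input-coordinate manifold.
    """
    # (i) Manifold projection: keep only the tangential component of the gradient.
    g = tangent_basis @ (tangent_basis.T @ grad)
    # (ii) Physics gate: shrink the step where the PDE residual is large.
    # (The abstract says "exponential of the PDE residual"; exp(-|r|) is an
    # assumed sign convention that dampens steps under large residuals.)
    physics_gate = np.exp(-abs(pde_residual))
    # (iii) Uncertainty gate: dampen steps in high-entropy (noisy) regions.
    # (Specific 1/(1+H) form is an assumption.)
    uncertainty_gate = 1.0 / (1.0 + local_entropy)
    # Capacity scaler: adapt the base rate to dataset size and global noise.
    # (A plausible form; the abstract does not specify the formula.)
    capacity = 1.0 / (1.0 + noise_level * np.log1p(n_data))
    return params - lr * capacity * physics_gate * uncertainty_gate * g
```

In this sketch the three gates compose multiplicatively with the learning rate, so each bias can only shrink (never enlarge) the raw gradient step, which is one simple way to keep the method a drop-in replacement for a first-order optimizer.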
