EPANG-Gen: A Robust Curvature-Aware Optimizer with Uncertainty Quantification for Scientific Machine Learning

Physics-informed neural networks (PINNs) have emerged as powerful tools for solving partial differential equations, but their training remains challenging due to ill-conditioned loss landscapes. While adaptive methods like Adam dominate deep learning, they exhibit instability on stiff PDEs, and second-order methods are computationally prohibitive. We present EPANG-Gen (Enhanced Physics-Aware Natural Gradient with Generalization), a novel optimizer that combines memory-efficient eigendecomposition with lightweight Bayesian uncertainty quantification. EPANG-Gen introduces three key innovations: (1) a randomized eigenspace estimator that approximates Hessian curvature with O(dk) memory (k ≪ d), (2) Bayesian R-LayerNorm for per-activation uncertainty estimation, and (3) adaptive rank selection (PASA) that dynamically adjusts to problem difficulty. We evaluate EPANG-Gen on four benchmark PDEs—Poisson 1D, Burgers’ equation, Darcy flow, and Helmholtz 2D—and on the challenging Taylor-Green vortex at Re = 100,000, a canonical 3D turbulence problem. Results show that EPANG-Gen matches Adam’s performance on the toughest turbulent regime while eliminating the 25% catastrophic failure rate of ADOPT across 72 runs. Ablation studies confirm that eigen-preconditioning improves performance by 11–35%. The built-in uncertainty estimates provide actionable confidence metrics at negligible cost. EPANG-Gen represents the first optimizer specifically designed for geometric and physical AI that combines theoretical convergence guarantees with practical robustness for safety-critical applications.
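To illustrate the idea behind innovation (1), the following is a minimal NumPy sketch of randomized subspace iteration for estimating the top-k eigenpairs of a Hessian using only Hessian-vector products, storing O(dk) numbers rather than the full d × d matrix. This is an assumption-laden illustration, not the paper's implementation: the function name, iteration count, and synthetic spectrum are all hypothetical.

```python
# Illustrative sketch only: a generic randomized eigenspace estimator,
# not EPANG-Gen's actual algorithm. The Hessian is accessed solely
# through a matrix-free Hessian-vector-product callback `hvp`.
import numpy as np

def randomized_eigenspace(hvp, d, k, n_iter=4, seed=0):
    """Approximate the top-k eigenpairs of a symmetric d x d operator.

    hvp(V) must return H @ V for a (d, m) block V; H itself is never
    materialized, so peak memory is O(d * k).
    """
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random O(dk) subspace
    for _ in range(n_iter):                           # power iterations sharpen it
        Q, _ = np.linalg.qr(hvp(Q))
    T = Q.T @ hvp(Q)                                  # small k x k projected Hessian
    evals, U = np.linalg.eigh(T)                      # cheap dense eigensolve
    return evals[::-1], (Q @ U)[:, ::-1]              # descending eigenpairs

# Usage on a synthetic SPD "Hessian" with a known, gapped spectrum.
rng = np.random.default_rng(1)
d, k = 200, 5
B, _ = np.linalg.qr(rng.standard_normal((d, d)))
lams = np.concatenate([[100.0, 80.0, 60.0, 40.0, 20.0], np.full(d - 5, 0.1)])
H = (B * lams) @ B.T                                  # H = B diag(lams) B^T
evals, V = randomized_eigenspace(lambda X: H @ X, d, k)
```

With a clear spectral gap, the estimated `evals` recover the planted leading eigenvalues to several digits; in an optimizer, `hvp` would be supplied by automatic differentiation (e.g., a Pearlmutter-style Hessian-vector product) rather than an explicit matrix.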
