CHLU: The Causal Hamiltonian Learning Unit as a Symplectic Primitive for Deep Learning
Current deep learning primitives for temporal dynamics suffer from a fundamental dichotomy: they are either discrete and unstable, as in LSTMs \citep{pascanu_difficulty_2013}, which exhibit exploding or vanishing gradients, or continuous and dissipative, as in Neural ODEs \citep{dupont_augmented_2019}, which destroy information over time to ensure stability. We propose the \textbf{Causal Hamiltonian Learning Unit} (CHLU, pronounced \textit{clue}), a novel physics-grounded computational learning primitive. By enforcing a relativistic Hamiltonian structure and using symplectic integration, the CHLU strictly conserves phase-space volume, addressing the memory-stability trade-off. We show that the CHLU is designed for infinite-horizon stability as well as controllable noise filtering, and we demonstrate its generative ability on the MNIST dataset as a proof of principle.
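The contrast the abstract draws between dissipative/unstable integrators and volume-conserving symplectic ones can be illustrated with a minimal numerical sketch. The example below is not the CHLU itself; it is a generic hedged illustration, assuming a simple separable Hamiltonian $H = p^2/2 + q^2/2$ (a harmonic oscillator), comparing a symplectic leapfrog (St\"ormer--Verlet) step against a non-symplectic explicit Euler step. Over many steps the leapfrog energy stays bounded near its initial value, while the Euler energy grows without bound, mirroring the stability dichotomy described above.

```python
def leapfrog_step(q, p, dt):
    # Symplectic (Stormer-Verlet) update for H = p^2/2 + q^2/2:
    # half-kick on momentum, full drift on position, half-kick again.
    p = p - 0.5 * dt * q   # dH/dq = q acts as the force term
    q = q + dt * p         # dH/dp = p acts as the velocity term
    p = p - 0.5 * dt * q
    return q, p

def euler_step(q, p, dt):
    # Non-symplectic explicit Euler update for the same Hamiltonian;
    # each step inflates phase-space radius by a factor (1 + dt^2)^(1/2).
    return q + dt * p, p - dt * q

def energy(q, p):
    return 0.5 * (p * p + q * q)

q1 = q2 = 1.0
p1 = p2 = 0.0
e0 = energy(q1, p1)
for _ in range(10000):
    q1, p1 = leapfrog_step(q1, p1, 0.01)
    q2, p2 = euler_step(q2, p2, 0.01)

# Leapfrog energy remains close to e0; Euler energy has drifted far above it.
print(energy(q1, p1), energy(q2, p2))
```

The symplectic scheme does not conserve the energy exactly, but its error stays bounded for all time (it exactly conserves a nearby "shadow" Hamiltonian), which is the property the CHLU exploits for infinite-horizon stability.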