Complex-Valued Probability Measures and Their Applications in Information Theory
arXiv:2603.12297v1 Announce Type: new
Abstract: This paper introduces a comprehensive framework for complex-valued probability measures and explores their novel applications in information theory and statistical analysis. We define a complex probability measure as a phase-modulated extension of a classical probability measure. Building upon this foundation, we propose three fundamental information-theoretic quantities: complex entropy, which quantifies distribution uniformity through phase coherence; complex divergence, an asymmetric measure of dissimilarity between distributions; and the complex metric, a symmetric distance function satisfying the triangle inequality. We establish these concepts rigorously for both continuous and discrete probability distributions, proving key properties such as boundedness, continuity under total variation convergence, and well-defined extremal behavior. A detailed comparative analysis with classical measures (Shannon entropy and Kullback-Leibler divergence) highlights the unique geometric and interpretive advantages of the proposed framework, particularly its sensitivity to distributional shape via a tunable phase parameter. We elucidate a formal analogy between the complex entropy integral and Feynman's path integral formulation of quantum mechanics, suggesting a deeper conceptual bridge. Finally, we demonstrate the practical utility of the complex metric through a detailed application in nonparametric two-sample hypothesis testing, outlining the testing procedure, its advantages and limitations, and providing a conceptual simulation. This work opens new avenues for analyzing probability distributions through the lens of complex analysis and interference phenomena, with potential impacts across information theory, statistical inference, and machine learning.
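The abstract does not spell out the definitions of the complex measure or the complex metric, so the following is only a minimal sketch of the kind of two-sample permutation test the abstract describes, under stated assumptions: the phase-modulated embedding p_k -> p_k * exp(i * THETA * p_k), the L1 distance between embedded distributions as a stand-in for the paper's complex metric, and the binning and permutation scheme are all illustrative choices, not the authors' construction.

```python
import numpy as np

# Hypothetical phase parameter; the abstract mentions a tunable phase
# parameter but does not fix a value.
THETA = np.pi / 2


def complex_weights(p, theta=THETA):
    """Illustrative guess at a phase-modulated extension of a discrete
    distribution: p_k -> p_k * exp(i * theta * p_k). Not the paper's
    exact definition."""
    p = np.asarray(p, dtype=float)
    return p * np.exp(1j * theta * p)


def complex_metric(p, q, theta=THETA):
    """Stand-in 'complex metric': L1 distance between phase-modulated
    embeddings. As a norm of a difference it is symmetric and satisfies
    the triangle inequality, mirroring the properties claimed in the
    abstract, but the actual definition may differ."""
    return np.abs(complex_weights(p, theta) - complex_weights(q, theta)).sum()


def two_sample_permutation_test(x, y, bins=20, n_perm=2000, seed=None):
    """Nonparametric two-sample test: bin both samples on common edges,
    compute the stand-in complex metric between the empirical
    distributions, and calibrate it with label permutations."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    edges = np.histogram_bin_edges(pooled, bins=bins)

    def hist(sample):
        h, _ = np.histogram(sample, bins=edges)
        return h / h.sum()

    observed = complex_metric(hist(x), hist(y))
    n_x = len(x)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if complex_metric(hist(perm[:n_x]), hist(perm[n_x:])) >= observed:
            count += 1
    # Permutation p-value with the usual +1 correction.
    return observed, (count + 1) / (n_perm + 1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, 300)
    y = rng.normal(0.5, 1.0, 300)  # shifted alternative
    stat, p_value = two_sample_permutation_test(x, y, seed=1)
    print(f"statistic: {stat:.4f}, permutation p-value: {p_value:.4f}")
```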