A Local Characterization of $f$-Divergences Yielding PSD Mutual-Information Matrices
arXiv:2601.08929v1 Announce Type: new
Abstract: We study when the variable-indexed matrix of pairwise $f$-mutual informations $M^{(f)}_{ij}=I_f(X_i;X_j)$ is positive semidefinite (PSD). Let $f:(0,\infty)\to\mathbb{R}$ be convex with $f(1)=0$, finite in a neighborhood of $1$, and with $f(0)<\infty$ so that the diagonal terms are finite. We give a sharp \emph{local} characterization around independence: there exists $\delta=\delta(f)>0$ such that for every $n$ and every finite-alphabet family $(X_1,\ldots,X_n)$ whose pairwise joint-to-product ratios lie in $(1-\delta,1+\delta)$, the matrix $M^{(f)}$ is PSD if and only if $f$ is analytic at $1$, with a convergent expansion $f(t)=\sum_{m=2}^{\infty} a_m (t-1)^m$ and $a_m\ge 0$, on a neighborhood of $1$. Consequently, any negative Taylor coefficient yields an explicit finite-alphabet counterexample under arbitrarily weak dependence, and non-analytic convex divergences (e.g. total variation) are excluded. This PSD requirement is distinct from Hilbertian/metric properties of divergences between distributions (e.g. $\sqrt{\mathrm{JS}}$): we study positive semidefiniteness of the \emph{variable-indexed} mutual-information matrix. The proof combines a replica embedding, which turns monomial terms into Gram matrices, with a replica-forcing reduction to positive-definite dot-product kernels, enabling an application of the Schoenberg–Berg–Christensen–Ressel classification.
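To make the objects concrete, here is a minimal numerical sketch (ours, not from the paper) that computes the matrix $M^{(f)}_{ij}=I_f(X_i;X_j)$ for a small near-product joint pmf and inspects its spectrum. All names (f_chi2, f_kl, pairwise_f_mi_matrix) and the perturbation scheme are illustrative assumptions. The $\chi^2$ generator $f(t)=(t-1)^2$ has $a_2=1$ and $a_m=0$ for $m\ge 3$, so it satisfies the abstract's coefficient condition; KL, normalized to $f(t)=t\log t-(t-1)$ (subtracting the affine part leaves $I_f$ unchanged and makes $f'(1)=0$), has $a_3=f'''(1)/3!=-1/6<0$, so by the stated theorem some weak-dependence counterexample exists, though a given random instance need not exhibit one.

```python
import numpy as np

def f_chi2(t):
    """chi^2 generator: f(t) = (t-1)^2, so a_2 = 1 and a_m = 0 for m >= 3."""
    return (t - 1.0) ** 2

def f_kl(t):
    """KL generator normalized so f'(1) = 0: f(t) = t log t - (t - 1).
    Then f(0+) = 1 (finite, as the abstract requires) and a_3 = -1/6 < 0."""
    t = np.asarray(t, dtype=float)
    safe = np.where(t > 0, t, 1.0)           # placeholder to avoid log(0)
    return np.where(t > 0, t * np.log(safe) - (t - 1.0), 1.0)

def pairwise_f_mi_matrix(p, f):
    """M[i, j] = I_f(X_i; X_j) = sum_{x,y} p_i(x) p_j(y) f(p_ij(x,y) / (p_i(x) p_j(y)))
    for a joint pmf p given as an n-dimensional array, one axis per variable."""
    n = p.ndim
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                pi = p.sum(axis=tuple(k for k in range(n) if k != i))
                pij = np.diag(pi)            # joint law of (X_i, X_i)
            else:
                pij = p.sum(axis=tuple(k for k in range(n) if k not in (i, j)))
                if i > j:                    # summation keeps axes in index order
                    pij = pij.T
            prod = np.outer(pij.sum(axis=1), pij.sum(axis=0))
            M[i, j] = float(np.sum(prod * f(pij / prod)))
    return M

# Weak dependence: a small multiplicative perturbation of the uniform product
# pmf over three binary variables, keeping joint-to-product ratios near 1.
rng = np.random.default_rng(0)
p = (np.ones((2, 2, 2)) / 8) * (1.0 + 0.05 * rng.uniform(-1, 1, size=(2, 2, 2)))
p /= p.sum()

for name, f in [("chi^2 (a_m >= 0)", f_chi2), ("KL (a_3 = -1/6 < 0)", f_kl)]:
    M = pairwise_f_mi_matrix(p, f)
    print(f"{name}: min eigenvalue = {np.linalg.eigvalsh(M).min():.3e}")
```

Note that the theorem's $\delta(f)$ is not made explicit here, and the abstract only guarantees the \emph{existence} of a finite-alphabet counterexample when some $a_m<0$; a negative eigenvalue for the KL matrix on any particular random instance is neither guaranteed nor precluded by this sketch.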