Complex Interpolation of Matrices with an application to Multi-Manifold Learning

Given two symmetric positive-definite matrices $A, B \in \mathbb{R}^{n \times n}$, we study the spectral properties of the interpolation $A^{1-x} B^x$ for $0 \leq x \leq 1$. The presence of 'common structures' in $A$ and $B$, i.e. eigenvectors pointing in similar directions, can be investigated from this interpolation perspective. Generically, exact log-linearity of the operator norm $\|A^{1-x} B^x\|$ is equivalent to the existence of a shared eigenvector of the original matrices; stability bounds show that approximate log-linearity forces principal singular vectors to align with leading eigenvectors of both matrices. These results give rise to, and provide theoretical justification for, a multi-manifold learning framework that identifies common and distinct latent structures in multiview data.
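The log-linearity claim can be checked numerically in a simple setting. The sketch below (an illustrative assumption, not the paper's construction) builds two commuting SPD matrices whose leading eigenvector is shared, computes fractional matrix powers by eigendecomposition, and verifies that $\|A^{1-x} B^x\| = \|A\|^{1-x} \|B\|^x$ along the interpolation path:

```python
import numpy as np

def spd_power(M, p):
    # Fractional power of a symmetric positive-definite matrix
    # via its eigendecomposition: M^p = V diag(w^p) V^T.
    w, V = np.linalg.eigh(M)
    return (V * w**p) @ V.T

rng = np.random.default_rng(0)
n = 4
# Shared orthonormal eigenbasis (illustrative choice: A and B commute).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Both matrices have their largest eigenvalue on the first column of Q,
# so they share a leading eigenvector.
A = (Q * np.array([5.0, 2.0, 1.0, 0.5])) @ Q.T
B = (Q * np.array([7.0, 3.0, 1.5, 0.2])) @ Q.T

xs = np.linspace(0.0, 1.0, 11)
norms = [np.linalg.norm(spd_power(A, 1 - x) @ spd_power(B, x), 2) for x in xs]

# Exact log-linearity: the norm interpolates geometrically between ||A|| and ||B||.
predicted = [5.0 ** (1 - x) * 7.0 ** x for x in xs]
assert np.allclose(norms, predicted)
```

When the leading eigenvectors are not shared, the curve $x \mapsto \log \|A^{1-x} B^x\|$ is generically strictly convex rather than linear, which is what makes log-linearity a usable detector of common structure.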
