Low-Rank Approximation by Randomly Pivoted LU

arXiv:2601.22344v1 Announce Type: new
Abstract: The low-rank approximation properties of Randomly Pivoted LU (RPLU), a variant of Gaussian elimination in which pivots are sampled with probability proportional to the squared entries of the Schur complement, are analyzed. It is shown that the RPLU iterates converge geometrically in expectation for matrices with rapidly decaying singular values. RPLU outperforms existing low-rank approximation algorithms in two settings. First, when memory is limited, a rank-$k$ approximation via RPLU can be computed with $\mathcal{O}(k^2 + m + n)$ storage and $\mathcal{O}(k(m + n) + k\mathcal{M}(\mathbf{A}) + k^3)$ operations, where $\mathcal{M}(\mathbf{A})$ is the cost of a matrix-vector product with $\mathbf{A}\in\mathbb{C}^{n\times m}$ or its adjoint. Second, RPLU excels when the matrix and its Schur complements share exploitable structure, as for Cauchy-like matrices. The efficacy of RPLU is illustrated with several examples, including applications in rational approximation and solving large linear systems on GPUs.
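The sampling rule described above can be sketched in a few lines of NumPy. This is a dense, illustrative version only, not the paper's memory-efficient implementation: it forms the full Schur complement explicitly (so it uses $\mathcal{O}(nm)$ storage rather than $\mathcal{O}(k^2 + m + n)$), and the function name and interface are invented for this sketch. At each step a pivot $(i, j)$ is drawn with probability proportional to the squared Schur-complement entries, and a rank-one update is applied.

```python
import numpy as np

def rplu_sketch(A, k, seed=None):
    """Illustrative Randomly Pivoted LU (dense sketch, not the
    paper's low-memory algorithm). Returns factors L (n x k) and
    U (k x m) with A approximately equal to L @ U."""
    rng = np.random.default_rng(seed)
    S = np.array(A, dtype=float)      # current Schur complement
    n, m = S.shape
    L = np.zeros((n, k))
    U = np.zeros((k, m))
    for t in range(k):
        weights = S.ravel() ** 2
        total = weights.sum()
        if total == 0.0:              # numerical rank reached early
            return L[:, :t], U[:t, :]
        # sample pivot (i, j) with probability proportional to S[i, j]^2
        idx = rng.choice(S.size, p=weights / total)
        i, j = divmod(idx, m)
        L[:, t] = S[:, j] / S[i, j]
        U[t, :] = S[i, :]
        S -= np.outer(L[:, t], U[t, :])   # rank-1 Schur complement update
    return L, U

# Usage: a rank-3 matrix is recovered (to roundoff) after 3 pivot steps,
# since each nonzero pivot reduces the rank of the Schur complement by one.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
L, U = rplu_sketch(A, 3, seed=1)
rel_err = np.linalg.norm(A - L @ U) / np.linalg.norm(A)
```

Because the pivot is always sampled from nonzero entries, division by the pivot is safe, and on an exactly rank-$k$ matrix the residual Schur complement vanishes after $k$ steps.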
