Query Lower Bounds for Diffusion Sampling

arXiv:2604.10857v1 Announce Type: cross
Abstract: Diffusion models generate samples by iteratively querying learned score estimates. A rapidly growing literature focuses on accelerating sampling by minimizing the number of score evaluations, yet the information-theoretic limits of such acceleration remain unclear.
In this work, we establish the first score query lower bounds for diffusion sampling. We prove that for $d$-dimensional distributions, given access to score estimates with polynomial accuracy $\varepsilon = d^{-O(1)}$ (in any $L^p$ sense), any sampling algorithm requires $\widetilde{\Omega}(\sqrt{d})$ adaptive score queries. In particular, our proof shows that any sampler must search over $\widetilde{\Omega}(\sqrt{d})$ distinct noise levels, providing a formal explanation for why multiscale noise schedules are necessary in practice.
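To make the query model concrete, here is a minimal sketch (not the paper's construction) of an annealed-Langevin-style sampler that iterates over a multiscale noise schedule and counts one score query per step; the function names, step-size rule, and toy Gaussian score are illustrative assumptions:

```python
import numpy as np

def sample_annealed_langevin(score, d, noise_levels, steps_per_level=1,
                             step_size=1e-2, rng=None):
    """Illustrative sketch: sample by sweeping a decreasing noise schedule,
    querying the learned score once per Langevin step. Returns the final
    iterate and the total number of score queries."""
    rng = np.random.default_rng(rng)
    x = rng.standard_normal(d)        # start from pure noise
    queries = 0
    for sigma in noise_levels:        # e.g. ~sqrt(d) distinct noise levels
        eta = step_size * sigma**2    # common heuristic step-size scaling
        for _ in range(steps_per_level):
            g = score(x, sigma)       # one adaptive score query
            queries += 1
            x = x + eta * g + np.sqrt(2 * eta) * rng.standard_normal(d)
    return x, queries

# Toy example: the exact noisy score of N(0, I) is -x / (1 + sigma^2).
d = 16
sigmas = np.geomspace(10.0, 0.01, num=int(np.sqrt(d)))  # sqrt(d) levels
x, n_queries = sample_annealed_langevin(
    lambda x, s: -x / (1 + s**2), d, sigmas)
print(n_queries)  # len(sigmas) * steps_per_level queries in total
```

Under this accounting, the query budget scales with the number of noise levels times the steps per level, which is the quantity the lower bound constrains.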