Sequential Monte Carlo approximations of Wasserstein–Fisher–Rao gradient flows

arXiv:2506.05905v2 Announce Type: replace-cross
Abstract: We consider the problem of sampling from a probability distribution $\pi$. It is well known that this can be written as an optimisation problem over the space of probability distributions in which we aim to minimise the Kullback–Leibler divergence from $\pi$. We consider several partial differential equations (PDEs) whose solution is a minimiser of the Kullback–Leibler divergence from $\pi$ and connect them to well-known Monte Carlo algorithms. We focus in particular on PDEs obtained by considering the Wasserstein–Fisher–Rao geometry over the space of probabilities and show that these lead to a natural implementation using importance sampling and sequential Monte Carlo. We propose a novel algorithm to approximate the Wasserstein–Fisher–Rao flow of the Kullback–Leibler divergence and conduct an extensive empirical study to identify when these algorithms outperform other popular Monte Carlo algorithms.
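The connection the abstract draws can be illustrated with a standard annealed SMC sampler, in which incremental importance reweighting plays the role of the Fisher–Rao (mass reallocation) component and MCMC particle moves play the role of the Wasserstein (transport) component. The sketch below is a generic tempered SMC sampler on a one-dimensional toy target, not the paper's proposed algorithm; the target, schedule, and kernel choices are illustrative assumptions.

```python
import numpy as np

def annealed_smc(log_target, n_particles=2000, n_temps=30, rw_scale=0.5, seed=0):
    """Generic annealed SMC sampler targeting pi_t ∝ pi_0^{1-beta_t} pi^{beta_t}.

    Illustrative sketch only: reweighting/resampling mimics the Fisher-Rao
    part of the flow, Metropolis moves mimic the Wasserstein (transport) part.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_particles)               # particles drawn from pi_0 = N(0, 1)
    log_pi0 = lambda y: -0.5 * y**2                # unnormalised log density of pi_0
    betas = np.linspace(0.0, 1.0, n_temps + 1)     # linear tempering schedule (assumption)
    logw = np.zeros(n_particles)                   # log importance weights

    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Fisher-Rao-like step: incremental importance reweighting
        logw += (b - b_prev) * (log_target(x) - log_pi0(x))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        ess = 1.0 / np.sum(w**2)                   # effective sample size
        if ess < n_particles / 2:                  # resample when ESS degenerates
            idx = rng.choice(n_particles, size=n_particles, p=w)
            x, logw = x[idx], np.zeros(n_particles)
        # Wasserstein-like step: random-walk Metropolis move targeting pi_t
        log_pi_t = lambda y: (1 - b) * log_pi0(y) + b * log_target(y)
        prop = x + rw_scale * rng.normal(size=n_particles)
        accept = np.log(rng.uniform(size=n_particles)) < log_pi_t(prop) - log_pi_t(x)
        x = np.where(accept, prop, x)

    w = np.exp(logw - logw.max())
    w /= w.sum()
    return x, w

# Toy example: sample an (unnormalised) N(3, 1) target starting from N(0, 1).
x, w = annealed_smc(lambda y: -0.5 * (y - 3.0) ** 2)
print(np.sum(w * x))  # weighted mean estimate, close to 3
```

The weighted particle system returned at the final temperature approximates $\pi$; in the WFR picture, the weight updates discretise the reaction (Fisher–Rao) dynamics while the particle moves discretise the transport (Wasserstein) dynamics.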
