Effective sample size approximations as entropy measures
arXiv:2602.22954v1 Announce Type: cross
Abstract: In this work, we analyze alternative effective sample size (ESS) metrics for importance sampling algorithms and discuss a possible extended range of applications. We show the relationship between the ESS expressions used in the literature and two entropy families, the Rényi and Tsallis entropies. The Rényi entropy is connected to the Huggins-Roy ESS family introduced in [Huggins15]. We prove that all the ESS functions included in the Huggins-Roy family fulfill all the desirable theoretical conditions. We also remark on connections with several other fields, such as the Hill numbers introduced in ecology, the Gini inequality coefficient employed in economics, and the Gini impurity index used mainly in machine learning.
Finally, through numerical simulations, we study how well the different ESS expressions contained in these families approximate the theoretical ESS definition, and we demonstrate the application of ESS formulas in a variable selection problem.
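As a minimal illustration of the entropy connection described above, the sketch below computes an ESS value as the exponential of the Rényi entropy of the normalized importance weights. The specific parameterization (with the classical ESS = 1/Σ w̄ᵢ² recovered at order r = 2, and the Shannon-entropy perplexity at r → 1) is an assumption based on the abstract's description of the Huggins-Roy family, not the paper's exact formulation.

```python
import numpy as np

def ess_renyi(weights, r=2.0):
    """ESS as exp(H_r), where H_r is the Renyi entropy of order r of the
    normalized weights: H_r = log(sum_i wbar_i^r) / (1 - r).
    Assumed form illustrating the entropy/ESS link; for r = 2 it reduces
    to the classical ESS = 1 / sum_i wbar_i^2."""
    w = np.asarray(weights, dtype=float)
    wbar = w / w.sum()  # normalize the importance weights
    if np.isclose(r, 1.0):
        # r -> 1 limit: exponential of the Shannon entropy (perplexity)
        logw = np.log(wbar, where=wbar > 0, out=np.zeros_like(wbar))
        return float(np.exp(-np.sum(wbar * logw)))
    return float(np.exp(np.log(np.sum(wbar ** r)) / (1.0 - r)))

# Uniform weights: every sample contributes equally, so ESS = N.
print(ess_renyi(np.ones(100), r=2.0))              # -> 100.0
# One dominant weight: effectively a single useful sample, ESS near 1.
print(ess_renyi(np.array([100.0] + [1e-6] * 99)))  # close to 1.0
```

Both extreme cases behave as the desirable theoretical conditions require: the ESS equals the sample size N for uniform weights and approaches 1 when a single weight dominates, for any order r.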