Minimax Rates for Hyperbolic Hierarchical Learning
arXiv:2601.20047v1 Announce Type: new Abstract: We prove an exponential separation in sample complexity between Euclidean and hyperbolic representations for learning on hierarchical data under standard Lipschitz regularization. For depth-$R$ hierarchies with branching factor $m$, we first establish a geometric obstruction for Euclidean space: any bounded-radius embedding forces volumetric collapse, mapping exponentially many tree-distant points to nearby locations. This necessitates Lipschitz constants scaling as $\exp(\Omega(R))$ to realize even simple hierarchical targets, yielding exponential sample complexity under capacity control. We […]
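As a hedged illustration of the underlying geometry (not code from the paper), the Python sketch below computes geodesic distance in the Poincaré disk model and compares it with Euclidean distance for two pairs of points at identical Euclidean separation, one pair near the origin and one near the boundary. Near the boundary, hyperbolic distances blow up, which is the volume-growth property that lets bounded-radius hyperbolic embeddings keep exponentially many tree leaves well separated, whereas a bounded Euclidean ball cannot:

```python
import math

def poincare_dist(u, v):
    """Geodesic distance between u and v in the Poincare disk model:
    d(u, v) = arcosh(1 + 2*||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))."""
    sq_diff = sum((a - b) ** 2 for a, b in zip(u, v))  # ||u - v||^2
    nu = sum(a * a for a in u)                         # ||u||^2
    nv = sum(b * b for b in v)                         # ||v||^2
    return math.acosh(1 + 2 * sq_diff / ((1 - nu) * (1 - nv)))

def euclid_dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Two pairs with the same Euclidean separation (0.01):
# one deep inside the disk, one close to the boundary.
inner = ([0.0, 0.0], [0.01, 0.0])
outer = ([0.99, 0.0], [0.99, 0.01])

for name, (u, v) in [("inner", inner), ("outer", outer)]:
    print(f"{name}: euclidean={euclid_dist(u, v):.4f}, "
          f"hyperbolic={poincare_dist(u, v):.4f}")
```

Although both pairs are 0.01 apart in Euclidean terms, the boundary pair is hyperbolically far apart (by roughly a factor of 50 here), so a target that separates them needs a huge Lipschitz constant with respect to the Euclidean metric but only a modest one with respect to the hyperbolic metric.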