Renormalized Volume as an Information-Capacity Functional for Hyperbolic Generative Models
Description
Hyperbolic representation learning has matured into a substantial subfield over the last decade, yet it lacks a principled, finite, conformally invariant capacity functional analogous to the differential entropy of Euclidean information theory. We propose that the renormalized volume Vren(g+) of the conformally compact Einstein (CCE) bulk geometry of a hyperbolic generative model serves precisely this role. Building on Anderson's Gauss--Bonnet identity in dimension four [And01] and the Henningson--Skenderis holographic renormalization scheme [HS98], we (i) define Vren rigorously, distinguishing the canonical Fefferman--Graham scheme from the practical Poincaré-radial scheme used by all popular hyperbolic embedding methods; (ii) give a sample-based estimator and analyse its bias--variance trade-off; (iii) formulate three conjectures: a topological capacity bound (C1), a holographic mutual-information bound (C2), and a rigidity-based identifiability statement (C3), and prove C1 unconditionally in dimension four; (iv) provide a working Python prototype (with a Julia transcription); and (v) propose five experiments testable on standard benchmarks (Poincaré WordNet, hyperbolic GCNs on graph datasets, Mip-NeRF unbounded scenes). The conjectures are open even in cases where the corresponding purely geometric statements are known, because the empirical, sample-based formulations require additional analytic work; we identify what is missing.
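The Poincaré-radial scheme admits a compact toy illustration in dimension two, the simplest analogue of Vren. The sketch below (not taken from the released prototype; the function name is illustrative) estimates the hyperbolic area inside a geodesic cutoff disk by uniform Monte Carlo sampling in the Poincaré model, then subtracts the boundary-length counterterm, so the renormalized limit is fixed by Gauss--Bonnet at -2π for the disk (χ = 1).

```python
import math
import random

def renormalized_area_disk(rho_cut, n_samples=200_000, seed=0):
    """Toy Poincare-radial estimator in dimension two (illustrative only).

    Estimates the hyperbolic area A(rho_cut) of a geodesic disk of radius
    rho_cut by uniform Monte Carlo sampling in the Poincare disk model,
    then subtracts the boundary-length counterterm
    L(rho_cut) = 2*pi*sinh(rho_cut).  As rho_cut grows, the result tends
    to -2*pi, the Gauss--Bonnet value for chi = 1.
    """
    rng = random.Random(seed)
    # Euclidean radius in the Poincare disk for geodesic radius rho_cut.
    r_cut = math.tanh(rho_cut / 2.0)
    box_area = (2.0 * r_cut) ** 2
    weight_sum = 0.0
    for _ in range(n_samples):
        # Uniform samples in the bounding square; keep those in the disk.
        x = rng.uniform(-r_cut, r_cut)
        y = rng.uniform(-r_cut, r_cut)
        s2 = x * x + y * y
        if s2 <= r_cut * r_cut:
            # Hyperbolic area density of the Poincare disk metric.
            weight_sum += 4.0 / (1.0 - s2) ** 2
    area_estimate = box_area * weight_sum / n_samples
    counterterm = 2.0 * math.pi * math.sinh(rho_cut)
    return area_estimate - counterterm
```

For rho_cut = 3 the exact regularized value is 2π(cosh 3 − 1) − 2π sinh 3 = 2π e^(−3) − 2π ≈ −5.97, already close to the renormalized limit −2π ≈ −6.28; the Monte Carlo estimate scatters around it with variance that grows as the cutoff approaches the conformal boundary, which is the bias--variance trade-off mentioned in item (ii).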
Files
renormalized_volume_capacity.pdf (434.1 kB)
md5:9fdc938af1785c3f49250f9b875c4802
Additional details
Software
- Programming language
- Julia