
Published March 11, 2026 | Version 6.3.0

Dynamic Curvature Adaptation: A Unified Geometric Theory of Cortical State and Pathological Collapse

Description

Abstract

We propose the Curvature Adaptation Hypothesis (CAH): nervous systems optimize task-dependent information transport by dynamically unlocking the latent hyperbolic geometry inherent in their hierarchical structure. We identify a plausible biophysical actuator for this geometric switch: the Martinotti-cell subtype of somatostatin (SST) interneurons, which targets distal apical dendrites to regulate the apical-somatic conductance ratio (γ). Using Optimal Transport (OT) simulation and finite-size scaling analysis, we demonstrate that modulating γ drives the network through a sharp, non-linear phase transition from a stable Euclidean regime (κ ≈ 0) to a deep hyperbolic regime (κ < 0). Crucially, this transition is scale-invariant across network depths (N=3,5,7), reflecting the fractal nature of the underlying tree topology. We show that this geometric transition is remarkably robust to degree-preserving topological scrambling, indicating that the hyperbolic regime is driven by local synaptic availability rather than global architectural order. To validate the optimal-transport mathematics in biologically realistic wetware, we implement a Spiking Neural Network (SNN) simulation of the VIP-SST-Pyramidal microcircuit, demonstrating that VIP-mediated disinhibition natively induces the highly synchronized 40 Hz gamma-band pillars required to initiate the macroscopic hyperbolic plunge. Furthermore, we simulate two distinct pathological states: (1) Hyper-Integration, where the introduction of VIP-like hub nodes abolishes the flat regime, forcing the network into permanent hyperbolicity (analogous to the hyper-associative phenotypes observed in mania or psychosis); and (2) Geometric Collapse, where synaptic pruning (>30% spine loss) prevents the network from accessing deep negative curvature even under high coupling (analogous to the structural synaptic loss observed in neurodegenerative disorders such as Alzheimer's disease).
These results suggest that geometry is not a static anatomical feature but a dynamic functional state, actively tuned by the interplay of inhibitory interneurons. We provide a reproducible simulation protocol and a proposed optogenetic experiment in macaque PFC to make the Curvature Adaptation Hypothesis (CAH) empirically falsifiable.

 

Summary

This project introduces the Curvature Adaptation Hypothesis (CAH), a novel theoretical framework proposing that the mammalian cortex optimizes information transport by dynamically "warping" its functional manifold into hyperbolic regimes. We identify the Martinotti-cell subtype of Somatostatin (SST) interneurons—which target distal apical dendrites—as the biological actuator for this geometric switch, regulating the apical-somatic conductance ratio (γ) to trigger a non-linear phase transition into negative curvature (κ < 0).

Key Findings

  • Scale-Invariant Phase Transition: Using Optimal Transport (OT) simulations on stochastic Galton-Watson trees, we demonstrate that the transition from Euclidean to Hyperbolic geometry is a scale-invariant property of hierarchical networks.
  • Topological Robustness: Our results reveal that the capacity for hyperbolic depth is robust to degree-preserving topological scrambling, suggesting that local synaptic density, rather than global architectural perfection, is the primary driver of the geometric manifold.
  • Spiking Neural Network Validation: A PyNEST simulation of the VIP-SST-Pyramidal microcircuit establishes the dynamical biological framework for the geometric switch.
  • The Geometric Trilogy of Disease: We computationally model three distinct cognitive states:
    1. Healthy Regime: Tunable phase transition between flat and hyperbolic manifolds.
    2. Manic Regime: "Geometric Inelasticity" caused by high-centrality VIP-like hub nodes, forcing the system into permanent hyperbolicity.
    3. Neurodegenerative Regime: "Geometric Collapse" caused by synaptic pruning (>30% loss), preventing the manifold from accessing the depth required for complex hierarchical integration.
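The curvature quantity at the heart of these findings can be illustrated in miniature. The following is a hypothetical sketch (not the repository's code): it computes Ollivier-Ricci curvature κ(x, y) = 1 − W₁(μ_x, μ_y)/d(x, y) on a small binary tree, where μ_v is the uniform one-step random-walk measure at v. It assumes unit edge lengths and uses the closed form for W₁ on a tree (sum over edges of the mass imbalance across each edge); the actual simulations use NetworkX and POT on stochastic Galton-Watson trees.

```python
def w1_on_tree(edges, mu, nu):
    """Wasserstein-1 distance between two distributions on an unweighted tree.

    On a tree, W1 equals the sum over edges of |mass imbalance across
    the edge| (times edge length, here 1)."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)

    def side_mass(dist, root, blocked):
        # Total mass of `dist` in the component containing `root`
        # when the edge (root, blocked) is cut.
        seen, stack = {root, blocked}, [root]
        total = dist.get(root, 0.0)
        while stack:
            v = stack.pop()
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
                    total += dist.get(w, 0.0)
        return total

    return sum(abs(side_mass(mu, a, b) - side_mass(nu, a, b))
               for a, b in edges)

# Depth-2 binary tree: 0 -> (1, 2), 1 -> (3, 4), 2 -> (5, 6)
edges = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)]
adj = {}
for a, b in edges:
    adj.setdefault(a, []).append(b)
    adj.setdefault(b, []).append(a)

def walk_measure(v):
    # Uniform one-step simple-random-walk measure at node v.
    nbrs = adj[v]
    return {w: 1.0 / len(nbrs) for w in nbrs}

# Curvature of the interior edge (0, 1): negative, as expected for a
# branching (hyperbolic-like) topology; d(0, 1) = 1.
kappa = 1.0 - w1_on_tree(edges, walk_measure(0), walk_measure(1))
print(round(kappa, 4))  # -0.3333
```

Interior edges of the tree carry negative curvature while leaf edges do not, which is the local signature that the paper's γ-modulated transition amplifies into a network-wide hyperbolic regime.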

Metabolic Implications

We provide a biophysical energy ROI analysis demonstrating that the local maintenance "tax" required for SST-gating is offset by a global "signaling tax haven," where hyperbolic geodesics minimize the metabolic cost of hierarchical inference.
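As a back-of-envelope illustration of the scale involved (a hypothetical sketch, not the model in energy_ROI_tracker.py), Landauer's bound E = k_B T ln 2 gives the minimum energy per erased bit at cortical temperature; since hyperbolic geodesics shorten hierarchical routing paths, fewer relay hops means fewer bound-limited bit operations per inference.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, SI 2019)
T = 310.0           # K, approximate brain temperature

# Landauer minimum energy to erase one bit at temperature T.
e_bit = K_B * T * math.log(2)
print(f"{e_bit:.3e} J per bit")  # 2.967e-21 J per bit
```

Actual synaptic signaling costs are many orders of magnitude above this floor (cf. Attwell & Laughlin, 2001), so the ROI argument concerns relative path-length savings rather than operation at the thermodynamic limit.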

Repository Contents

The Python scripts are included here; they are also available at:
https://github.com/MPender08/dendritic-curvature-adaptation

Manuscript: Full pre-print detailing the mathematical derivation and biophysical mechanism.
Simulation Suite: Python-based implementation (NetworkX, POT) comprising the scripts listed below.

The script energy_ROI_tracker.py depends on the physics engine in run_CAH_scaling_analysis.py. Please ensure both files are downloaded to the same directory before running.

Note: NEST is required to run the PyNEST simulation.

 

pip install networkx numpy matplotlib pot tqdm joblib scipy

 

  • python run_CAH_scaling_analysis.py: Finite-size scaling and robustness tests.
  • python run_CAH_with_Hubs.py: Simulation of hyper-integrative/manic states.
  • python run_CAH_Pruning.py: Simulation of neurodegenerative collapse.
  • python energy_ROI_tracker.py: Metabolic expenditure modeling.
  • python biological_manifold.py: PyNEST SNN simulation.

Notes

Changelog 

Version 6.3.0 includes:

  1. Theoretical Bridging (Section 2.1): Added explicit biophysical justification following Equation 1, directly mapping the continuous cable theory of apical-somatic conductance (γ) to the discrete graph theory of informational random walks. Anchored with established Martinotti-shunting and apical-integration literature.

  2. Thermodynamic Grounding (Section 2.2): Fully integrated the physics of the Landauer Limit (E=k_B T ln(2)) into the core theoretical framework. 

  3. Expanded Literature Context (Introduction): Broadened the foundational bibliography to securely anchor the framework within mainstream network neuroscience (Bassett) and geometric deep learning (Nickel).

 

Files

dynamic_curvature_adaptation_v6.3.0.pdf

Total size: 2.2 MB

Additional details

Related works

Is supplement to
Preprint: 10.5281/zenodo.18913772 (DOI)
Is supplemented by
Software: https://github.com/MPender08/dendritic-curvature-adaptation (URL)
Preprint: 10.5281/zenodo.18655523 (DOI)
Preprint: 10.5281/zenodo.18717807 (DOI)
Preprint: 10.5281/zenodo.18627785 (DOI)

Software

Repository URL
https://github.com/MPender08/dendritic-curvature-adaptation
Programming language
Python
Development Status
Active

References

  • Attwell, D., & Laughlin, S. B. (2001). An energy budget for signaling in the grey matter of the brain. Journal of Cerebral Blood Flow & Metabolism, 21(10), 1133–1145.
  • Krioukov, D., Papadopoulos, F., Kitsak, M., Vahdat, A., & Boguñá, M. (2010). Hyperbolic geometry of complex networks. Physical Review E, 82(3), 036106.
  • Larkum, M. (2013). A cellular mechanism for cortical associations: an organizing principle for the cerebral cortex. Trends in Neurosciences, 36(3), 141–151.
  • Merino-Serrais, P., Benavides-Piccione, R., Kastanauskaite, A., & DeFelipe, J. (2021). Dendritic spines are lost in clusters in Alzheimer's disease. Scientific Reports, 11(1), 12349.
  • Ollivier, Y. (2009). Ricci curvature of Markov chains on metric spaces. Journal of Functional Analysis, 256(3), 810–864.
  • Pender, M. A. (2026). The Metabolic Phase Transition: Qualia as a Topological Solution to the Landauer Limit in High-Dimensional Manifolds. Zenodo. https://doi.org/10.5281/zenodo.18655523
  • Urban-Ciecko, J., & Barth, A. L. (2016). Somatostatin-expressing neurons in cortical networks. Nature Reviews Neuroscience, 17(7), 401–409.
  • Zhang, C.-L., Sontag, L., Gómez-Ocádiz, R., & Schmidt-Hieber, C. (2024). Learning-dependent gating of hippocampal inputs by frontal interneurons. Proceedings of the National Academy of Sciences, 121(45), e2403325121. https://doi.org/10.1073/pnas.2403325121
  • Beggs, J. M., & Plenz, D. (2003). Neuronal avalanches in neocortical circuits. Journal of Neuroscience, 23(35), 11167–11177.
  • Amari, S. (2016). Information Geometry and Its Applications (Vol. 194). Springer.
  • Murayama, M., Pérez-Garci, E., Nevian, T., Bock, T., Senn, W., & Larkum, M. E. (2009). Dendritic encoding of sensory stimuli controlled by deep cortical interneurons. Nature, 457(7233), 1137–1141.
  • Palmigiano, A., Geisel, T., Wolf, F., & Battaglia, D. (2017). Flexible information routing by transient synchrony. Nature Neuroscience, 20(7), 1014–1022.
  • Landauer, R. (1961). Irreversibility and heat generation in the computing process. IBM Journal of Research and Development, 5(3), 183–191.
  • Li, X. (2026). On the dynamics of observation and semantics. arXiv preprint arXiv:2602.18494.
  • Pender, M. A. (2026). The Manifold Chip: Silicon Architecture for Dynamic Curvature Adaptation via Dual-Gated Analog Shunting (3.1.0). Zenodo. https://doi.org/10.5281/zenodo.18779779
  • Pender, M. A. (2026). Geometry-Aware Plasticity: Thermodynamic Weight Updates in Non-Euclidean Hardware (1.0.1). Zenodo. https://doi.org/10.5281/zenodo.18762339
  • Flamary, R., Courty, N., Gramfort, A., Alaya, M. Z., Boisbunon, A., Chambon, S., Chapel, L., Corenflos, A., Fatras, K., Fournier, N., et al. (2021). POT: Python Optimal Transport. Journal of Machine Learning Research, 22(78), 1–8.
  • Gewaltig, M.-O., & Diesmann, M. (2007). NEST (NEural Simulation Tool). Scholarpedia, 2(4), 1430.
  • Hagberg, A. A., Schult, D. A., & Swart, P. J. (2008). Exploring network structure, dynamics, and function using NetworkX. In Proceedings of the 7th Python in Science Conference (pp. 11–15). Pasadena, CA, USA.
  • Bassett, D. S., & Sporns, O. (2017). Network neuroscience. Nature Neuroscience, 20(3), 353–364.
  • Nickel, M., & Kiela, D. (2017). Poincaré embeddings for learning hierarchical representations. Advances in Neural Information Processing Systems, 30.