Published February 3, 2026 | Version v1
Preprint · Open Access

Geometric Learning Dynamics in Gauge-Regularized Neural Networks: An Experimental Study

  • 1. AHI Governance
  • 2. Sovereign Symbiosis Foundation

Description


We present an experimental investigation of **gauge-regularized neural networks** where **synthetic Ricci curvature** and a **dynamically estimated mass gap** are jointly monitored during training. Using a three-dimensional lattice with 125 nodes (5^3), we observe:

1. **Stable convergence** of the task loss from 0.692 to 0.011 over 500 optimization steps.  
2. **Clear separation** in the Hessian spectrum between gauge modes (3 eigenvalues λ ≈ 0) and physical modes, whose smallest eigenvalue λ_min ranges from ≈ 1.5 to ≈ 24 over training.  
3. **Non-monotonic evolution** of the physical mass gap that exhibits a strong positive correlation (r ≈ 0.78, p < 0.05) with the optimization dynamics.  
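The gauge/physical split in item 2 and the mass gap in item 3 can be illustrated with a short sketch: diagonalize a (toy) Hessian, treat near-zero eigenvalues as gauge modes, and read the mass gap off the smallest remaining eigenvalue. This is an illustrative proxy under our own assumptions, not the authors' exact estimator; the toy Hessian below is built with a deliberate 3-dimensional null space to mimic the 3 reported gauge modes.

```python
import numpy as np

def mass_gap_from_hessian(H, tol=1e-8):
    """Split a symmetric Hessian's spectrum into near-zero 'gauge' modes
    and 'physical' modes; the mass gap is the smallest physical eigenvalue.
    (Illustrative proxy, not the paper's exact estimator.)"""
    eigvals = np.linalg.eigvalsh(H)            # real, ascending (H symmetric)
    gauge = eigvals[np.abs(eigvals) < tol]     # flat directions (gauge freedom)
    physical = eigvals[np.abs(eigvals) >= tol]
    return len(gauge), physical[0]             # (# gauge modes, mass gap)

# Toy Hessian: rank-7 PSD matrix of size 10, i.e. a 3-dimensional null space,
# mimicking the 3 gauge modes reported in the spectrum.
rng = np.random.default_rng(0)
A = rng.normal(size=(10, 7))
H = A @ A.T                                    # rank 7 -> 3 zero eigenvalues

n_gauge, gap = mass_gap_from_hessian(H)
print(n_gauge, gap > 0)
```

In practice the tolerance `tol` must be chosen against the numerical noise floor of the eigensolver; the 3-mode count here is fixed by construction, whereas in a trained network it would reflect the dimension of the gauge orbit.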

The results suggest a **causal structure** in which curvature controls the learning regime, while the loss drives the effective rigidity of the parameter landscape. This work provides quantitative evidence that **geometric structures inspired by Yang–Mills theory** can emerge in discrete neural architectures, opening avenues for practical applications in:

- Anomaly detection  
- Training diagnostics  
- Integration with large language models  
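The paper does not specify which discrete curvature notion is used; as one common proxy, a combinatorial Forman-Ricci curvature on the 5³ periodic lattice can be sketched as follows (the edge formula 4 − deg(u) − deg(v) is the unweighted Forman curvature; the lattice construction and function names are our own illustration):

```python
import itertools

def lattice_edges(n):
    """Edges of an n^3 periodic cubic lattice (each node has 6 neighbours)."""
    edges = set()
    for x, y, z in itertools.product(range(n), repeat=3):
        for dx, dy, dz in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]:
            u = (x, y, z)
            v = ((x + dx) % n, (y + dy) % n, (z + dz) % n)
            edges.add(tuple(sorted((u, v))))
    return edges

def forman_curvature(edges):
    """Combinatorial Forman-Ricci curvature per edge: 4 - deg(u) - deg(v)."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return {(u, v): 4 - deg[u] - deg[v] for u, v in edges}

edges = lattice_edges(5)                 # 125 nodes, 375 edges
curv = forman_curvature(edges)
print(len(edges), set(curv.values()))    # uniform curvature on the bare lattice
```

On the bare periodic lattice every node has degree 6, so the curvature is uniformly −8; the interesting signal in a gauge-regularized network would come from edge weights learned during training, which the weighted Forman (or Ollivier) variants can incorporate.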

**Keywords:** gauge neural networks, Ricci curvature, mass gap, geometric deep learning, lattice regularization, Hessian spectrum

Files

Swarm.pdf (219.0 kB)
md5:8af84cf68230953a5a886c947b6b144b