Published January 24, 2026 | Version v1
Peer review: Open

Emergent Hybrid Computation in Gradient-Free Evolutionary Networks

Description

This paper analyzes the representational structures that emerge in neural networks trained by evolutionary selection rather than gradient-based optimization. While standard deep learning avoids saturation because of the vanishing gradient problem, gradient-free evolution shows that saturation can serve as a functional mechanism for discovering hybrid digital-analog representations. We formalize this as a partitioned state space in which k saturated neurons establish discrete operational modes while the remaining n−k continuous neurons provide fine-grained modulation. Through systematic experimentation across 13 configurations, we empirically validate that saturation emerges when networks must selectively attend to a subset of the available inputs. Our results demonstrate that evolution dynamically allocates k according to task demands, yielding k=0 for clean continuous tasks in which all inputs are relevant, and k→n when selective filtering becomes necessary.
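The k-versus-(n−k) partition described above can be sketched in a few lines. The snippet below is an illustrative reading of the abstract, not code from the linked repository: the function name `hybrid_partition` and the saturation threshold of 3.0 are assumptions chosen for clarity (tanh(3) ≈ 0.995, so units beyond that point behave as near-binary switches).

```python
import numpy as np

def hybrid_partition(pre_activations, sat_thresh=3.0):
    """Split tanh units into saturated (quasi-digital) and continuous roles.

    A tanh unit with |pre-activation| > sat_thresh sits on a near-flat part
    of the curve, so selection can use it as a binary switch; the remaining
    units stay in the continuous, fine-modulation regime.  The threshold is
    an illustrative choice, not a value taken from the paper.
    """
    pre = np.asarray(pre_activations, dtype=float)
    saturated = np.abs(pre) > sat_thresh
    k = int(saturated.sum())
    modes = 2 ** k  # k binary switches index 2^k discrete operational modes
    switch_states = np.sign(pre[saturated])   # quasi-digital +/-1 states
    continuous = np.tanh(pre[~saturated])     # analog fine-grained values
    return k, modes, switch_states, continuous

# Example: two strongly driven units act as switches, two stay analog.
k, modes, switches, cont = hybrid_partition([5.0, -4.2, 0.3, 1.1])
# k = 2 saturated units -> 2^2 = 4 discrete modes, modulated by 2 analog units
```

Under this reading, evolution's allocation of k amounts to selecting weight magnitudes that push more (or fewer) pre-activations past the saturation knee, which a mutation-and-selection loop can do freely since no gradient needs to flow through the flat region.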

Files (238.3 kB)

Emergent_Hybrid_Computation_Validated.pdf (238.3 kB, md5:97d46a22c0e1af1c1507a7b0aa4b2b80)

Additional details

Software

Repository URL
https://github.com/A1CST/GENREG-sine/tree/main
Programming language
Python
Development Status
Active