V4 Grow-Only Architecture: Eliminating Catastrophic Forgetting Through Additive Neural Growth
Description
Catastrophic forgetting remains a fundamental challenge in neural network training: learning new information degrades performance on previously learned tasks. We present V4 of a grow-only architecture that eliminates catastrophic forgetting through strictly additive neural growth: neurons are added but never removed or destructively modified. Combined with minimal LoRA adaptation (r=16, 2 modules, 2.18M parameters), our approach achieves 76% accuracy on ARC-Easy (vs. a 32.5% baseline), 49.5% on ARC-Challenge, and 35.5% on HellaSwag. Crucially, V4 maintains 100% prior-task performance during domain shifts, compared to 8.3% degradation in V3 architectures. Over 97K training steps, the model grew by only 504 neurons (+0.27% parameters), demonstrating that effective continual learning requires minimal architectural overhead. Our results suggest that the forgetting problem in neural networks can be addressed through growth constraints rather than complex replay mechanisms or regularization schemes.
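The core constraint described above, that capacity is only ever appended and existing neurons are never modified, can be sketched as follows. This is a minimal illustrative example using NumPy, not the paper's actual implementation; the class name, `grow` method, and initialization scheme are all hypothetical.

```python
import numpy as np

class GrowOnlyLayer:
    """Minimal sketch of a strictly additive ("grow-only") linear layer.

    Existing weight rows are frozen: they are never removed or
    overwritten. New capacity is added only by appending output
    neurons (rows). All names here are hypothetical, not the
    paper's API.
    """

    def __init__(self, in_features, out_features, seed=0):
        self.rng = np.random.default_rng(seed)
        # One row per output neuron.
        self.w = 0.01 * self.rng.standard_normal((out_features, in_features))

    def grow(self, n_new):
        """Append n_new neurons; prior rows are untouched."""
        new_rows = 0.01 * self.rng.standard_normal((n_new, self.w.shape[1]))
        self.w = np.vstack([self.w, new_rows])

    def forward(self, x):
        return self.w @ x

# Growing adds rows without changing the old ones.
layer = GrowOnlyLayer(in_features=4, out_features=8)
before = layer.w.copy()
layer.grow(2)                                # 8 -> 10 neurons
assert layer.w.shape == (10, 4)
assert np.array_equal(layer.w[:8], before)   # prior neurons unchanged
```

Because old neurons' weights are immutable, any function the network computed before a growth step is still computable afterward, which is the mechanism by which this kind of design avoids destructive interference.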
Files
- PAPER_1_NEURAL_GROWTH.md (21.6 kB), md5:8eeaee4ad6584fbe0a1fb1676b91b6ee
Additional details
Related works
- Cites: https://doi.org/10.5281/zenodo.17970543 (Publication)
Dates
- Other: 2026-02-03
Software
- Repository URL: https://github.com/spartan8806
- Development Status: Active