Synaptic turnover promotes efficient learning in bio-realistic spiking neural networks
Authors/Creators
- Foundation for Research and Technology-Hellas
Description
While artificial machine learning systems achieve superhuman performance in tasks such as language processing and image/video recognition, they do so using extremely large datasets and consume huge amounts of power. On the other hand, the brain remains superior in several cognitively challenging tasks while operating with the energy of a small lightbulb. To explore how biology achieves such high efficiency, we built a biologically constrained spiking neural network model and assessed its learning capacity on various discrimination tasks. We found that a form of structural plasticity, namely the ability of synapses to form and eliminate continuously, results in higher performance accuracy and faster learning across all scenarios tested. These improvements are most significant under more difficult learning conditions, such as when the number of trainable parameters is halved or when the task difficulty is increased. The study highlights the important role of structural plasticity in optimizing learning in biological circuits and opens new avenues for exploring the applicability of such biological plasticity rules in machine learning applications.
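As a loose illustration (not the authors' actual model), the turnover rule described above — continual elimination of weak synapses and formation of new ones under a fixed synapse budget — can be sketched in plain Python. All names, parameters, and the pruning criterion here are hypothetical simplifications:

```python
import random

random.seed(0)

# Hypothetical network dimensions and a fixed synapse budget:
# turnover rewires connections but conserves the total count.
n_pre, n_post = 20, 10
n_syn = 50

all_sites = [(i, j) for i in range(n_pre) for j in range(n_post)]
# Sparse connectivity: only a subset of possible sites carry a weight.
weights = {s: random.gauss(0.5, 0.1) for s in random.sample(all_sites, n_syn)}

def turnover_step(weights, prune_frac=0.2):
    """Eliminate the weakest synapses, then form an equal number
    of new synapses at randomly chosen silent sites."""
    n_prune = int(prune_frac * len(weights))
    weakest = sorted(weights, key=lambda s: abs(weights[s]))[:n_prune]
    for s in weakest:
        del weights[s]
    silent = [s for s in all_sites if s not in weights]
    for s in random.sample(silent, n_prune):
        weights[s] = random.gauss(0.5, 0.1)
    return weights

weights = turnover_step(weights)
assert len(weights) == n_syn  # synapse count is conserved across turnover
```

In this toy version the pruning criterion is simply absolute weight magnitude; the biologically constrained model in the paper operates on spiking dynamics and plasticity rules far richer than this sketch.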
Files
| Name | Size |
|---|---|
| Synaptic_turnover_promotes_efficient_learning_in_bio-realistic_spiking_neural_networks.pdf (md5:7a1275e1d5ed79d4d753c95976c71f0d) | 6.6 MB |
Additional details
Funding
- European Commission
  - NEUREKA - A smart, hybrid neural-computo device for drug discovery (grant 863245)
- National Institutes of Health
  - Experimental and modeling investigations into microcircuit, cellular and subcellular determinants of hippocampal ensemble recruitment to contextual representations (grant 1R01MH124867-01)