Synaptic turnover promotes efficient learning in bio-realistic spiking neural networks
Authors/Creators
- Foundation for Research and Technology-Hellas
Description
While artificial machine learning systems achieve superhuman performance on specific tasks such as language processing and image and video recognition, they do so using extremely large datasets and huge amounts of power. The brain, by contrast, remains superior in several cognitively challenging tasks while operating on the energy budget of a small lightbulb. We use a biologically constrained spiking neural network model to explore how neural tissue achieves such high efficiency and assess its learning capacity on discrimination tasks. We find that synaptic turnover, a form of structural plasticity whereby the brain continuously forms and eliminates synapses, increases both the learning speed and the performance of our network on all tasks tested. Moreover, it enables accurate learning from fewer examples. Importantly, these improvements are most pronounced under conditions of resource scarcity, such as when the number of trainable parameters is halved or when task difficulty is increased. Our findings provide new insights into the mechanisms that underlie efficient learning in the brain and can inspire the development of more efficient and flexible machine learning algorithms.
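To make the idea of synaptic turnover concrete, the sketch below implements one common formulation: at fixed intervals, the weakest fraction of existing synapses is pruned and an equal number of new synapses is grown at random unconnected sites, so the total synapse count (the parameter budget) stays constant. This is an illustrative toy in NumPy, not the paper's model; the function name, the magnitude-based pruning criterion, and the turnover fraction are all assumptions.

```python
import numpy as np


def synaptic_turnover(weights, mask, turnover_frac=0.1, rng=None):
    """Prune the weakest active synapses and regrow the same number
    at random inactive sites, keeping the synapse count constant.

    weights : dense weight matrix (pre x post)
    mask    : boolean connectivity matrix of the same shape
    Illustrative sketch only; the magnitude-based pruning rule is an
    assumption, not the paper's exact turnover mechanism.
    """
    rng = rng or np.random.default_rng()
    w_flat = weights.ravel().copy()
    m_flat = mask.ravel().copy()

    active = np.flatnonzero(m_flat)
    n_turn = max(1, int(turnover_frac * active.size))

    # Prune: remove the n_turn existing synapses with smallest |weight|.
    weakest = active[np.argsort(np.abs(w_flat[active]))[:n_turn]]
    m_flat[weakest] = False
    w_flat[weakest] = 0.0

    # Regrow: add n_turn synapses at random currently-unconnected sites,
    # initialized with small random weights.
    inactive = np.flatnonzero(~m_flat)
    regrow = rng.choice(inactive, size=n_turn, replace=False)
    m_flat[regrow] = True
    w_flat[regrow] = rng.normal(0.0, 0.1, size=n_turn)

    return w_flat.reshape(weights.shape), m_flat.reshape(mask.shape)


# Toy usage: a sparse 20x20 connectivity matrix at ~20% density.
rng = np.random.default_rng(0)
mask = rng.random((20, 20)) < 0.2
weights = rng.normal(size=(20, 20)) * mask
new_w, new_mask = synaptic_turnover(weights, mask, turnover_frac=0.1, rng=rng)
```

Because pruning and regrowth are balanced, the rewiring explores new connectivity patterns without changing the number of trainable parameters, which is consistent with the resource-scarcity setting described above.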
Files

| Name | Size |
|---|---|
| Malakasis_et_al_2023_version1.pdf (md5:a309aadcb3b74e1aff3d7f0d71882c2f) | 1.6 MB |
Additional details
Funding
- European Commission: NEUREKA, "A smart, hybrid neural-computo device for drug discovery" (grant 863245)
- National Institutes of Health: "Experimental and modeling investigations into microcircuit, cellular and subcellular determinants of hippocampal ensemble recruitment to contextual representations" (grant 1R01MH124867-01)