Published April 1, 2024 | Version v2
Conference paper | Open Access

Synaptic turnover promotes efficient learning in bio-realistic spiking neural networks

  • Foundation for Research and Technology-Hellas

Description

While artificial machine learning systems achieve superhuman performance on tasks such as language processing and image/video recognition, they rely on extremely large datasets and consume huge amounts of power. In contrast, the brain remains superior in several cognitively challenging tasks while operating on the energy of a small lightbulb. To explore how biology achieves such efficiency, we built a biologically constrained spiking neural network model and assessed its learning capacity on various discrimination tasks. We found that a form of structural plasticity, namely the ability of synapses to form and be eliminated continuously, results in higher performance accuracy and faster learning across all scenarios tested. These improvements are most pronounced under difficult learning conditions, such as when the number of trainable parameters is halved or the task difficulty is increased. The study highlights the important role of structural plasticity in optimizing learning in biological circuits and opens new avenues for exploring the applicability of such biological plasticity rules in machine learning applications.
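The synaptic turnover described above (continuous formation and elimination of synapses) is often modeled in simulation as periodic rewiring: pruning the weakest synapses and regrowing the same number at random unoccupied sites. The sketch below illustrates that general idea only; it is not the paper's implementation, and all names, sizes, and rates (`N_PRE`, `DENSITY`, `TURNOVER_FRAC`, the weight initialization) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PRE, N_POST = 100, 50      # hypothetical layer sizes
DENSITY = 0.2                # fraction of possible synapses that exist
TURNOVER_FRAC = 0.05         # fraction of synapses replaced per rewiring step

# Sparse connectivity: a binary mask of existing synapses plus their weights.
n_syn = int(DENSITY * N_PRE * N_POST)
mask = np.zeros(N_PRE * N_POST, dtype=bool)
mask[rng.choice(mask.size, n_syn, replace=False)] = True
mask = mask.reshape(N_PRE, N_POST)
weights = np.where(mask, rng.uniform(0.0, 1.0, (N_PRE, N_POST)), 0.0)

def turnover_step(weights, mask, frac=TURNOVER_FRAC, rng=rng):
    """Prune the weakest synapses and regrow the same number at random
    empty sites, keeping the total synapse count constant."""
    w, m = weights.copy(), mask.copy()
    n_replace = int(frac * m.sum())
    # Prune: eliminate the n_replace existing synapses with the smallest weights.
    existing = np.flatnonzero(m)
    weakest = existing[np.argsort(w.ravel()[existing])[:n_replace]]
    m.ravel()[weakest] = False
    w.ravel()[weakest] = 0.0
    # Regrow: form n_replace new synapses at random empty sites, initialized
    # with small weights so subsequent learning can strengthen or discard them.
    empty = np.flatnonzero(~m)
    newborn = rng.choice(empty, n_replace, replace=False)
    m.ravel()[newborn] = True
    w.ravel()[newborn] = rng.uniform(0.0, 0.1, n_replace)
    return w, m

weights, mask = turnover_step(weights, mask)
assert mask.sum() == n_syn  # synapse count is conserved across rewiring
```

In such a scheme the rewiring step would typically be interleaved with ordinary weight updates during training, so turnover reallocates a fixed synaptic budget toward useful connections rather than adding parameters.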

Files

Synaptic_turnover_promotes_efficient_learning_in_bio-realistic_spiking_neural_networks.pdf

Additional details

Funding

European Commission
NEUREKA - A smart, hybrid neural-computo device for drug discovery 863245
National Institutes of Health
Experimental and modeling investigations into microcircuit, cellular and subcellular determinants of hippocampal ensemble recruitment to contextual representations 1R01MH124867-01

Software

Programming language
C++, Python