Conference paper Open Access

Training Convolutional Neural Networks with Competitive Hebbian Learning Approaches

Gabriele Lagani; Fabrizio Falchi; Claudio Gennaro; Giuseppe Amato

We explore competitive Hebbian learning strategies for training feature detectors in Convolutional Neural Networks (CNNs) without supervision. We consider variants of the Winner-Takes-All (WTA) strategy explored in previous works, i.e. k-WTA, e-soft-WTA and p-soft-WTA, performing experiments on different object recognition datasets. The results suggest that Hebbian approaches are effective for training early feature extraction layers, and for re-training the higher layers of a pre-trained network, with soft competition generally outperforming the other Hebbian approaches explored in this work. Our findings encourage a path of cooperation between neuroscience and computer science towards a deeper investigation of biologically inspired learning principles.
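To make the idea of competitive Hebbian learning concrete, the following is a minimal sketch of hard-WTA and soft-WTA update rules for a layer of linear feature detectors. This is an illustrative implementation under common textbook assumptions, not the authors' code: the function names, the learning rate, and the use of a softmax with a temperature parameter for the soft competition are choices made here for illustration.

```python
import numpy as np

def hard_wta_step(W, x, lr=0.01):
    # Hard WTA: only the most responsive unit updates its weights,
    # moving its weight vector toward the input (Hebbian rule with decay).
    winner = np.argmax(W @ x)
    W[winner] += lr * (x - W[winner])
    return W

def soft_wta_step(W, x, lr=0.01, temp=1.0):
    # Soft WTA: every unit updates, weighted by a softmax over the
    # activations, so stronger responders learn more but losers still
    # move slightly toward the input.
    y = W @ x
    r = np.exp((y - y.max()) / temp)  # numerically stable softmax
    r /= r.sum()
    W += lr * r[:, None] * (x - W)
    return W

# usage: 4 feature detectors over 3-dimensional inputs
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
for _ in range(200):
    x = rng.normal(size=3)
    W = soft_wta_step(W, x)
```

With repeated updates, each row of W drifts toward a cluster of inputs it wins (or partially wins) on, which is the sense in which competition trains feature detectors without supervision.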

In Nicosia G. et al. (eds) Machine Learning, Optimization, and Data Science. LOD 2021. Lecture Notes in Computer Science, vol 13163. Springer, Cham.

