Conference paper Open Access

Training Convolutional Neural Networks with Competitive Hebbian Learning Approaches

Gabriele Lagani; Fabrizio Falchi; Claudio Gennaro; Giuseppe Amato

We explore competitive Hebbian learning strategies for training feature detectors in Convolutional Neural Networks (CNNs) without supervision. We consider variants of the Winner-Takes-All (WTA) strategy explored in previous works, i.e. k-WTA, e-soft-WTA and p-soft-WTA, performing experiments on different object recognition datasets. Results suggest that the Hebbian approaches are effective for training early feature extraction layers, or for re-training higher layers of a pre-trained network, with soft competition generally outperforming the other Hebbian approaches explored in this work. Our findings encourage a path of cooperation between neuroscience and computer science towards a deeper investigation of biologically inspired learning principles.
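As a minimal sketch of the competitive Hebbian ideas named in the abstract: in k-WTA, only the k most active units update their weights toward the input, while in soft-WTA every unit updates, scaled by a softmax over activations. The code below is illustrative only, assuming a simple single-layer setting with normalized weight vectors; it is not the authors' exact formulation (function names, the learning-rate and temperature parameters are placeholders).

```python
import numpy as np

def kwta_hebbian_update(W, x, lr=0.01, k=1):
    """One k-WTA Hebbian step: only the k most active units
    move their weight vectors toward the input x.
    Illustrative sketch, not the paper's exact rule."""
    y = W @ x                          # unit activations
    winners = np.argsort(y)[-k:]       # indices of the k strongest units
    for i in winners:
        W[i] += lr * (x - W[i])        # pull winner weights toward x
        W[i] /= np.linalg.norm(W[i])   # keep weight vectors unit-norm
    return W

def soft_wta_hebbian_update(W, x, lr=0.01, temperature=0.1):
    """Soft-WTA variant: all units learn, each scaled by a
    softmax competition coefficient over activations."""
    y = W @ x
    r = np.exp((y - y.max()) / temperature)
    r /= r.sum()                       # soft competition coefficients
    W += lr * r[:, None] * (x - W)     # every unit moves, winners most
    W /= np.linalg.norm(W, axis=1, keepdims=True)
    return W
```

Both updates are local and unsupervised: each unit's change depends only on its own activation and the input, which is what makes these rules candidates for training early CNN layers without labels.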

In: Nicosia, G., et al. (eds.) Machine Learning, Optimization, and Data Science. LOD 2021. Lecture Notes in Computer Science, vol. 13163. Springer, Cham. https://doi.org/10.1007/978-3-030-95467-3_2
Files (519.7 kB)
ACAIN2021_paper_20.pdf — 519.7 kB (md5:0e518291c88a6e2eb41f885e3885ba3b)
