Conference paper Open Access

Training Convolutional Neural Networks with Competitive Hebbian Learning Approaches

Gabriele Lagani; Fabrizio Falchi; Claudio Gennaro; Giuseppe Amato

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Gabriele Lagani</dc:creator>
  <dc:creator>Fabrizio Falchi</dc:creator>
  <dc:creator>Claudio Gennaro</dc:creator>
  <dc:creator>Giuseppe Amato</dc:creator>
  <dc:description>We explore competitive Hebbian learning strategies for training feature detectors in Convolutional Neural Networks (CNNs) without supervision. We consider variants of the Winner-Takes-All (WTA) strategy explored in previous works, i.e. k-WTA, e-soft-WTA and p-soft-WTA, performing experiments on different object recognition datasets. Results suggest that the Hebbian approaches are effective for training early feature extraction layers, or for re-training higher layers of a pre-trained network, with soft competition generally outperforming the other Hebbian approaches explored in this work. Our findings encourage a path of cooperation between neuroscience and computer science towards a deeper investigation of biologically inspired learning principles.</dc:description>
  <dc:description>In Nicosia G. et al. (eds) Machine Learning, Optimization, and Data Science. LOD 2021. Lecture Notes in Computer Science, vol 13163. Springer, Cham.</dc:description>
  <dc:relation>info:eu-repo/grantAgreement/EC/Horizon 2020 Framework Programme - Research and Innovation action/951911/</dc:relation>
  <dc:subject>Neural networks</dc:subject>
  <dc:subject>Machine learning</dc:subject>
  <dc:subject>Hebbian learning</dc:subject>
  <dc:subject>Competitive learning</dc:subject>
  <dc:subject>Computer vision</dc:subject>
  <dc:subject>Biologically inspired</dc:subject>
  <dc:title>Training Convolutional Neural Networks with Competitive Hebbian Learning Approaches</dc:title>
</oai_dc:dc>
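The abstract above refers to Winner-Takes-All (WTA) competitive Hebbian learning, in which only the most active units update their weights toward the current input. A minimal sketch of the general k-WTA idea (an instar-style update on a linear feature layer) is given below; the function name, learning rate, and normalization step are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def wta_hebbian_step(W, x, lr=0.01, k=1):
    """One k-WTA Hebbian update for a linear feature layer.

    W: (n_units, n_inputs) weight matrix; x: (n_inputs,) input vector.
    Only the k most active units update, moving their weights toward x.
    NOTE: a hedged sketch of the general k-WTA idea, not the paper's method.
    """
    y = W @ x                                 # unit activations
    winners = np.argsort(y)[-k:]              # indices of the k strongest units
    for i in winners:
        W[i] += lr * (x - W[i])               # instar rule: pull weights toward input
        W[i] /= np.linalg.norm(W[i]) + 1e-12  # keep winner weights bounded
    return W

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))   # 8 feature detectors over 16-dim inputs
x = rng.normal(size=16)        # one input sample
W = wta_hebbian_step(W, x, k=2)
```

The soft-WTA variants mentioned in the abstract replace the hard winner selection with a graded (e.g. softmax-weighted) update across all units, so that competition is soft rather than all-or-nothing.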
