Conference paper Open Access

Training Convolutional Neural Networks with Competitive Hebbian Learning Approaches

Gabriele Lagani; Fabrizio Falchi; Claudio Gennaro; Giuseppe Amato


Citation Style Language JSON Export

{
  "DOI": "10.1007/978-3-030-95467-3_2", 
  "language": "eng", 
  "title": "Training Convolutional Neural Networks with Competitive Hebbian Learning Approaches", 
  "issued": {
    "date-parts": [
      [
        2022, 
        2, 
        2
      ]
    ]
  }, 
  "abstract": "We explore competitive Hebbian learning strategies to train feature detectors in Convolutional Neural Networks (CNNs), without supervision. We consider variants of the Winner-Takes-All (WTA) strategy explored in previous works, i.e. k-WTA, e-soft-WTA and p-soft-WTA, performing experiments on different object recognition datasets. Results suggest that the Hebbian approaches are effective to train early feature extraction layers, or to re-train higher layers of a pre-trained network, with soft competition generally performing better than other Hebbian approaches explored in this work. Our findings encourage a path of cooperation between neuroscience and computer science towards a deeper investigation of biologically inspired learning principles.", 
  "author": [
    {
      "family": "Lagani", 
      "given": "Gabriele"
    }, 
    {
      "family": "Falchi", 
      "given": "Fabrizio"
    }, 
    {
      "family": "Gennaro", 
      "given": "Claudio"
    }, 
    {
      "family": "Amato", 
      "given": "Giuseppe"
    }
  ], 
  "note": "In Nicosia G. et al. (eds) Machine Learning, Optimization, and Data Science. LOD 2021. Lecture Notes in Computer Science, vol 13163. Springer, Cham. https://doi.org/10.1007/978-3-030-95467-3_2", 
  "type": "paper-conference", 
  "id": "6367135"
}
Views 27
Downloads 30
Data volume 15.6 MB
Unique views 21
Unique downloads 30