Published February 2, 2022 | Version v1
Conference paper · Open Access

Evaluating Hebbian Learning in a Semi-supervised Setting

  • 1. University of Pisa
  • 2. CNR-ISTI


We propose a semi-supervised learning strategy for deep Convolutional Neural Networks (CNNs) in which an unsupervised pre-training stage, performed with biologically inspired Hebbian learning algorithms, is followed by supervised end-to-end backprop fine-tuning. We explore two Hebbian learning rules for the unsupervised pre-training stage: soft-Winner-Takes-All (soft-WTA) and nonlinear Hebbian Principal Component Analysis (HPCA). The approach is applied in sample-efficiency scenarios, where the number of available labeled training samples is very limited and unsupervised pre-training is therefore beneficial. We perform experiments on the CIFAR10, CIFAR100, and Tiny ImageNet datasets. Our results show that Hebbian pre-training outperforms Variational Auto-Encoder (VAE) pre-training in almost all cases, with HPCA generally performing better than soft-WTA.
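The two learning rules named in the abstract can be illustrated with a minimal NumPy sketch. This is an illustration of the general rule families (soft-WTA competition and a Sanger-style nonlinear Hebbian PCA step), not the paper's exact formulation; the function names, learning rate, and temperature parameter are assumptions for the example.

```python
import numpy as np

def soft_wta_update(W, x, lr=0.01, temperature=1.0):
    """One soft-Winner-Takes-All Hebbian step (illustrative sketch).

    W : (n_units, n_inputs) weight matrix, one row per unit
    x : (n_inputs,) input sample
    Each unit's weights move toward the input, scaled by a softmax over
    the units' responses, so the best-matching unit learns the most.
    """
    y = W @ x                                  # unit activations
    r = np.exp(y / temperature)
    r /= r.sum()                               # soft competition (softmax)
    W += lr * r[:, None] * (x[None, :] - W)    # pull winners toward x
    return W

def hpca_update(W, x, lr=0.01):
    """One nonlinear Hebbian PCA step, in the spirit of Sanger's rule.

    Unit i learns from the residual of the input after subtracting the
    reconstructions y_j * w_j of all preceding units j <= i.
    """
    y = np.tanh(W @ x)                         # nonlinear activations
    L = np.tril(np.ones((W.shape[0], W.shape[0])))
    recon = L @ (y[:, None] * W)               # row i: sum_{j<=i} y_j w_j
    W += lr * y[:, None] * (x[None, :] - recon)
    return W
```

In the semi-supervised strategy described above, updates of this kind would train the convolutional layers without labels, after which the network is fine-tuned end-to-end with backprop on the few labeled samples.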


In: Nicosia G. et al. (eds) Machine Learning, Optimization, and Data Science. LOD 2021. Lecture Notes in Computer Science, vol 13163. Springer, Cham.



Files (433.6 kB)


Additional details


Funding:

  • AI4Media – A European Excellence Centre for Media, Society and Democracy (European Commission, grant 951911)
  • AI4EU – A European AI On Demand Platform and Ecosystem (European Commission, grant 825619)