Published February 2, 2022 | Version v1
Conference paper (Open Access)

Evaluating Hebbian Learning in a Semi-supervised Setting

  • 1. University of Pisa
  • 2. CNR-ISTI

Description

We propose a semi-supervised learning strategy for deep Convolutional Neural Networks (CNNs) in which an unsupervised pre-training stage, performed using biologically inspired Hebbian learning algorithms, is followed by supervised end-to-end backprop fine-tuning. We explored two Hebbian learning rules for the unsupervised pre-training stage: soft-Winner-Takes-All (soft-WTA) and nonlinear Hebbian Principal Component Analysis (HPCA). Our approach was applied in sample-efficiency scenarios, where the amount of available labeled training samples is very limited and unsupervised pre-training is therefore beneficial. We performed experiments on the CIFAR10, CIFAR100, and Tiny ImageNet datasets. Our results show that Hebbian pre-training outperforms Variational Auto-Encoder (VAE) pre-training in almost all cases, with HPCA generally performing better than soft-WTA.
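To illustrate the kind of update involved, the following is a minimal, hypothetical sketch of a soft-WTA Hebbian step (not the paper's implementation): each unit's weight vector moves toward the input, scaled by a softmax over the units' responses, so units compete softly for the input. The function name, temperature parameter, and learning rate are illustrative assumptions.

```python
import numpy as np

def soft_wta_update(W, x, lr=0.01, temperature=1.0):
    """One soft-WTA Hebbian update (illustrative sketch, not the paper's code).

    W: (n_units, n_features) weight matrix; x: (n_features,) input vector.
    Each unit's weights move toward x, weighted by a softmax over the
    units' responses, implementing soft competition among units.
    """
    responses = W @ x                                      # unit activations
    scores = np.exp((responses - responses.max()) / temperature)
    r = scores / scores.sum()                              # soft winner weights
    return W + lr * r[:, None] * (x[None, :] - W)          # Hebbian pull toward x

# Usage: repeated updates pull the most responsive unit toward the input.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
x = rng.normal(size=8)
W_new = soft_wta_update(W, x)
```

Because the update is purely local (it uses only the input and the layer's own responses), it requires no labels or backpropagated gradients, which is what makes it usable as an unsupervised pre-training stage.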

Notes

In: Nicosia G. et al. (eds) Machine Learning, Optimization, and Data Science. LOD 2021. Lecture Notes in Computer Science, vol 13163. Springer, Cham. https://doi.org/10.1007/978-3-030-95467-3_2

Files

LOD_2021_paper_152.pdf (433.6 kB)
md5:af1fb588a871a8677e7eb624703ebf24

Additional details

Funding

  • AI4Media – A European Excellence Centre for Media, Society and Democracy (Grant 951911), European Commission
  • AI4EU – A European AI On Demand Platform and Ecosystem (Grant 825619), European Commission