Evaluating Hebbian Learning in a Semi-supervised Setting
- 1. University of Pisa
- 2. CNR-ISTI
Description
We propose a semi-supervised learning strategy for deep Convolutional Neural Networks (CNNs) in which an unsupervised pre-training stage, performed using biologically inspired Hebbian learning algorithms, is followed by supervised end-to-end backprop fine-tuning. We explore two Hebbian learning rules for the unsupervised pre-training stage: soft-Winner-Takes-All (soft-WTA) and nonlinear Hebbian Principal Component Analysis (HPCA). Our approach targets sample-efficiency scenarios, where the amount of available labeled training samples is very limited and unsupervised pre-training is therefore beneficial. We performed experiments on the CIFAR10, CIFAR100, and Tiny ImageNet datasets. Our results show that Hebbian pre-training outperforms Variational Auto-Encoder (VAE) pre-training in almost all cases, with HPCA generally performing better than soft-WTA.
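To give an intuition for the soft-WTA rule mentioned above, the sketch below shows one Hebbian weight update for a layer of linear neurons, where a softmax over neuron activations implements the soft competition. This is an illustrative NumPy sketch based on the standard soft-WTA formulation from the literature, not the paper's actual implementation; the function name, learning rate, and temperature parameter are assumptions.

```python
import numpy as np

def soft_wta_update(W, x, lr=0.01, temperature=0.1):
    """One Hebbian soft-WTA update (illustrative sketch, not the paper's code).

    W: (num_neurons, input_dim) weight matrix; x: (input_dim,) input vector.
    Each neuron's weights move toward the current input, gated by a softmax
    over neuron similarities -- the "soft" winner-takes-all competition.
    """
    s = W @ x                                     # similarity of each neuron to x
    r = np.exp((s - s.max()) / temperature)       # stabilized softmax numerator
    r /= r.sum()                                  # responsibilities, sum to 1
    # Hebbian update pulling each neuron's weights toward x,
    # scaled by how strongly that neuron "won" the competition:
    W += lr * r[:, None] * (x[None, :] - W)
    return W
```

With a low temperature the softmax approaches hard WTA (only the best-matching neuron learns); raising it spreads the update across neurons, which is the knob that distinguishes soft-WTA from the hard variant.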
Files
- LOD_2021_paper_152.pdf (433.6 kB, md5:af1fb588a871a8677e7eb624703ebf24)