Published June 1, 2023 | Version: Submitted Manuscript
Conference paper | Open Access

Fault Pruning: Robust Training of Neural Networks with Memristive Weights

  • 1. Institute of Theoretical Computer Science, Graz University of Technology
  • 2. School of Engineering, University of Edinburgh

Description

Abstract. Neural networks with memristive memory for weights have been proposed as an energy-efficient solution for scaling up neural network implementations. However, training such memristive neural networks remains challenging due to various memristor imperfections and faulty memristive elements. These imperfections and faults become increasingly severe as the density of memristor arrays is increased to scale up weight memory. Here, we propose fault pruning, a robust training scheme for memristive neural networks based on the idea of identifying faulty memristive behavior on the fly during training and pruning the corresponding connections. We test this algorithm in simulations of memristive neural networks with both feed-forward and convolutional architectures on standard object recognition data sets. We show that fault pruning mitigates the detrimental effect of memristor faults on network training.
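
The full algorithm is specified in the paper itself; as a rough illustration of the idea sketched in the abstract, the following Python/NumPy snippet flags weights that repeatedly fail to follow their programmed updates (a simple stuck-at fault model) and prunes the corresponding connections. The class and parameter names, the fault model, and the tolerance and patience values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class FaultPruningLayer:
    """Toy dense layer whose weights live on simulated memristors.

    Hypothetical sketch of the fault-pruning idea: after each weight
    update, compare the value we tried to program with the value read
    back from the (simulated) device. Weights that repeatedly fail to
    follow their updates are treated as faulty and pruned (masked to 0).
    """

    def __init__(self, n_in, n_out, fault_rate=0.05, tol=1e-2, patience=3):
        self.w = rng.normal(0.0, 0.1, (n_in, n_out))
        self.mask = np.ones_like(self.w)           # 1 = active, 0 = pruned
        self.strikes = np.zeros_like(self.w)       # consecutive failed writes
        self.tol, self.patience = tol, patience
        # Simulated stuck-at faults: these devices ignore programming.
        self.stuck = rng.random(self.w.shape) < fault_rate
        self.stuck_val = rng.normal(0.0, 0.1, self.w.shape)

    def _program(self, target):
        """Write `target` to the devices; stuck devices keep their value."""
        return np.where(self.stuck, self.stuck_val, target)

    def update(self, grad, lr=0.1):
        target = self.w - lr * grad * self.mask       # intended new weights
        readback = self._program(target)              # what the array stores
        failed = np.abs(readback - target) > self.tol # on-the-fly detection
        self.strikes = np.where(failed, self.strikes + 1, 0)
        self.mask[self.strikes >= self.patience] = 0  # prune faulty synapses
        self.w = readback * self.mask

layer = FaultPruningLayer(8, 4)
for _ in range(5):
    layer.update(grad=rng.normal(0.0, 1.0, layer.w.shape))
print("pruned connections:", int((layer.mask == 0).sum()))
```

In this sketch, pruning is realized by a persistent binary mask rather than by physically altering the device, so a faulty memristor simply no longer contributes to the network's computation or receives gradient updates.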

Notes

This preprint has not undergone peer review or any post-submission improvements or corrections. The Version of Record of this contribution is published in Unconventional Computation and Natural Computation (UCNC 2023), Lecture Notes in Computer Science, vol. 14003, Springer, Cham, and is available online at https://doi.org/10.1007/978-3-031-34034-5_9.

Files (776.6 kB)

Fault_pruning_Submitted_Manuscript_Added_DOI.pdf (776.6 kB, md5:aa4c6865740fda64a6e0cd6866e809c8)

Additional details

Funding

  • SYNCH – A SYnaptically connected brain-silicon Neural Closed-loop Hybrid system (Grant 824162), European Commission
  • Spiking Memristive Architectures for Learning to Learn (Grant I 4670), FWF Austrian Science Fund