Published June 1, 2023 | Version Submitted Manuscript
Conference paper

Fault Pruning: Robust Training of Neural Networks with Memristive Weights

  • 1. Institute of Theoretical Computer Science, Graz University of Technology
  • 2. School of Engineering, University of Edinburgh


Abstract. Neural networks with memristive memory for weights have been proposed as an energy-efficient solution for scaling up neural network implementations. However, training such memristive neural networks remains challenging due to various memristor imperfections and faulty memristive elements. These imperfections and faults become increasingly severe as the density of memristor arrays grows to scale up weight memory. Here, we propose fault pruning, a robust training scheme for memristive neural networks based on the idea of identifying faulty memristive behavior on the fly during training and pruning the corresponding connections. We test this algorithm in simulations of memristive neural networks using both feed-forward and convolutional architectures on standard object recognition data sets. We show that fault pruning is able to mitigate the detrimental effect of memristor faults on network training.
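The core idea of the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm, only a toy illustration under assumed specifics: a stuck-at fault model for the memristor array, and a fixed deviation threshold `tol` for flagging a device as faulty; both the fault model and the threshold are hypothetical choices.

```python
import numpy as np

def program_weights(target, stuck_mask, stuck_value):
    """Simulate writing target weights to a memristor array:
    stuck devices ignore the target and retain stuck_value."""
    return np.where(stuck_mask, stuck_value, target)

def fault_prune(target, programmed, tol=0.05):
    """Flag a connection as faulty when the programmed value
    deviates from the target by more than tol; return a binary
    keep-mask (1 = keep, 0 = prune)."""
    return (np.abs(programmed - target) <= tol).astype(float)

# Toy example: a 2x2 weight matrix with one stuck-at-zero device.
target = np.array([[0.8, -0.5],
                   [0.3,  0.9]])
stuck_mask = np.array([[False, True],
                       [False, False]])

programmed = program_weights(target, stuck_mask, stuck_value=0.0)
keep = fault_prune(target, programmed)

# Pruned (zeroed) weights are exactly the detected faulty devices;
# in training, this mask would be applied to gradients and weights.
effective = programmed * keep
```

In an actual training loop, such a mask would be recomputed as faults are detected on the fly and applied to both the forward pass and the weight updates, so that learning reroutes around the faulty devices.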


This preprint has not undergone peer review or any post-submission improvements or corrections. The Version of Record of this contribution is published in Unconventional Computation and Natural Computation (UCNC 2023), Lecture Notes in Computer Science, vol. 14003, Springer, Cham, and is available online at




Additional details


Funding:

  • SYNCH – A SYnaptically connected brain-silicon Neural Closed-loop Hybrid system (grant 824162), European Commission
  • Spiking Memristive Architectures for Learning to Learn (grant I 4670), FWF Austrian Science Fund