Software Open Access

Neural Network Distiller

Neta Zmora; Guy Jacob; Gal Novik

  • PyTorch 0.4 support
  • An implementation of Baidu's RNN pruning paper from ICLR 2017: Narang, Sharan; Diamos, Gregory; Sengupta, Shubho; Elsen, Erich (2017). "Exploring Sparsity in Recurrent Neural Networks" (https://arxiv.org/abs/1704.05119)
  • A word language model pruning example using AGP and Baidu's RNN pruning method
  • Quantization aware training (4-bit quantization)
  • New models: pre-activation ResNet for ImageNet and CIFAR, and AlexNet with batch-norm
  • New quantization documentation content
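The AGP pruner mentioned above refers to Automated Gradual Pruning (Zhu & Gupta, 2017), which raises sparsity from an initial to a final value along a cubic schedule. A minimal sketch of that schedule (the function name and argument names are illustrative, not Distiller's API):

```python
def agp_sparsity(step, initial_sparsity, final_sparsity, begin_step, end_step):
    """Target sparsity at `step` under the AGP cubic schedule:
    sparsity ramps quickly at first, then levels off toward the final value."""
    if step <= begin_step:
        return initial_sparsity
    if step >= end_step:
        return final_sparsity
    progress = (step - begin_step) / (end_step - begin_step)
    # Cubic interpolation: s_t = s_f + (s_i - s_f) * (1 - progress)^3
    return final_sparsity + (initial_sparsity - final_sparsity) * (1.0 - progress) ** 3
```

For example, pruning from 0% to 90% sparsity over 100 steps reaches 78.75% sparsity at the halfway point, reflecting the schedule's front-loaded pruning.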
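Quantization-aware training, also listed above, simulates low-precision inference during training by quantizing and immediately dequantizing values in the forward pass, so the model learns to tolerate the rounding error. A minimal sketch of a 4-bit asymmetric linear quantize-dequantize step (names are illustrative; this is not Distiller's API):

```python
def linear_quantize(x, num_bits=4):
    """Quantize a list of floats to `num_bits` unsigned integers with an
    asymmetric (affine) linear mapping, then dequantize back to floats.
    The returned floats are the values a quantized model would "see"."""
    x_min, x_max = min(x), max(x)
    levels = (1 << num_bits) - 1                  # 15 integer steps for 4 bits
    scale = (x_max - x_min) / levels if x_max > x_min else 1.0
    zero_point = round(-x_min / scale)            # integer code representing 0.0
    q = [max(0, min(levels, round(v / scale) + zero_point)) for v in x]
    dq = [(qi - zero_point) * scale for qi in q]  # dequantized approximation
    return dq, scale
```

At 4 bits only 16 distinct values are representable, so the round-trip error per element is bounded by roughly one quantization step (`scale`), which is why training with the quantization in the loop matters more at such low bit widths.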

Files (21.1 MB)

NervanaSystems/distiller-v0.2.0.zip (21.1 MB)
md5:661012107395b91072725c84c0eccbc0
