
Published June 25, 2018 | Version v0.2.0

Neural Network Distiller

  • Intel AI Lab

Description

  • PyTorch 0.4 support
  • An implementation of Baidu's RNN pruning paper from ICLR 2017: Narang, Sharan; Diamos, Gregory; Sengupta, Shubho; Elsen, Erich (2017). "Exploring Sparsity in Recurrent Neural Networks" (https://arxiv.org/abs/1704.05119)
  • Added a word language model pruning example using AGP (Automated Gradual Pruning) and Baidu's RNN pruning (a pruning-schedule sketch follows this list)
  • Quantization-aware training (4-bit quantization); a simulated-quantization sketch follows this list
  • New models: pre-activation ResNet for ImageNet and CIFAR, and AlexNet with batch-norm
  • New documentation content on quantization
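
The AGP and Baidu RNN pruning items above both revolve around gradually raising weight sparsity over the course of training. As a rough illustration, here is a minimal PyTorch sketch of the AGP cubic sparsity schedule from Zhu & Gupta (2017) combined with plain magnitude pruning. The function names, the step-based scheduling, and the mask-after-optimizer-step usage are illustrative assumptions, not Distiller's actual API.

    import torch

    def agp_sparsity(step, initial_sparsity, final_sparsity, begin_step, end_step):
        # Automated Gradual Pruning schedule (Zhu & Gupta, 2017):
        # sparsity rises from initial_sparsity to final_sparsity along
        # a cubic curve, then holds at final_sparsity.
        if step < begin_step:
            return initial_sparsity
        if step >= end_step:
            return final_sparsity
        progress = (step - begin_step) / float(end_step - begin_step)
        return final_sparsity + (initial_sparsity - final_sparsity) * (1.0 - progress) ** 3

    def magnitude_mask(weight, sparsity):
        # Binary mask that zeroes out the smallest-magnitude fraction of `weight`.
        k = int(sparsity * weight.numel())
        if k == 0:
            return torch.ones_like(weight)
        threshold = weight.abs().flatten().kthvalue(k).values
        return (weight.abs() > threshold).float()

    # Usage sketch (hypothetical training loop): after each optimizer step,
    #   s = agp_sparsity(step, 0.0, 0.85, begin_step=0, end_step=10000)
    #   layer.weight.data *= magnitude_mask(layer.weight.data, s)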
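For the quantization-aware training item, the usual approach is to simulate quantization in the forward pass while letting gradients flow through unchanged (a straight-through estimator). The sketch below assumes that approach with per-tensor asymmetric linear quantization; the class name and range-tracking scheme are illustrative and not Distiller's implementation.

    import torch

    class FakeQuant(torch.autograd.Function):
        # Quantize-dequantize in the forward pass; pass gradients through
        # unchanged in the backward pass (straight-through estimator).
        @staticmethod
        def forward(ctx, x, num_bits=4):
            qmax = 2 ** num_bits - 1
            scale = (x.max() - x.min()).clamp(min=1e-8) / qmax  # asymmetric range
            zero_point = x.min()
            q = torch.round((x - zero_point) / scale).clamp(0, qmax)
            return q * scale + zero_point  # dequantized, with quantization error baked in

        @staticmethod
        def backward(ctx, grad_output):
            return grad_output, None  # identity gradient w.r.t. x; none for num_bits

    # Usage sketch: inside a module's forward(), quantize weights on the fly:
    #   w_q = FakeQuant.apply(self.weight, 4)
    #   out = torch.nn.functional.linear(x, w_q)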

Files

NervanaSystems/distiller-v0.2.0.zip (21.1 MB)
md5:661012107395b91072725c84c0eccbc0
