Conference paper Open Access
Gkalelis, Nikolaos; Mezaris, Vasileios
In this paper, a novel pruning framework is introduced that compresses deep convolutional networks by removing noisy or less discriminant filters in small fractional steps. The proposed framework utilizes a class-separability criterion that effectively exploits the labeling information in annotated training sets. Additionally, an asymptotic schedule for the pruning rate and scaling factor is adopted, so that the weights of the selected filters collapse gradually to zero, providing improved robustness. Experimental results on CIFAR-10, Google Speech Commands (GSC) and ImageNet32 (a downsampled version of ILSVRC-2012) show the efficacy of the proposed approach.
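The abstract describes two key ingredients: ranking filters by a class-separability score, and gradually scaling the weights of the least discriminant filters toward zero under an asymptotic schedule. The sketch below illustrates that idea in minimal form; the decay formula, the `gamma` parameter, and the per-filter scalar representation are illustrative assumptions, not the paper's actual method.

```python
def asymptotic_scale(epoch, total_epochs, gamma=4.0):
    """Scaling factor that decays from 1 toward 0 as training proceeds.

    The paper's exact schedule is not given in the abstract; this
    polynomial decay is just one plausible instantiation (assumption).
    """
    t = epoch / total_epochs      # normalized training progress in [0, 1]
    return (1.0 - t) ** gamma     # decays smoothly, reaching 0 at the end

def prune_step(filters, scores, prune_fraction, scale):
    """Shrink the weights of the least discriminant filters.

    filters: one weight value per filter (scalars, for illustration only)
    scores:  class-separability score per filter (higher = more useful)
    """
    k = int(len(filters) * prune_fraction)                  # filters to shrink
    ranked = sorted(range(len(filters)), key=lambda i: scores[i])
    selected = set(ranked[:k])                              # lowest-scoring filters
    return [w * scale if i in selected else w
            for i, w in enumerate(filters)]
```

Calling `prune_step` once per epoch with `scale = asymptotic_scale(epoch, total_epochs)` drives the selected filters' weights to zero gradually rather than in one hard cut, which is the "small fractional steps" behavior the abstract emphasizes.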
Name | Size
---|---
icme2020_preprint.pdf (md5:471138a82e4b2891f5b6e1c2c5c0c9c7) | 720.3 kB
Metric | Value
---|---
Views | 242
Downloads | 43
Data volume | 31.0 MB
Unique views | 239
Unique downloads | 43