Mayet, Tsiry
Lambert, Anne
Le Guyadec, Pascal
Le Bolzer, Françoise
Schnitzler, François
2021-05-03
<p>We introduce Skip-Window, a method that allows recurrent neural networks (RNNs) to trade accuracy for computational cost during the analysis of a sequence. Like existing approaches, Skip-Window extends existing RNN cells with a mechanism that encourages the model to process fewer inputs. Unlike existing approaches, Skip-Window can respect a strict computational budget, making the model more suitable for limited hardware such as edge devices. We evaluate this approach on four datasets: a human activity recognition task, sequential MNIST, IMDB, and the adding task. Our results show that Skip-Window often exceeds the accuracy of existing approaches at a lower computational cost while strictly limiting that cost.</p>
https://doi.org/10.5281/zenodo.4911371
oai:zenodo.org:4911371
eng
Zenodo
https://zenodo.org/communities/ai4media
https://zenodo.org/communities/eu
https://doi.org/10.5281/zenodo.4911370
info:eu-repo/semantics/openAccess
Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
ICLR, 9th International Conference on Learning Representations, 2021
Recurrent neural networks, Flexibility, Efficiency, Computational resources
SkipW: Resource Adaptable RNN with Strict Upper Computational Limit
info:eu-repo/semantics/conferencePaper