Published August 5, 2021 | Version v1
Journal article | Open Access

Continual learning for recurrent neural networks: An empirical evaluation

  • University of Pisa

Description

Learning continuously throughout a model's lifetime is fundamental to deploying machine learning solutions that are robust to drifts in the data distribution. Advances in Continual Learning (CL) with recurrent neural networks could pave the way to a large number of applications where the incoming data is non-stationary, such as natural language processing and robotics. However, the existing body of work on the topic is still fragmented, with application-specific approaches whose assessment relies on heterogeneous learning protocols and datasets. In this paper, we organize the literature on CL for sequential data processing by providing a categorization of the contributions and a review of the benchmarks. We propose two new benchmarks for CL with sequential data, built on existing datasets, whose characteristics resemble real-world applications.

We also provide a broad empirical evaluation of CL and recurrent neural networks in the class-incremental scenario, testing their ability to mitigate forgetting with a number of strategies that are not specific to sequential data processing. Our results highlight the key role played by sequence length and the importance of a clear specification of the CL scenario.
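To make the evaluated setting concrete, the sketch below trains an LSTM classifier over a stream of class-incremental experiences and mitigates forgetting with a naive replay buffer, one example of the generic (non-sequence-specific) strategies the paragraph above refers to. It is a minimal PyTorch illustration on synthetic data: the two-experience split, the model sizes, the buffer size, and all other hyperparameters are illustrative assumptions, not the paper's actual protocol.

import torch
import torch.nn as nn

torch.manual_seed(0)

SEQ_LEN, FEAT, N_CLASSES, PER_CLASS = 20, 8, 4, 200

def make_class_data(c):
    # Sequences for class c: Gaussian noise around a class-specific mean.
    x = torch.randn(PER_CLASS, SEQ_LEN, FEAT) + 0.5 * c
    y = torch.full((PER_CLASS,), c, dtype=torch.long)
    return x, y

# Two "experiences", each introducing two new classes (class-incremental).
experiences = [(0, 1), (2, 3)]

class LSTMClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(FEAT, 32, batch_first=True)
        self.head = nn.Linear(32, N_CLASSES)

    def forward(self, x):
        out, _ = self.lstm(x)          # (batch, seq, hidden)
        return self.head(out[:, -1])   # classify from the last time step

model = LSTMClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
buffer_x, buffer_y = [], []            # replay buffer of past samples

for classes in experiences:
    xs, ys = zip(*(make_class_data(c) for c in classes))
    x, y = torch.cat(xs), torch.cat(ys)
    for epoch in range(20):
        perm = torch.randperm(len(x))
        for i in range(0, len(x), 32):
            bx, by = x[perm[i:i + 32]], y[perm[i:i + 32]]
            if buffer_x:  # replay: mix stored past samples into the batch
                idx = torch.randint(len(buffer_x), (16,)).tolist()
                bx = torch.cat([bx, torch.stack([buffer_x[k] for k in idx])])
                by = torch.cat([by, torch.stack([buffer_y[k] for k in idx])])
            opt.zero_grad()
            loss_fn(model(bx), by).backward()
            opt.step()
    # Keep a small random sample of this experience for future replay.
    for k in torch.randperm(len(x))[:50].tolist():
        buffer_x.append(x[k])
        buffer_y.append(y[k])

# Evaluate on every experience to expose forgetting of earlier classes.
model.eval()
with torch.no_grad():
    for exp_id, classes in enumerate(experiences):
        xs, ys = zip(*(make_class_data(c) for c in classes))
        x, y = torch.cat(xs), torch.cat(ys)
        acc = (model(x).argmax(1) == y).float().mean().item()
        print(f"experience {exp_id} (classes {classes}): accuracy {acc:.2f}")

Running this with the replay branch disabled typically shows the pattern the abstract alludes to: accuracy on the first experience collapses after training on the second, while even a small replay buffer recovers much of it.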

Files

2103.07492.pdf (1.6 MB)
md5:e168aef7dad0e2f8f6d5d8711223b168

Additional details

Funding

TEACHING – A computing toolkit for building efficient autonomous applications leveraging humanistic intelligence (Grant No. 871385)
European Commission