Conference paper Open Access

Continual Learning with Echo State Networks

Andrea Cossu; Davide Bacciu; Antonio Carta; Claudio Gallicchio; Vincenzo Lomonaco

Continual Learning (CL) refers to a learning setup where
data is non-stationary and the model has to learn without forgetting
existing knowledge. The study of CL for sequential patterns revolves around
trained recurrent networks. In this work, instead, we introduce CL in the
context of Echo State Networks (ESNs), where the recurrent component
is kept fixed. We provide the first evaluation of catastrophic forgetting in
ESNs and we highlight the benefits of using CL strategies that are not
applicable to trained recurrent models. Our results confirm the ESN as a
promising model for CL and open the way to its use in streaming scenarios.
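The abstract's central point is that in an ESN the recurrent reservoir is fixed at initialization and only a linear readout is trained. A minimal sketch of that idea (hypothetical illustration, not the authors' implementation; class name, hyperparameters, and the ridge-regression readout are assumptions based on standard ESN practice):

```python
import numpy as np

class ESN:
    """Minimal Echo State Network sketch: a fixed random recurrent
    reservoir with a linear readout trained by ridge regression.
    Illustrative only; not the implementation from the paper."""

    def __init__(self, n_in, n_res, n_out, spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale recurrent weights to the desired spectral radius,
        # a common heuristic for the echo state property.
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W = W
        self.W_out = np.zeros((n_out, n_res))

    def _states(self, X):
        # Drive the fixed reservoir with the input sequence X (T, n_in).
        h = np.zeros(self.W.shape[0])
        states = []
        for x in X:
            h = np.tanh(self.W_in @ x + self.W @ h)
            states.append(h)
        return np.array(states)

    def fit(self, X, Y, ridge=1e-6):
        # Only the readout is trained; reservoir weights never change.
        H = self._states(X)
        self.W_out = Y.T @ H @ np.linalg.inv(
            H.T @ H + ridge * np.eye(H.shape[1]))

    def predict(self, X):
        return self._states(X) @ self.W_out.T
```

Because training touches only `W_out`, CL strategies that would be impractical for fully trained recurrent networks (e.g. operating solely on the linear readout) become applicable, which is the setting the paper explores.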

Files (124.0 kB)

esn.pdf, 124.0 kB (md5:2360d995c348273be972488494dc4daa)
Statistics (all versions / this version)

Views: 52 / 52
Downloads: 46 / 46
Data volume: 5.7 MB / 5.7 MB
Unique views: 48 / 48
Unique downloads: 42 / 42
