Conference paper Open Access

Continual Learning with Echo State Networks

Andrea Cossu; Davide Bacciu; Antonio Carta; Claudio Gallicchio; Vincenzo Lomonaco

Continual Learning (CL) refers to a learning setup where
data is non-stationary and the model has to learn without forgetting
existing knowledge. The study of CL for sequential patterns revolves around
trained recurrent networks. In this work, instead, we introduce CL in the
context of Echo State Networks (ESNs), where the recurrent component
is kept fixed. We provide the first evaluation of catastrophic forgetting in
ESNs and highlight the benefits of using CL strategies that are not
applicable to trained recurrent models. Our results confirm the ESN as a
promising model for CL and open the way to its use in streaming scenarios.
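As a minimal sketch of the fixed-reservoir idea the abstract describes: in an ESN, the input and recurrent weights are drawn at random and never trained, and only a linear readout is fit on the collected reservoir states. The code below is an illustrative toy, not the paper's implementation; all names, sizes, and the ridge-regression readout are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res, n_out = 3, 100, 2
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))   # fixed input weights (untrained)
W = rng.normal(size=(n_res, n_res))            # fixed recurrent weights (untrained)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # rescale spectral radius below 1

def run_reservoir(U):
    """Collect reservoir states for an input sequence U of shape (T, n_in)."""
    h = np.zeros(n_res)
    states = []
    for u in U:
        h = np.tanh(W_in @ u + W @ h)          # state update with fixed weights
        states.append(h.copy())
    return np.array(states)

# Train only the linear readout, here via ridge regression on dummy data.
U = rng.normal(size=(200, n_in))
Y = rng.normal(size=(200, n_out))              # placeholder targets
H = run_reservoir(U)
ridge = 1e-3
W_out = np.linalg.solve(H.T @ H + ridge * np.eye(n_res), H.T @ Y)
pred = H @ W_out
```

Because only `W_out` is trained, CL strategies that operate on a single linear layer (e.g. per-task readouts) become applicable, which trained recurrent models do not allow.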

Files (124.0 kB)
Name: esn.pdf (md5:2360d995c348273be972488494dc4daa), 124.0 kB