Conference paper Open Access

Continual Learning with Echo State Networks

Andrea Cossu; Davide Bacciu; Antonio Carta; Claudio Gallicchio; Vincenzo Lomonaco

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Andrea Cossu</dc:creator>
  <dc:creator>Davide Bacciu</dc:creator>
  <dc:creator>Antonio Carta</dc:creator>
  <dc:creator>Claudio Gallicchio</dc:creator>
  <dc:creator>Vincenzo Lomonaco</dc:creator>
  <dc:description>Continual Learning (CL) refers to a learning setup where
data is non-stationary and the model has to learn without forgetting
existing knowledge. The study of CL for sequential patterns revolves around
trained recurrent networks. In this work, instead, we introduce CL in the
context of Echo State Networks (ESNs), where the recurrent component
is kept fixed. We provide the first evaluation of catastrophic forgetting in
ESNs and we highlight the benefits of using CL strategies that are not
applicable to trained recurrent models. Our results confirm the ESN as a
promising model for CL and open the way to its use in streaming scenarios.</dc:description>
  <dc:relation>info:eu-repo/grantAgreement/EC/Horizon 2020 Framework Programme - Research and Innovation action/871385/</dc:relation>
  <dc:subject>continual learning; echo state networks; recurrent neural networks</dc:subject>
  <dc:title>Continual Learning with Echo State Networks</dc:title>
</oai_dc:dc>