Published August 5, 2021 | Version v1
Conference paper
Open
Continual Learning with Echo State Networks
- University of Pisa
Description
Continual Learning (CL) refers to a learning setup where
data is non-stationary and the model has to learn without forgetting
existing knowledge. The study of CL for sequential patterns revolves around
trained recurrent networks. In this work, instead, we introduce CL in the
context of Echo State Networks (ESNs), where the recurrent component
is kept fixed. We provide the first evaluation of catastrophic forgetting in
ESNs and highlight the benefits of using CL strategies that are not
applicable to trained recurrent models. Our results confirm the ESN as a
promising model for CL and open the way to its use in streaming scenarios.
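The description hinges on the ESN structure: a fixed random recurrent reservoir plus a trainable linear readout, so that any CL strategy only needs to act on the readout weights. A minimal sketch of that structure, assuming a NumPy implementation with hypothetical dimensions and a toy next-step prediction task (none of these specifics come from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration.
n_in, n_res = 1, 50

# Fixed random reservoir: these weights are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale to spectral radius < 1, a common sufficient condition
# for the echo state property.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def reservoir_states(inputs):
    """Run the fixed reservoir over a sequence; return one state per step."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Closed-form ridge regression: only the readout is learned,
    so CL strategies can operate on this matrix alone."""
    return np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                           states.T @ targets)

# Toy task: predict the next value of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])
S = reservoir_states(u)
W_out = train_readout(S[50:], y[50:])  # discard the initial transient
pred = S[50:] @ W_out
```

Because training reduces to a linear problem over frozen reservoir features, readout-only CL strategies (e.g. per-task readouts or regularized updates of `W_out`) become applicable in ways they are not for fully trained recurrent models.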
Files
esn.pdf (124.0 kB)
md5:2360d995c348273be972488494dc4daa