Conference paper Open Access

Gated Echo State Networks: a preliminary study

Di Sarli, Daniele; Gallicchio, Claudio; Micheli, Alessio

Gating mechanisms are widely used in the context of Recurrent Neural Networks (RNNs) to improve the network's ability to deal with long-term dependencies within the data. The typical approach for training such networks involves the expensive algorithms of gradient descent and backpropagation. On the other hand, Reservoir Computing (RC) approaches like Echo State Networks (ESNs) are extremely efficient in terms of training time and resources thanks to their use of randomly initialized parameters that do not need to be trained. Unfortunately, basic ESNs are also unable to effectively deal with complex long-term dependencies. In this work, we start investigating the problem of equipping ESNs with gating mechanisms. Under rigorous experimental settings, we compare the behaviour of an ESN with randomized gate parameters (initialized with RC techniques) against several other models, including a leaky ESN and a fully trained gated RNN. We observe that the use of randomized gates by itself can increase the predictive accuracy of an ESN, but this increase is not meaningful when compared with other techniques. Given these results, we propose a research direction for successfully designing ESN models with gating mechanisms.
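The idea described in the abstract can be illustrated with a minimal sketch: an ESN in which the gate parameters, like the reservoir weights, are randomly initialized with standard RC techniques (spectral-radius rescaling) and left untrained, with only a linear readout fitted by ridge regression. All names, hyperparameter values, and the toy task below are illustrative assumptions, not the authors' actual experimental setup.

```python
import numpy as np

def spectral_rescale(W, rho=0.9):
    # Rescale a random matrix to a desired spectral radius (standard ESN init).
    return W * (rho / max(abs(np.linalg.eigvals(W))))

rng = np.random.default_rng(42)
n_in, n_res, T = 1, 50, 200

# Reservoir and input weights: random, never trained.
W = spectral_rescale(rng.uniform(-1, 1, (n_res, n_res)))
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))

# Gate parameters (hypothetical names), also random and untrained.
Wg = spectral_rescale(rng.uniform(-1, 1, (n_res, n_res)))
Wg_in = rng.uniform(-0.1, 0.1, (n_res, n_in))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

u = rng.uniform(-1, 1, (T, n_in))  # toy input sequence
x = np.zeros(n_res)
states = []
for t in range(T):
    g = sigmoid(Wg @ x + Wg_in @ u[t])       # randomized gate activation
    x_cand = np.tanh(W @ x + W_in @ u[t])    # candidate state
    x = (1 - g) * x + g * x_cand             # gated (leaky-like) state update
    states.append(x)
states = np.asarray(states)

# Only the linear readout is trained, e.g. by ridge regression,
# on a toy task of recalling the previous input value.
y_target = np.roll(u[:, 0], 1)
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ y_target)
```

Note that with a constant gate (g fixed to a scalar leak rate), the update reduces exactly to the leaky ESN baseline mentioned in the abstract; the randomized gate makes the effective leak state- and input-dependent.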
