5256697
doi
10.5281/zenodo.5256697
oai:zenodo.org:5256697
user-teaching-h2020
Di Sarli, Daniele
University of Pisa
Faraji, Pouria
Gallicchio, Claudio
Micheli, Alessio
Federated Reservoir Computing Neural Networks
Bacciu, Davide
University of Pisa
info:eu-repo/semantics/openAccess
Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
Reservoir Computing
Federated Learning
Recurrent Neural Networks
<p>A critical aspect in Federated Learning is the aggregation strategy for the combination of multiple models, trained on the edge, into a single model that incorporates all the knowledge in the federation. Common Federated Learning approaches for Recurrent Neural Networks (RNNs) do not provide guarantees on the predictive performance of the aggregated model. In this paper we show how the use of Echo State Networks (ESNs), which are efficient state-of-the-art RNN models for time-series processing, enables a form of federation that is optimal in the sense that it produces models mathematically equivalent to the corresponding centralized model. Furthermore, the proposed method is compliant with privacy constraints.<br>
The proposed method, which we denote as Incremental Federated Learning, is experimentally evaluated against an averaging strategy on two datasets for human state and activity recognition.</p>
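The exactness claimed in the abstract follows from the fact that an ESN's reservoir is fixed and only the linear readout is trained: with a ridge-regression readout, each client only needs to share the sufficient statistics X^T X and X^T y of its reservoir states, and summing these across clients yields the same solution as pooling all raw data centrally. The following is a minimal sketch of that idea (the reservoir sizes, ridge coefficient, and synthetic client data are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy setup: a shared, fixed (untrained) reservoir and two clients.
n_in, n_res, ridge = 1, 20, 1e-6
W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
W_res = rng.uniform(-1.0, 1.0, (n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))  # scale spectral radius below 1

def reservoir_states(u):
    """Run the shared reservoir over a 1-D input sequence; return the state matrix."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t:t + 1] + W_res @ x)
        states.append(x)
    return np.array(states)

# Each client holds a private (input, target) time series that never leaves the edge.
clients = [(rng.standard_normal(50), rng.standard_normal(50)) for _ in range(2)]

# Federated (incremental) aggregation: clients send only X^T X and X^T y,
# which the server sums before solving the ridge-regression readout once.
A = np.zeros((n_res, n_res))
b = np.zeros(n_res)
for u, y in clients:
    X = reservoir_states(u)
    A += X.T @ X
    b += X.T @ y
W_fed = np.linalg.solve(A + ridge * np.eye(n_res), b)

# Centralized baseline: pool all reservoir states and targets, solve once.
X_all = np.vstack([reservoir_states(u) for u, _ in clients])
y_all = np.concatenate([y for _, y in clients])
W_cen = np.linalg.solve(X_all.T @ X_all + ridge * np.eye(n_res), X_all.T @ y_all)

print(np.allclose(W_fed, W_cen))  # the federated and centralized readouts coincide
```

Because the aggregation exchanges only Gram matrices rather than raw sequences, the sketch also illustrates how the scheme can respect the privacy constraints mentioned in the abstract.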
Zenodo
2021-08-25
info:eu-repo/semantics/conferencePaper
5256696
1629964145.621078
223420
md5:a45073f203ce93fa1047d47766be0c61
https://zenodo.org/records/5256697/files/Incremental_Federated_Learning__Zenodo_.pdf
public
10.5281/zenodo.5256696
isVersionOf
doi