Conference paper Open Access

Federated Reservoir Computing Neural Networks

Bacciu, Davide; Di Sarli, Daniele; Faraji, Pouria; Gallicchio, Claudio; Micheli, Alessio


DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="DOI">10.5281/zenodo.5256697</identifier>
  <creators>
    <creator>
      <creatorName>Bacciu, Davide</creatorName>
      <givenName>Davide</givenName>
      <familyName>Bacciu</familyName>
      <affiliation>University of Pisa</affiliation>
    </creator>
    <creator>
      <creatorName>Di Sarli, Daniele</creatorName>
      <givenName>Daniele</givenName>
      <familyName>Di Sarli</familyName>
      <affiliation>University of Pisa</affiliation>
    </creator>
    <creator>
      <creatorName>Faraji, Pouria</creatorName>
      <givenName>Pouria</givenName>
      <familyName>Faraji</familyName>
    </creator>
    <creator>
      <creatorName>Gallicchio, Claudio</creatorName>
      <givenName>Claudio</givenName>
      <familyName>Gallicchio</familyName>
    </creator>
    <creator>
      <creatorName>Micheli, Alessio</creatorName>
      <givenName>Alessio</givenName>
      <familyName>Micheli</familyName>
    </creator>
  </creators>
  <titles>
    <title>Federated Reservoir Computing Neural Networks</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2021</publicationYear>
  <subjects>
    <subject>Reservoir Computing</subject>
    <subject>Federated Learning</subject>
    <subject>Recurrent Neural Networks</subject>
  </subjects>
  <dates>
    <date dateType="Issued">2021-08-25</date>
  </dates>
  <resourceType resourceTypeGeneral="ConferencePaper"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/5256697</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsVersionOf">10.5281/zenodo.5256696</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsPartOf">https://zenodo.org/communities/teaching-h2020</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="https://creativecommons.org/licenses/by/4.0/legalcode">Creative Commons Attribution 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;A critical aspect in Federated Learning is the aggregation strategy for the combination of multiple models, trained on the edge, into a single model that incorporates all the knowledge in the federation. Common Federated Learning approaches for Recurrent Neural Networks (RNNs) do not provide guarantees on the predictive performance of the aggregated model. In this paper we show how the use of Echo State Networks (ESNs), which are efficient state-of-the-art RNN models for time-series processing, enables a form of federation that is optimal in the sense that it produces models mathematically equivalent to the corresponding centralized model. Furthermore, the proposed method is compliant with privacy constraints.&lt;br&gt;
The proposed method, which we denote as Incremental Federated Learning, is experimentally evaluated against an averaging strategy on two datasets for human state and activity recognition.&lt;/p&gt;</description>
  </descriptions>
</resource>
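The abstract states that Echo State Networks enable a federation that is mathematically equivalent to centralized training. A plausible mechanism, consistent with how ESN readouts are trained by ridge regression, is that each client shares only the sufficient statistics of its local least-squares problem, which the server sums before solving once. The sketch below illustrates this idea under stated assumptions; all names, sizes, and hyperparameters are illustrative, not the authors' code.

```python
# Hedged sketch: federated ESN readout training via summed ridge-regression
# statistics (H^T H and H^T y). Each client keeps its raw data private and
# sends only these aggregates. Reservoir size, scaling, and regularization
# are assumed values for illustration.
import numpy as np

rng = np.random.default_rng(0)

N_UNITS = 50    # reservoir size (assumed)
RIDGE = 1e-6    # Tikhonov regularization (assumed)

# Fixed, untrained reservoir weights, shared by every client.
W_in = rng.uniform(-0.1, 0.1, size=N_UNITS)
W = rng.uniform(-1.0, 1.0, size=(N_UNITS, N_UNITS))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius below 1

def reservoir_states(u):
    """Run the reservoir over a 1-D input sequence, collecting states."""
    x = np.zeros(N_UNITS)
    states = []
    for u_t in u:
        x = np.tanh(W_in * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

def client_statistics(u, y):
    """Local pass: return H^T H and H^T y. No raw data leaves the client."""
    H = reservoir_states(u)
    return H.T @ H, H.T @ y

# Three clients, each holding a private input/target sequence.
clients = [(rng.standard_normal(100), rng.standard_normal(100))
           for _ in range(3)]

# Server: accumulate the statistics incrementally, then solve once.
A = np.zeros((N_UNITS, N_UNITS))
b = np.zeros(N_UNITS)
for u, y in clients:
    A_i, b_i = client_statistics(u, y)
    A += A_i
    b += b_i
W_out_fed = np.linalg.solve(A + RIDGE * np.eye(N_UNITS), b)

# Centralized baseline: pool all state/target pairs and train once.
H_all = np.vstack([reservoir_states(u) for u, _ in clients])
y_all = np.concatenate([y for _, y in clients])
W_out_cen = np.linalg.solve(H_all.T @ H_all + RIDGE * np.eye(N_UNITS),
                            H_all.T @ y_all)

# The federated and centralized readouts coincide up to floating-point error,
# matching the equivalence claimed in the abstract.
print(np.allclose(W_out_fed, W_out_cen))
```

The equivalence holds because the pooled normal equations decompose as a sum over clients: H_all^T H_all = Σ_i H_i^T H_i and H_all^T y_all = Σ_i H_i^T y_i, so summing per-client statistics reconstructs the centralized system exactly.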