Published September 5, 2021 | Version v1
Preprint | Open Access

Impact of Network Topology on the Convergence of Decentralized Federated Learning Systems

Description

Federated learning is a popular framework for harnessing the computational power of edge devices to train a machine learning model in a distributed fashion. However, it is not always feasible or profitable to rely on a centralized server that controls and synchronizes the training process. In this paper, we consider the problem of training a machine learning model over a network of nodes in a fully decentralized fashion. In particular, we look for empirical evidence of how sensitive the training process is to various network characteristics and communication parameters. We present the outcome of several simulations conducted with different network topologies, datasets, and machine learning models.
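The core mechanism in decentralized training of this kind is that, instead of synchronizing through a central server, each node repeatedly averages its model parameters with those of its neighbours in the network. A minimal sketch of this idea is below; it is not the paper's implementation, and the ring topology, uniform 1/3 mixing weights, and function names are illustrative assumptions.

```python
import numpy as np

def ring_mixing_matrix(n):
    # Doubly stochastic mixing matrix for a ring topology: each node
    # averages its own parameters with those of its two neighbours.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1 / 3
        W[i, (i - 1) % n] = 1 / 3
        W[i, (i + 1) % n] = 1 / 3
    return W

def decentralized_average(params, W, rounds):
    # One communication round replaces every node's parameters with the
    # weighted average of its neighbours' parameters (gossip averaging).
    x = params.copy()
    for _ in range(rounds):
        x = W @ x
    return x

n = 8
rng = np.random.default_rng(0)
params = rng.normal(size=(n, 1))  # one scalar "model" per node, for illustration
consensus = decentralized_average(params, ring_mixing_matrix(n), rounds=200)
# All nodes approach the global mean of the initial parameters; a denser
# topology (larger spectral gap of W) would reach consensus in fewer rounds,
# which is the kind of topology sensitivity the simulations measure.
```

The rate at which the nodes reach consensus is governed by the second-largest eigenvalue of the mixing matrix, which depends directly on the network topology; this is one way topology influences convergence in decentralized federated learning.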

Files

Distributed_FL.pdf (556.4 kB, md5:31593e3c2984fa4221872dd3efcad2db)

Additional details

Related works

Is published in
Conference paper: 10.1109/ISCC53001.2021.9631460 (DOI)

Funding

TEACHING – A computing toolkit for building efficient autonomous applications leveraging humanistic intelligence 871385
European Commission