The Cost of Training Machine Learning Models Over Distributed Data Sources
- 1. Centre Tecnològic de Telecomunicacions de Catalunya (CTTC/CERCA)
- 2. Nokia Bell-Labs
Description
Federated learning is one of the most appealing alternatives to the standard centralized learning paradigm, allowing a heterogeneous set of devices to train a machine learning model without sharing their raw data.
However, it requires a central server to coordinate the learning process, which introduces potential scalability and security issues. To mitigate these issues, server-less federated learning approaches such as gossip federated learning and blockchain-enabled federated learning have been proposed in the literature.
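The key architectural difference between the approaches above lies in the model aggregation step. As a minimal illustrative sketch (not the authors' code), the following contrasts server-side FedAvg-style averaging with a server-less gossip round; the toy models, client sizes, and fully connected topology are all assumptions for illustration:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Standard federated learning: a central server computes a
    data-size-weighted average of the client model updates."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

def gossip_round(weights, neighbors):
    """Gossip federated learning: each node averages its model with its
    neighbors' models; no central server is involved."""
    new = []
    for i, w in enumerate(weights):
        peers = [weights[j] for j in neighbors[i]]
        new.append(np.mean([w] + peers, axis=0))
    return new

# Three toy clients whose "models" are one-dimensional weight vectors.
clients = [np.array([1.0]), np.array([2.0]), np.array([3.0])]
sizes = [10, 10, 10]
print(fedavg(clients, sizes))  # server-side weighted average -> [2.]

# Fully connected 3-node topology (illustrative assumption).
topology = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
state = clients
for _ in range(5):
    state = gossip_round(state, topology)
print(state[0])  # nodes converge toward the same average without a server
```

Blockchain-enabled federated learning replaces the server with a ledger that records and validates model updates, which is where the extra energy and data-sharing costs discussed below originate.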
In this work, we provide a complete overview of these three techniques and compare them along a comprehensive set of performance indicators, including model accuracy, time complexity, communication overhead, convergence time, and energy consumption.
An extensive simulation campaign enables a quantitative analysis covering both feedforward and convolutional neural network models.
Results show that gossip federated learning and the standard federated solution reach a similar level of accuracy, and that their energy consumption is influenced by the machine learning model adopted, the software library, and the hardware used.
In contrast, blockchain-enabled federated learning represents a viable solution for implementing decentralized learning with a higher level of security, at the cost of extra energy usage and data sharing.
Finally, we identify open issues in the two decentralized federated learning implementations and provide insights into potential extensions and possible research directions in this new research field.
Files
- main.pdf (4.4 MB, md5:07778be9fffcbc97b751225445bf8de9)