Online federated learning with imbalanced class distribution
The federated learning paradigm is a viable approach for handling huge datasets and for exploiting powerful processing nodes on the edge. Online federated learning can maximise this potential by re-training a shared model on the edge nodes and merging the updated models centrally. This approach allows edge nodes to exchange knowledge without exchanging their own training data, thus preserving their privacy. In this work, we examine online federated learning in an extreme case of imbalanced class distribution between the central node and the edge nodes. We study the effects of the different parameters of the online federated learning process and propose a technique that boosts classification performance above that of the baseline centralised learning approach.
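The abstract does not specify how the updated edge models are merged centrally; a common choice is FedAvg-style weighted parameter averaging. The sketch below is a minimal, hypothetical illustration of that merge step (the function `fed_avg` and the toy client weights are assumptions, not the authors' method), assuming each client contributes its parameters weighted by its local sample count:

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """FedAvg-style merge: average per-client parameter arrays,
    weighted by each client's number of local training samples.

    client_weights: list (one entry per client) of lists of np.ndarray
    client_sizes:   local sample count for each client
    """
    total = float(sum(client_sizes))
    merged = []
    # zip(*...) groups the same layer across all clients.
    for layer in zip(*client_weights):
        merged.append(sum(w * (n / total) for w, n in zip(layer, client_sizes)))
    return merged

# Two toy "clients", each holding one weight matrix and one bias vector.
w_a = [np.array([[1.0, 1.0]]), np.array([0.0])]
w_b = [np.array([[3.0, 3.0]]), np.array([2.0])]
# Client b holds three times as much data, so its parameters dominate.
merged = fed_avg([w_a, w_b], client_sizes=[1, 3])
```

Under class imbalance, plain sample-count weighting can let over-represented clients dominate the merged model, which is one motivation for adjusting the aggregation as the paper investigates.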