Published March 16, 2023 | Version v1
Journal article | Open Access

Data-driven comparison of federated learning and model personalization for electric load forecasting


Residential short-term electric load forecasting is essential in modern decentralized power systems. Load forecasting methods mostly rely on neural networks and require access to private and sensitive electric load data for model training. Conventional neural network training aggregates all data on a centralized server to train one global model; however, this aggregation of user data introduces security and data privacy risks. This study investigates the modern neural network training methods of federated learning and model personalization as potential solutions to these security and data privacy problems. In an extensive simulation, the investigated methods are compared with the conventional centralized method and a pre-trained baseline predictor. The study finds that the underlying structure of electric load data has a significant influence on a model's loss; a comparison of loss distributions therefore effectively compares data structures rather than model performance. As an alternative way to compare loss values, this study develops the "differential comparison". The method isolates model loss differences by comparing only the losses that two models incur on the same data sample and building a distribution of these differences. Using the differential comparison, model personalization is identified as the best performing training method for load forecasting among all analyzed methods, with superior performance in 59.1 % of all cases.
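The idea behind the differential comparison can be sketched as follows: instead of comparing two models' loss distributions directly (which mixes in the per-sample data structure), one subtracts the losses the two models incur on the same sample and studies the distribution of these differences. This is a minimal illustrative sketch, not the authors' implementation; the model names and the synthetic loss values are assumptions for demonstration only.

```python
import numpy as np

# Synthetic per-sample losses (e.g., MSE) of two forecasting models,
# evaluated on the SAME test samples. Values are illustrative only.
rng = np.random.default_rng(0)
losses_personalized = rng.gamma(shape=2.0, scale=0.5, size=1000)
losses_centralized = losses_personalized + rng.normal(0.05, 0.3, size=1000)

# Differential comparison: per-sample differences cancel the influence
# of the shared data sample, isolating the difference between models.
differences = losses_centralized - losses_personalized

# Fraction of samples on which the (hypothetical) personalized model
# achieves the lower loss.
win_rate = float(np.mean(differences > 0))
print(f"personalized model wins on {win_rate:.1%} of samples")
```

Because each difference is computed on one shared sample, the resulting distribution reflects model behavior rather than the heterogeneity of the underlying load data.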


+ Publication ID: hslu_100139 + Contribution type: Scientific media + Language: English + Last updated: 2024-01-15 14:34:25



Files (3.3 MB)


Additional details

Related works

Is identical to
10.1016/j.egyai.2023.100253 (DOI)