Augello A., Falzone G., Lo Re G. (2023). DCFL: Dynamic Clustered Federated Learning under Differential Privacy Settings. In 2023 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops) (pp. 614-619). Institute of Electrical and Electronics Engineers Inc. [10.1109/PerComWorkshops56833.2023.10150285].
DCFL: Dynamic Clustered Federated Learning under Differential Privacy Settings
Augello A.; Lo Re G.
2023-03-01
Abstract
Federated Learning (FL) allows training machine learning models on a dataset distributed amongst multiple clients without disclosing sensitive data. Each FL client, however, might have a different data distribution, with a detrimental effect on the performance of the trained model. In this paper, we present a dynamic clustering algorithm (DCFL) that allows the server to cluster FL clients based on their model updates, letting the server adapt to changes in the data distribution and supporting the addition of new clients. Moreover, we propose a novel distance metric to estimate the distance between model updates by different clients. We evaluate our approach in a wide range of experimental settings, comparing it against the standard FedAvg algorithm and divisive clustering on the EMNIST dataset. Our approach outperforms the baselines, yielding higher accuracy and lower variance for the participating clients.

File | Description | Type | Size | Format | Access
---|---|---|---|---|---
DCFL_Dynamic_Clustered_Federated_Learning_under_Differential_Privacy_Settings.pdf | Paper + TOC | Editorial version | 848.8 kB | Adobe PDF | Archive managers only; request a copy
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
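The abstract describes the server clustering FL clients by the distance between their model updates. The paper's actual distance metric and clustering procedure are not reproduced here; the sketch below is only an illustration of the general idea, assuming a simple cosine distance between flattened updates and a greedy threshold-based grouping (both are assumptions, not the authors' method).

```python
import numpy as np

def update_distance(u, v):
    # Illustrative metric only: cosine distance between two flattened
    # model updates. The paper proposes its own (different) metric.
    u = np.asarray(u, dtype=float).ravel()
    v = np.asarray(v, dtype=float).ravel()
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def cluster_clients(updates, threshold=0.5):
    # Greedy grouping (hypothetical): assign each client to the first
    # cluster whose representative update is closer than `threshold`,
    # otherwise open a new cluster. New clients can be added the same
    # way in later rounds, which mirrors the dynamic aspect described
    # in the abstract.
    clusters = []  # each entry: list of client indices
    reps = []      # representative update per cluster
    for i, u in enumerate(updates):
        for members, rep in zip(clusters, reps):
            if update_distance(u, rep) < threshold:
                members.append(i)
                break
        else:
            clusters.append([i])
            reps.append(u)
    return clusters
```

For example, two clients whose updates point in nearly the same direction end up in one cluster, while a client with a very different update starts its own.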