
Augello A., Falzone G., Lo Re G. (2023). DCFL: Dynamic Clustered Federated Learning under Differential Privacy Settings. In 2023 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops) (pp. 614-619). Institute of Electrical and Electronics Engineers Inc. [10.1109/PerComWorkshops56833.2023.10150285].

DCFL: Dynamic Clustered Federated Learning under Differential Privacy Settings

Augello A.; Falzone G.; Lo Re G.
2023-03-01

Abstract

Federated Learning (FL) allows training machine learning models on a dataset distributed amongst multiple clients without disclosing sensitive data. Each FL client, however, might have a different data distribution, with a detrimental effect on the performance of the trained model. In this paper, we present a dynamic clustering algorithm (DCFL) that allows the server to cluster FL clients based on their model updates, letting the server adapt to changes in the data distribution and supporting the addition of new clients. Moreover, we propose a novel distance metric to estimate the distance between model updates by different clients. We evaluate our approach in a wide range of experimental settings, comparing it against the standard FedAvg algorithm and divisive clustering on the EMNIST dataset. Our approach outperforms the baselines, yielding higher accuracy and lower variance for the participating clients.
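The clustered aggregation described in the abstract can be sketched as follows. This is an illustrative sketch only, not the paper's DCFL algorithm: the cosine-based distance stands in for the paper's novel metric, and the greedy threshold clustering (with the `threshold` parameter) is an assumption for demonstration.

```python
import numpy as np

def update_distance(u, v):
    # Illustrative distance between two flattened model updates:
    # cosine distance (1 - cosine similarity). The paper proposes its
    # own metric; this is a stand-in.
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

def cluster_updates(updates, threshold=0.5):
    # Greedy threshold clustering (assumed for illustration): assign each
    # client update to the first cluster whose centroid is within
    # `threshold`, otherwise start a new cluster. New clients can join at
    # any round, matching the dynamic setting described in the abstract.
    clusters = []   # lists of client indices
    centroids = []  # running mean update per cluster
    for i, u in enumerate(updates):
        for c, centroid in enumerate(centroids):
            if update_distance(u, centroid) < threshold:
                clusters[c].append(i)
                members = np.stack([updates[j] for j in clusters[c]])
                centroids[c] = members.mean(axis=0)
                break
        else:
            clusters.append([i])
            centroids.append(u.copy())
    return clusters

def fedavg(updates, sizes):
    # Standard FedAvg aggregation: average of client updates weighted
    # by local dataset size, applied here within each cluster.
    w = np.asarray(sizes, dtype=float)
    w /= w.sum()
    return np.tensordot(w, np.stack(updates), axes=1)

# Toy example: two groups of clients with clearly different update directions.
rng = np.random.default_rng(0)
base_a = np.zeros(10); base_a[0] = 1.0
base_b = np.zeros(10); base_b[1] = 1.0
updates = [base_a + 0.01 * rng.normal(size=10) for _ in range(3)] + \
          [base_b + 0.01 * rng.normal(size=10) for _ in range(3)]

clusters = cluster_updates(updates)
per_cluster_models = [fedavg([updates[i] for i in c], [1] * len(c))
                      for c in clusters]
```

Aggregating per cluster rather than globally is what lets each group of clients with a similar data distribution get a model fitted to that distribution, which is the mechanism behind the higher accuracy and lower variance reported in the abstract.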
Sector IINF-05/A - Information processing systems
ISBN: 978-1-6654-5381-3
File in this record:
4. DCFL_Dynamic_Clustered_Federated_Learning_under_Differential_Privacy_Settings.pdf
Description: Paper + TOC
Type: Editorial version
Size: 848.8 kB
Format: Adobe PDF
Access: restricted (archive managers only; a copy may be requested)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/661473
Citations
  • PMC: ND
  • Scopus: 3
  • Web of Science: ND