Floating Gossip: Serverless Distributed Learning in Dynamic Scenarios

Vincenzo Mancuso
2026-01-01

Abstract

This paper studies the performance of Floating Gossip, a novel decentralized approach for Gossip Learning at the network edge. Floating Gossip utilizes Floating Content to facilitate location-based probabilistic evolution of Machine Learning models, without external infrastructure support. We investigate dynamic scenarios requiring continuous learning, leveraging a mean field approach to analyze Floating Gossip's performance boundaries. Our focus is on the quantity of data that users can integrate into their models, as a function of key system parameters. Unlike previous studies that separately optimize the communication or computational aspects of Gossip Learning, our methodology considers their combined effect. We validate our analysis through comprehensive simulations, demonstrating the high accuracy of our analytical model. Our methodology reveals Floating Gossip's effectiveness in training and updating Machine Learning models collaboratively, leveraging opportunistic exchanges between mobile users, while flexibly adapting to different user characteristics and mobility patterns. This research highlights Floating Gossip's potential for continuous, cooperative model training in dynamic, infrastructure-less environments, offering insight into its performance and its applicability in practice.
Rizzo, G., Perez Palma, N., Ajmone Marsan, M., Mancuso, V. (2026). Floating Gossip: Serverless Distributed Learning in Dynamic Scenarios. IEEE TRANSACTIONS ON MOBILE COMPUTING [10.1109/TMC.2026.3682564].
Files in this item:

Floating_Gossip_Journal___TMC_2025__revised.pdf (archive administrators only)
Type: Pre-print
Size: 1.71 MB
Format: Adobe PDF

Floating_Gossip_Serverless_Distributed_Learning_in_Dynamic_Scenarios.pdf (archive administrators only)
Description: ahead of print
Type: Publisher's version
Size: 1.88 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/704885