
RingSFL: An Adaptive Split Federated Learning Towards Taming Client Heterogeneity
  • Jinglong Shen ,
  • Nan Cheng ,
  • Xiucheng Wang ,
  • Feng Lyu ,
  • Wenchao Xu ,
  • Zhi Liu ,
  • Khalid Aldubaikhy ,
  • Xuemin Shen
Corresponding author: Jinglong Shen, Xidian University ([email protected])

Abstract

Federated learning (FL) has gained increasing attention due to its ability to train models collaboratively while protecting client data privacy. However, vanilla FL cannot adapt to client heterogeneity, which degrades training efficiency due to stragglers, and it remains vulnerable to privacy leakage. To address these issues, this paper proposes RingSFL, a novel distributed learning scheme that integrates FL with a model split mechanism to adapt to client heterogeneity while maintaining data privacy. In RingSFL, all clients form a ring topology. Each client's model, instead of being trained locally, is split and trained among all clients along the ring in a pre-defined direction. By properly setting the propagation lengths of heterogeneous clients, the straggler effect is mitigated and the training efficiency of the system is significantly enhanced. Additionally, since the local models are blended, an eavesdropper is less likely to obtain a complete model and recover the raw data, which improves data privacy. Experimental results on both simulation and prototype systems show that RingSFL achieves better convergence performance than benchmark methods on independent and identically distributed (IID) and non-IID datasets, while effectively preventing eavesdroppers from recovering training data.
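The abstract's key idea of "properly setting the propagation lengths of heterogeneous clients" can be illustrated with a minimal sketch. This is not the authors' code; it assumes a hypothetical helper `assign_propagation_lengths` that splits a model's layers among clients in proportion to their compute capacities, so faster clients handle more layers and all clients finish at roughly the same time, mitigating stragglers.

```python
# Hypothetical sketch (not from the paper): assign each client a
# propagation length (a contiguous slice of the model's layers)
# proportional to its compute capacity.

def assign_propagation_lengths(num_layers, capacities):
    """Split num_layers among clients proportionally to capacities.

    Returns a list of per-client layer counts summing to num_layers.
    """
    total = sum(capacities)
    # Ideal (fractional) share for each client.
    ideal = [num_layers * c / total for c in capacities]
    lengths = [int(x) for x in ideal]
    # Hand out the leftover layers to the clients with the largest
    # fractional remainders (largest-remainder rounding).
    remainder = num_layers - sum(lengths)
    order = sorted(range(len(capacities)),
                   key=lambda i: ideal[i] - lengths[i], reverse=True)
    for i in order[:remainder]:
        lengths[i] += 1
    return lengths

# Example: a 12-layer model split across three clients whose compute
# capacities are 1x, 2x, and 3x.
print(assign_propagation_lengths(12, [1, 2, 3]))  # → [2, 4, 6]
```

Under this proportional assignment, a client with twice the compute trains twice as many layers per pass around the ring, which is the intuition behind the straggler mitigation described above.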
Published in IEEE Transactions on Mobile Computing, 2023, pp. 1-16. DOI: 10.1109/TMC.2023.3309633