Boosting Semi-Supervised Federated Learning with Model Personalization and Client-Variance-Reduction

Shuai Wang (Singapore University of Technology and Design); Yanqing Xu (The Chinese University of Hong Kong, Shenzhen); Yanli Yuan (Singapore University of Technology and Design); Xiuhua Wang (Huazhong University of Science and Technology); Tony Quek (Singapore University of Technology and Design)

08 Jun 2023

Recently, federated learning (FL) has become increasingly appealing in distributed signal processing and machine learning. Nevertheless, the practical challenges of label deficiency and client heterogeneity form a bottleneck to its wide adoption. Although numerous efforts have been devoted to semi-supervised FL, most of the adopted algorithms follow the same spirit as FedAvg and thus suffer heavily from the adverse effects of client heterogeneity. In this paper, we boost semi-supervised FL by addressing this issue with model personalization and client-variance-reduction. In particular, we propose a novel and unified problem formulation based on pseudo-labeling and model interpolation. We then propose an effective algorithm, named FedCPSL, which judiciously adopts a novel momentum-based client-variance-reduction scheme together with normalized averaging. The convergence properties of FedCPSL are analyzed, showing that it is resilient to client heterogeneity and attains a sublinear convergence rate. Experimental results on image classification tasks are also presented to demonstrate the efficacy of FedCPSL over benchmark algorithms.
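As a rough illustration of two ingredients named in the abstract, the sketch below combines a client-side momentum update with normalized averaging of heterogeneous local updates on a toy quadratic objective. Everything here is an assumption for illustration: the quadratic losses, the hyperparameters, and the specific update forms are generic (FedNova-style normalization, plain SGD momentum) and are not the paper's actual FedCPSL personalization or variance-reduction updates.

```python
import numpy as np

# Hypothetical toy setup: each client holds a quadratic loss 0.5*||A_i w - b_i||^2,
# standing in for a local objective built from labeled and pseudo-labeled data.
rng = np.random.default_rng(0)
dim, num_clients = 5, 4
A = [rng.normal(size=(20, dim)) for _ in range(num_clients)]
b = [rng.normal(size=20) for _ in range(num_clients)]

def local_grad(i, w):
    return A[i].T @ (A[i] @ w - b[i])

w_global = np.zeros(dim)
momentum = [np.zeros(dim) for _ in range(num_clients)]  # per-client momentum state
lr, beta, rounds = 0.01, 0.9, 50

for r in range(rounds):
    deltas, steps = [], []
    for i in range(num_clients):
        w = w_global.copy()
        local_steps = int(rng.integers(1, 6))  # heterogeneous amounts of local work
        for _ in range(local_steps):
            g = local_grad(i, w)
            # generic momentum-smoothed direction (a stand-in for a
            # momentum-based variance-reduction scheme, not the FedCPSL update)
            momentum[i] = beta * momentum[i] + (1 - beta) * g
            w -= lr * momentum[i]
        deltas.append(w - w_global)
        steps.append(local_steps)
    # normalized averaging: scale each client's update by its number of local steps
    # so clients that perform more local work do not dominate the aggregate
    normalized = sum(d / s for d, s in zip(deltas, steps)) / num_clients
    w_global += np.mean(steps) * normalized

final_loss = sum(0.5 * np.linalg.norm(A[i] @ w_global - b[i]) ** 2
                 for i in range(num_clients))
print("final global loss:", final_loss)
```

The normalization step keeps the aggregate update unbiased with respect to how many local steps each client ran, which is one common way to mitigate the drift caused by client heterogeneity.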
