
Byzantine-Robust and Communication-Efficient Personalized Federated Learning

Xuechao He (Sun Yat-sen University); Jiaojiao Zhang (The Chinese University of Hong Kong); Qing Ling (Sun Yat-sen University)

06 Jun 2023

This paper investigates personalized federated learning, in which a group of workers is coordinated by a server to train correlated local models in addition to a common global model. This distributed statistical learning problem faces two challenges: the efficiency of information exchange between the workers and the server, and robustness to potentially malicious messages from so-called Byzantine workers. We propose a projected stochastic block gradient descent method to address the robustness issue. Therein, each regular worker learns in a personalized manner with the aid of the global model, and the server judiciously aggregates the local models via a Huber function-based descent step. To improve communication efficiency, we allow the regular workers to perform multiple local update steps per communication round. Convergence of the proposed method is established for non-convex personalized federated learning. Numerical experiments on neural network training validate the advantages of the proposed method over existing ones.
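To make the Huber function-based aggregation concrete, the sketch below shows how bounding each worker's pull on the global model caps Byzantine influence. This is a minimal sketch, not the paper's implementation: the function names (`huber_grad`, `robust_server_step`, `local_updates`) and parameters (`delta`, `lam`, `lr`, `num_steps`) are illustrative, and the proximal term in `local_updates` assumes a common personalization objective of the form f_i(x_i) + (lam/2)||x_i - x||^2, which may differ from the paper's exact formulation.

```python
import numpy as np

def huber_grad(diff, delta):
    """Gradient of the Huber function H_delta applied to ||diff||.

    Inside the quadratic region (||diff|| <= delta) the gradient is
    diff itself; outside, it is clipped to norm delta, which bounds
    the influence any single (possibly Byzantine) worker can exert.
    """
    norm = np.linalg.norm(diff)
    if norm <= delta:
        return diff
    return delta * diff / norm

def robust_server_step(x_global, worker_models, delta=1.0, lr=0.05):
    """One Huber-based descent step on the global model: the server
    descends sum_i H_delta(||x - x_i||) over the received local models."""
    grad = sum(huber_grad(x_global - w, delta) for w in worker_models)
    return x_global - lr * grad

def local_updates(x_global, grad_fn, lam=0.5, lr=0.05, num_steps=5):
    """Multiple local update steps between communication rounds,
    assuming the proximal personalization objective named above."""
    x = x_global.copy()
    for _ in range(num_steps):
        x -= lr * (grad_fn(x) + lam * (x - x_global))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.zeros(3)
    honest = [np.ones(3) + rng.normal(0.0, 0.1, 3) for _ in range(9)]
    byzantine = [np.full(3, 1e6)]  # one worker sends an arbitrarily bad model
    for _ in range(200):
        x = robust_server_step(x, honest + byzantine)
    print(x)  # stays near the honest consensus (~1.0), not near 1e6
```

In the demo, the Byzantine worker's contribution to the descent direction is capped at norm `delta` per round, so the global model settles near the honest consensus instead of being dragged toward the outlier; with plain averaging, a single such worker could move the global model arbitrarily far.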
