Efficient personalized federated learning on selective model training

Yeting Guo (College of Computer, National University of Defense Technology); Fang Liu (Hunan University); Tongqing Zhou (National University of Defense Technology); Zhiping Cai (NUDT); Nong Xiao (N)

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
07 Jun 2023

Personalized Federated Learning (FL) handles the data heterogeneity problem by tailoring local models to each distributed data owner. Previous studies first train a highly adaptable global model and then transfer it for personalization. However, the additional training aggravates the burden on resource-limited end devices. Training a personalized local sub-network is a promising, efficient solution; it normally prunes the global model by the parameters' scalar magnitude. In this paper, we find that the vector magnitude, i.e., parameter stability, can further promote personalized FL. Driven by local data characteristics, the values of some model parameters hardly change across updates, yet they consume the same resources as the changing ones. We therefore propose Star-PFL, a STability-AwaRe algorithm for efficient FL Personalization. In Star-PFL, the data owner focuses on training the non-stabilized parameters and reduces the resources wasted on the stabilized ones. Experimental results on two real-world biomedical datasets demonstrate that Star-PFL improves accuracy (by 3.1%) and reduces resource costs (communication by 36.3%, computation by 18.3%) compared with five typical baselines. The code is available at https://github.com/Guoyeting/Star-PFL.
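
The stability criterion described above can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the authors' implementation (see the linked repository for that); the names STABILITY_THRESHOLD, STABLE_ROUNDS, update_stability, and trainable_mask are all hypothetical. It tracks how much each parameter changes per round and freezes parameters whose updates stay tiny for several consecutive rounds.

    import numpy as np

    # Hypothetical sketch of stability-aware selection: freeze parameters
    # whose values barely change across training rounds, so that local
    # effort focuses on the "non-stabilized" parameters.
    STABILITY_THRESHOLD = 1e-4   # assumed: max per-round change to count as stable
    STABLE_ROUNDS = 3            # assumed: consecutive stable rounds before freezing

    def update_stability(prev_params, new_params, stable_counts):
        """Increment a per-parameter counter while its update stays tiny;
        reset the counter as soon as the parameter moves noticeably."""
        delta = np.abs(new_params - prev_params)
        return np.where(delta < STABILITY_THRESHOLD, stable_counts + 1, 0)

    def trainable_mask(stable_counts):
        """Mask of parameters that should still be trained (not yet frozen)."""
        return stable_counts < STABLE_ROUNDS

    # Toy usage: a flat parameter vector evolving over training rounds.
    rng = np.random.default_rng(0)
    params = rng.normal(size=8)
    stable_counts = np.zeros_like(params)

    for _ in range(5):
        # Simulated local update: the first half of the parameters receive
        # real gradients; the second half has effectively converged.
        grads = np.concatenate([rng.normal(scale=0.1, size=4), np.zeros(4)])
        mask = trainable_mask(stable_counts)
        new_params = params - 0.5 * grads * mask   # skip frozen parameters
        stable_counts = update_stability(params, new_params, stable_counts)
        params = new_params

    print("still trainable:", trainable_mask(stable_counts))
    # Only the first four parameters remain trainable; the converged ones
    # are frozen, so their updates need not be computed or communicated.

In an FL deployment, such a mask would shrink both the local computation and the model update sent to the server, which is the kind of saving on stabilized parameters the abstract reports.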
