Poster 11 Oct 2023

Federated learning (FL) enables collaborative model training across clients while preserving data privacy. However, FL faces the challenge of data heterogeneity: local models become biased and drift away from the global model during optimization, an issue from which existing FL algorithms such as Federated Averaging (FedAvg) suffer. To address this problem, we propose a novel approach called multi-branch prototype federated learning (FedMBP). FedMBP creates auxiliary branches within each local model to integrate different levels of local and global prototypes, preventing local model drift by aligning local prototypes with global ones. We also introduce a mixed cross-entropy loss on the auxiliary branches to effectively transfer global prototype knowledge to local models. We conduct experiments on three publicly available datasets spanning natural and medical image domains. The experiments demonstrate that FedMBP outperforms existing FL algorithms, achieving superior model performance.
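The abstract describes aligning local class prototypes with global ones and applying a mixed cross-entropy on auxiliary branches. Below is a minimal PyTorch sketch of what such losses could look like, assuming prototypes are per-class mean feature embeddings; the function names, the distance metric, and the mixing weight lam are illustrative assumptions, not the paper's exact formulation.

# Minimal sketch of prototype alignment and a mixed cross-entropy term.
# Assumption: prototypes are per-class mean embeddings; the mixing rule
# and mse alignment below are illustrative, not FedMBP's exact losses.
import torch
import torch.nn.functional as F

def class_prototypes(features, labels, num_classes):
    """Mean feature embedding per class present in the batch."""
    protos = torch.zeros(num_classes, features.size(1), device=features.device)
    for c in labels.unique():
        protos[c] = features[labels == c].mean(dim=0)
    return protos

def prototype_alignment_loss(local_protos, global_protos, labels):
    """Pull local class prototypes toward the server's global prototypes."""
    classes = labels.unique()
    return F.mse_loss(local_protos[classes], global_protos[classes])

def mixed_cross_entropy(branch_logits, proto_logits, labels, lam=0.5):
    """Blend an auxiliary branch's logits with prototype-based logits
    before applying cross-entropy (illustrative mixing rule)."""
    mixed = lam * branch_logits + (1.0 - lam) * proto_logits
    return F.cross_entropy(mixed, labels)

# Hypothetical local update step, assuming a model with a backbone,
# a main head, and one auxiliary branch head:
# feats = model.backbone(x)
# local_protos = class_prototypes(feats, y, num_classes)
# proto_logits = -torch.cdist(feats, global_protos)  # nearest-prototype scores
# loss = (F.cross_entropy(model.head(feats), y)
#         + prototype_alignment_loss(local_protos, global_protos, y)
#         + mixed_cross_entropy(model.aux_head(feats), proto_logits, y))

In this reading, the alignment term discourages local drift during client updates, while the mixed cross-entropy injects global prototype knowledge into the auxiliary branches; both are stated only at the level of detail the abstract provides.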
