12 May 2022

In contrast to the commonly studied setting, in which clients' data are stored centrally at their institutions, implicitly neglecting clients' data privacy, we study cross-silo federated learning in a preferable setting that keeps private data on the clients. The global model is trained with a three-layer structure: institutions aggregate model updates from their clients for several rounds before sending their aggregated updates to the central server. In this context, we mathematically prove that the number of clients' local training epochs affects global model performance, and we therefore propose a new approach, Tempo, which adaptively tunes each client's epoch number during training. Our evaluation, conducted under real network environments, shows that Tempo not only improves global model accuracy and communication efficiency but also shortens the actual training time.
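To make the three-layer structure concrete, here is a minimal sketch of such a training loop in Python. It uses a toy quadratic objective in place of real model training, and the epoch-tuning rule is a hypothetical placeholder, since the abstract does not specify Tempo's actual policy; all names (local_train, average, the drift threshold) are illustrative assumptions.

```python
# Sketch of hierarchical (three-layer) federated averaging with an
# adaptive per-client epoch count. Toy setup: each "model" is a vector,
# each client's "data" is the optimum of a quadratic loss.
import numpy as np

rng = np.random.default_rng(0)
DIM = 10

def local_train(model, data_mean, epochs, lr=0.1):
    """Run `epochs` local gradient steps on 0.5 * ||model - data_mean||^2."""
    for _ in range(epochs):
        model = model - lr * (model - data_mean)
    return model

def average(models):
    """Plain FedAvg-style aggregation (unweighted mean)."""
    return np.mean(models, axis=0)

# Three layers: central server -> institutions (silos) -> clients.
num_silos, clients_per_silo = 3, 4
silos = [[rng.normal(size=DIM) for _ in range(clients_per_silo)]
         for _ in range(num_silos)]
global_model = np.zeros(DIM)
# Adaptive state: current local epoch count per client.
epochs = {(s, c): 5 for s in range(num_silos) for c in range(clients_per_silo)}

for global_round in range(10):
    silo_models = []
    for s, client_data in enumerate(silos):
        silo_model = global_model.copy()
        for edge_round in range(2):  # institutions aggregate for several rounds
            updates = []
            for c, data_mean in enumerate(client_data):
                local = local_train(silo_model.copy(), data_mean, epochs[(s, c)])
                updates.append(local)
                # Hypothetical tuning rule (NOT the paper's): clients whose
                # update drifts far from the silo model get fewer epochs,
                # others get more, within [1, 10].
                drift = np.linalg.norm(local - silo_model)
                epochs[(s, c)] = (max(1, epochs[(s, c)] - 1) if drift > 1.0
                                  else min(10, epochs[(s, c)] + 1))
            silo_model = average(updates)   # edge (institution) aggregation
        silo_models.append(silo_model)
    global_model = average(silo_models)     # cloud (server) aggregation
```

The key structural point, reflected above, is that clients only ever communicate with their institution, and only the institutions' aggregated models reach the central server.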
