  • SPS Members: Free
  • IEEE Members: $11.00
  • Non-members: $15.00
  • Pages/Slides: 32
15 Feb 2023

Decentralized stochastic gradient descent (SGD) is a driving engine for decentralized federated learning (DFL). The performance of decentralized SGD is jointly influenced by inter-node communications and local updates. This webinar will introduce a general DFL framework that periodically performs multiple local updates and multiple inter-node communications, striking a balance between communication efficiency and model consensus. We will first describe a general system model of DFL, then present the proposed DFL framework alongside several existing DFL learning strategies. We will establish the convergence of the proposed DFL algorithm without assuming convex objectives. Furthermore, we will introduce a compressed communication scheme built on the proposed DFL framework, named C-DFL, to improve communication efficiency. Finally, we will present experimental results on the MNIST and CIFAR-10 datasets, illustrating the superiority of DFL over traditional decentralized SGD methods and demonstrating that C-DFL further enhances communication efficiency.
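As a rough illustration of the update pattern described above, the following NumPy sketch alternates several local SGD steps with a few gossip-averaging rounds over a ring topology. All specifics (the toy quadratic objectives, the mixing matrix W, and the constants local_steps and gossip_rounds) are illustrative assumptions, not the webinar's actual algorithm.

```python
# Minimal sketch of periodic local updates + inter-node gossip, assuming a
# toy setup: node i minimizes 0.5 * ||x - t_i||^2, so the global optimum is
# the mean of the targets t_i. Names and constants are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 8, 5
local_steps, gossip_rounds, lr = 4, 2, 0.05   # H local updates, then gossip

targets = rng.normal(size=(n_nodes, dim))     # heterogeneous local objectives
x = np.zeros((n_nodes, dim))                  # one model copy per node

# Symmetric, doubly stochastic mixing matrix for a ring topology.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

for _ in range(200):
    for _ in range(local_steps):              # multiple local SGD updates
        grad = x - targets                    # gradient of each local objective
        x -= lr * grad
    for _ in range(gossip_rounds):            # multiple inter-node communications
        x = W @ x                             # one gossip-averaging step

consensus_err = np.linalg.norm(x - x.mean(axis=0), axis=1).max()
opt_err = np.linalg.norm(x.mean(axis=0) - targets.mean(axis=0))
print(f"max consensus error: {consensus_err:.2e}, optimality error: {opt_err:.2e}")
```

Increasing gossip_rounds tightens consensus at the cost of more communication, while increasing local_steps does the reverse, which is the trade-off the framework targets. In C-DFL, the gossip exchange would carry compressed (e.g., quantized or sparsified) models rather than full ones; that refinement is omitted from this sketch.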
