Fully Distributed Federated Learning with Efficient Local Cooperations
Evangelos Georgatos (Computer Engineering and Informatics Dept., University of Patras, Greece); Christos Mavrokefalidis (Computer Engineering and Informatics Dept., University of Patras, Greece); Kostas Berberidis (University of Patras, Greece)
Recently, a shift has been observed towards so-called edge machine learning, which allows multiple devices with local computational and storage resources to collaborate with the assistance of a centralized server. The well-known federated learning approach is able to exploit such architectures by allowing only model parameters to be exchanged with the server, while keeping the datasets private to each contributing device. In this work, we propose a communication-efficient, fully distributed, diffusion-based learning algorithm that does not require a parameter server, together with an adaptive combination rule for the cooperation of the devices. The efficacy of the proposed algorithm is demonstrated on a classification task over the MNIST dataset under non-IID data scenarios.
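To illustrate the general structure of diffusion-based learning without a parameter server, the following minimal sketch shows an adapt-then-combine scheme over a peer-to-peer graph with a simple distance-based adaptive combination rule. The model (logistic regression), synthetic non-IID data, graph topology, and weighting scheme are all illustrative assumptions and do not reproduce the paper's actual algorithm, combination rule, or MNIST experiments.

```python
# Minimal sketch (not the authors' implementation): adapt-then-combine (ATC)
# diffusion learning over a peer-to-peer graph with an illustrative
# adaptive combination rule. All modelling choices here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

K, d = 5, 10                      # number of devices, feature dimension
A = np.array([[1, 1, 0, 0, 1],    # symmetric adjacency with self-loops
              [1, 1, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 0, 1, 1, 1],
              [1, 0, 0, 1, 1]], dtype=float)

# Non-IID local datasets: each device sees a differently shifted slice.
true_w = rng.standard_normal(d)
data = []
for k in range(K):
    X = rng.standard_normal((200, d)) + 0.5 * k   # device-specific shift
    y = (X @ true_w + 0.1 * rng.standard_normal(200) > 0).astype(float)
    data.append((X, y))

def grad(w, X, y):
    """Gradient of the logistic loss at w on the local batch."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

W = np.zeros((K, d))              # one parameter vector per device
mu = 0.1                          # local step size

for it in range(200):
    # Adapt: each device takes a gradient step on its own data only.
    psi = np.array([W[k] - mu * grad(W[k], *data[k]) for k in range(K)])

    # Combine: weights favour neighbours whose intermediate estimates are
    # close to the device's own (one possible adaptive rule; the paper
    # proposes its own combination rule).
    for k in range(K):
        nbrs = np.nonzero(A[k])[0]
        dist = np.array([np.linalg.norm(psi[k] - psi[j]) for j in nbrs])
        c = np.exp(-dist)          # closer neighbours get larger weight
        c /= c.sum()
        W[k] = c @ psi[nbrs]

print("Mean distance to true weights:",
      np.mean(np.linalg.norm(W - true_w, axis=1)))
```

In this sketch, devices never communicate with a central server; each one only exchanges its intermediate estimate with its graph neighbours, which is the communication pattern that fully distributed diffusion strategies rely on.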