VARIANCE REDUCTION-BOOSTED BYZANTINE ROBUSTNESS IN DECENTRALIZED STOCHASTIC OPTIMIZATION
Jie Peng, Qing Ling, Weiyu Li
We consider the Byzantine-robust decentralized stochastic optimization problem, where every agent periodically communicates with its neighbors to exchange local models and then updates its own local model by stochastic gradient descent. However, an unknown number of the agents are Byzantine and behave adversarially during the optimization process. Few works have considered this challenging scenario, and an existing method termed DECEMBER is unable to simultaneously achieve a linear convergence speed and a small learning error due to the stochastic noise. To eliminate the negative effect of the stochastic noise, we introduce two variance reduction methods, the stochastic average gradient algorithm (SAGA) and the loopless stochastic variance-reduced gradient (LSVRG) method, into Byzantine-robust decentralized stochastic optimization. The two resulting methods, DECEMBER-SAGA and DECEMBER-LSVRG, enjoy both linear convergence speeds and small learning errors. Numerical experiments demonstrate their effectiveness.
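To make the idea concrete, below is a minimal sketch of how a SAGA-style variance-reduced gradient estimator can be combined with a robust aggregation step at a single honest agent. It is not the paper's DECEMBER-SAGA update; the least-squares loss, the coordinate-wise trimmed mean (used here as a generic stand-in for a robust aggregation rule), and the names `saga_gradient`, `trimmed_mean`, and `local_step` are illustrative assumptions.

```python
import numpy as np

def saga_gradient(x, data, targets, grad_table, rng):
    """SAGA variance-reduced gradient estimator for a least-squares loss.

    Draws one sample, combines its fresh gradient with the stored gradient
    table, and updates that sample's table entry in place.
    """
    n = len(targets)
    i = rng.integers(n)
    # Fresh stochastic gradient of the i-th loss term at the current model x.
    g_new = data[i] * (data[i] @ x - targets[i])
    # Variance-reduced estimate: fresh term minus stale term plus table average.
    estimate = g_new - grad_table[i] + grad_table.mean(axis=0)
    grad_table[i] = g_new
    return estimate

def trimmed_mean(models, trim):
    """Coordinate-wise trimmed mean over received models (a generic robust
    aggregator used as a stand-in, not the paper's aggregation rule)."""
    stacked = np.sort(np.stack(models), axis=0)
    return stacked[trim:len(models) - trim].mean(axis=0)

def local_step(x, neighbor_models, data, targets, grad_table, lr, trim, rng):
    """One honest agent's iteration: robust aggregation of neighbor models,
    then a descent step along the SAGA estimator of the local gradient."""
    aggregated = trimmed_mean([x] + neighbor_models, trim)
    return aggregated - lr * saga_gradient(aggregated, data, targets, grad_table, rng)

# Toy usage with synthetic local data and fixed (possibly corrupted) neighbor
# messages; dimensions and step size are hypothetical.
rng = np.random.default_rng(0)
d, n = 5, 20
data = rng.normal(size=(n, d))
targets = data @ rng.normal(size=d) + 0.01 * rng.normal(size=n)
x = np.zeros(d)
grad_table = np.zeros((n, d))
neighbors = [rng.normal(size=d) for _ in range(4)]  # stand-in neighbor models
for _ in range(100):
    x = local_step(x, neighbors, data, targets, grad_table, lr=0.05, trim=1, rng=rng)
```

The point of the variance-reduced estimator is that, as the gradient table converges, the stochastic noise in each step vanishes, which is what allows a linear convergence rate without the learning-error floor that plain stochastic gradients incur.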