Byzantine-Resilient Distributed Finite-Sum Optimization Over Networks
Zhaoxian Wu, Qing Ling, Tianyi Chen, Georgios B. Giannakis
Length: 14:57
This contribution deals with distributed finite-sum optimization for learning over networks in the presence of malicious Byzantine attacks. To cope with such attacks, existing resilient approaches combine stochastic gradient descent (SGD) with various robust aggregation rules. However, the sizeable SGD-induced gradient noise makes it challenging to distinguish malicious messages sent by the Byzantine attackers from the noisy stochastic gradients sent by the friendly workers. This motivates reducing the gradient noise as a means of robustifying SGD in the presence of Byzantine attacks. To this end, the present work puts forth a Byzantine-attack-resilient distributed (Byrd-) SAGA approach for learning tasks involving finite-sum optimization over networks. Rather than the mean employed by distributed SAGA, the novel Byrd-SAGA relies on the geometric median to aggregate the corrected stochastic gradients sent by the workers. When fewer than half of the workers are Byzantine attackers, the robustness of the geometric median to outliers enables Byrd-SAGA to achieve provable linear convergence to a neighborhood of the optimal solution, with the size of the neighborhood determined by the number of Byzantine workers. Numerical tests demonstrate the robustness of Byrd-SAGA to various Byzantine attacks, as well as its merits over Byzantine-resilient SGD.
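To make the mechanism described above concrete, here is a minimal, self-contained Python sketch of the two ingredients the abstract names: SAGA-style variance-reduced gradients computed by each worker, and geometric-median aggregation at the server (approximated with Weiszfeld's algorithm). The toy least-squares problem, the `Worker` class, the sign-flip attack, the smoothing constant, and the step size are all illustrative assumptions, not taken from the paper.

```python
import numpy as np
from dataclasses import dataclass, field

def geometric_median(points, max_iter=100, tol=1e-6):
    """Approximate the geometric median of the row vectors in `points`
    via Weiszfeld's algorithm (smoothed to avoid division by zero)."""
    z = points.mean(axis=0)                    # start from the plain mean
    for _ in range(max_iter):
        dists = np.maximum(np.linalg.norm(points - z, axis=1), 1e-12)
        weights = 1.0 / dists                  # far-away (outlier) points get small weight
        z_new = (weights[:, None] * points).sum(axis=0) / weights.sum()
        if np.linalg.norm(z_new - z) < tol:
            break
        z = z_new
    return z

@dataclass
class Worker:
    """Illustrative worker holding local least-squares data and a SAGA gradient table."""
    A: np.ndarray                              # local features, shape (n_i, d)
    b: np.ndarray                              # local targets, shape (n_i,)
    grads: np.ndarray = field(init=False)      # stored per-sample gradients

    def __post_init__(self):
        self.grads = np.zeros_like(self.A)

    def grad(self, x, i):
        # gradient of the i-th local term 0.5 * (a_i^T x - b_i)^2
        return (self.A[i] @ x - self.b[i]) * self.A[i]

def byrd_saga_step(x, workers, lr, rng, byzantine=()):
    """One iteration: workers send SAGA-corrected stochastic gradients;
    the server aggregates with the geometric median instead of the mean."""
    msgs = []
    for m, w in enumerate(workers):
        i = rng.integers(w.A.shape[0])         # sample one local function
        g_new = w.grad(x, i)
        # SAGA correction: fresh gradient, minus the stale stored one,
        # plus the table average -- this shrinks the gradient noise
        v = g_new - w.grads[i] + w.grads.mean(axis=0)
        w.grads[i] = g_new                     # refresh the gradient table
        if m in byzantine:                     # a simple illustrative sign-flip attack
            v = -10.0 * v
        msgs.append(v)
    return x - lr * geometric_median(np.stack(msgs))

# Toy run: 9 workers, 2 of them Byzantine (fewer than half).
rng = np.random.default_rng(0)
d, x_true = 5, np.ones(5)
workers = []
for _ in range(9):
    A = rng.normal(size=(20, d))
    workers.append(Worker(A, A @ x_true))
x = np.zeros(d)
for _ in range(3000):
    x = byrd_saga_step(x, workers, lr=0.01, rng=rng, byzantine={0, 1})
print("distance to optimum:", np.linalg.norm(x - x_true))
```

Note the design choice the sketch highlights: if `geometric_median` were replaced by a plain `np.mean`, a single Byzantine worker could drag the aggregate arbitrarily far, whereas the geometric median remains bounded by the honest majority of messages.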