
Accelerated Distributed Stochastic Non-Convex Optimization over Time-Varying Directed Networks

Yiyue Chen (University of Texas at Austin); Abolfazl Hashemi (Purdue University); Haris Vikalo (University of Texas at Austin)

07 Jun 2023

We study non-convex optimization problems in which the data is distributed across the nodes of a time-varying directed network, a setting that models dynamic scenarios where communication between nodes is affected by delays or link failures. The network nodes, which can access only their local objectives and query a stochastic first-order oracle for gradient estimates, collaborate by exchanging messages with their neighbors to minimize a global objective function. We propose an algorithm for non-convex optimization in such settings that leverages stochastic gradient descent with momentum and gradient tracking. We further prove, by analyzing dynamic network systems with gradient acceleration, that the oracle complexity of the proposed algorithm is $\mathcal{O}(1/\epsilon^{1.5})$. The results demonstrate superior performance of the proposed framework compared to state-of-the-art methods on a variety of machine learning tasks.
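
The abstract does not spell out the update rules. The snippet below is a minimal NumPy sketch, not the authors' algorithm, of how the named ingredients are commonly combined: momentum-smoothed stochastic gradients, a gradient tracker, and push-sum-style column-stochastic mixing over a time-varying directed graph. The quadratic local objectives, the network model, and all parameter values (`alpha`, `beta`, the noise level) are illustrative assumptions.

```python
# Minimal sketch (assumed details, not the paper's exact method): gradient
# tracking with momentum-smoothed stochastic gradients and push-sum-style
# column-stochastic mixing over a time-varying directed graph.
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 5                       # number of nodes, problem dimension
b = rng.normal(size=(n, d))       # local targets; the global optimum is b.mean(0)
alpha, beta, T = 0.05, 0.8, 2000  # step size, momentum, number of iterations

def stoch_grad(i, x):
    """Stochastic gradient of the local quadratic f_i(x) = 0.5*||x - b_i||^2."""
    return (x - b[i]) + 0.1 * rng.normal(size=d)

def mixing_matrix():
    """Column-stochastic matrix of a directed ring plus one random out-edge per node."""
    A = np.eye(n)
    for i in range(n):
        A[(i + 1) % n, i] = 1.0          # directed ring edge i -> i+1
        A[rng.integers(n), i] = 1.0      # extra random out-edge (time-varying part)
    return A / A.sum(axis=0, keepdims=True)

x = np.zeros((n, d))                                   # push-sum numerators
w = np.ones(n)                                         # push-sum weights
z = x.copy()                                           # de-biased iterates z_i = x_i / w_i
v = np.array([stoch_grad(i, z[i]) for i in range(n)])  # momentum buffers
y = v.copy()                                           # gradient trackers

for t in range(T):
    A = mixing_matrix()                # new directed topology at every iteration
    x = A @ (x - alpha * y)            # descent step mixed over in-neighbors
    w = A @ w                          # push-sum weight update
    z = x / w[:, None]                 # correct the bias induced by directed mixing
    v_new = np.array([beta * v[i] + (1 - beta) * stoch_grad(i, z[i])
                      for i in range(n)])
    y = A @ y + v_new - v              # tracker: y_i follows the network-wide average gradient
    v = v_new

print("distance of average iterate to optimum:", np.linalg.norm(z.mean(0) - b.mean(0)))
```

In this toy setting, each node's de-biased iterate approaches the minimizer of the average objective (the mean of the local targets), up to a noise floor governed by the stochastic-gradient variance and the momentum parameter; the actual algorithm and its $\mathcal{O}(1/\epsilon^{1.5})$ oracle complexity are established in the paper.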
