04 May 2020

The log-determinant optimization problem with general matrix constraints arises in many applications. The log-determinant term hampers the scalability of existing methods. This paper proposes a highly efficient stochastic method with time complexity O(N^2), whereas existing methods require O(N^3). To achieve quadratic complexity, the proposed algorithm leverages an efficient stochastic gradient of the augmented Lagrangian and relies on a subgradient descent method. Convergence of this method is analyzed both theoretically and empirically. The resulting primal-dual stochastic subgradient method attains the same accuracy as existing methods while requiring only O(N^2) operations.
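
The abstract does not reproduce the paper's exact formulation, so the following is a minimal sketch of a primal-dual stochastic subgradient iteration for a generic log-determinant program: minimize tr(CX) - log det X subject to tr(A_i X) = b_i with X symmetric positive definite. The Hutchinson probe estimator and the inner conjugate-gradient solver below stand in for the paper's "efficient stochastic gradient of the augmented Lagrangian"; every function name and parameter here is illustrative, not the authors' implementation.

import numpy as np

def cg_solve(M, z, iters=20):
    """Conjugate gradient for M x = z using only O(N^2) mat-vec products."""
    x = np.zeros_like(z)
    r = z.copy()          # residual
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Mp = M @ p
        alpha = rs / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) < 1e-10:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def stochastic_inv_estimate(X, num_probes=4, cg_iters=20):
    """Hutchinson-style estimate of X^{-1}: E[v z^T] = X^{-1} when X v = z
    and z has i.i.d. +-1 entries. Each probe costs O(N^2) via CG."""
    n = X.shape[0]
    est = np.zeros_like(X)
    for _ in range(num_probes):
        z = np.random.choice([-1.0, 1.0], size=n)
        v = cg_solve(X, z, iters=cg_iters)
        est += np.outer(v, z)
    est /= num_probes
    return 0.5 * (est + est.T)   # symmetrize the estimator

def primal_dual_logdet(C, A_list, b, rho=1.0, lr=1e-2, lr_dual=1e-2, steps=500):
    """Stochastic subgradient descent on the augmented Lagrangian
    L(X, y) = tr(CX) - log det X + sum_i y_i r_i + (rho/2) sum_i r_i^2,
    with r_i = tr(A_i X) - b_i, interleaved with dual ascent on y.
    Assumes symmetric A_i and step sizes small enough that X stays
    positive definite (required by the CG solves)."""
    n = C.shape[0]
    X = np.eye(n)
    y = np.zeros(len(A_list))
    for _ in range(steps):
        # tr(A_i X) computed elementwise in O(N^2), never via A_i @ X
        r = np.array([np.sum(Ai * X) - bi for Ai, bi in zip(A_list, b)])
        # Stochastic gradient: the exact X^{-1} term (O(N^3)) is replaced
        # by the O(N^2) Hutchinson/CG estimate.
        grad = C - stochastic_inv_estimate(X) + sum(
            (yi + rho * ri) * Ai for yi, ri, Ai in zip(y, r, A_list))
        X = X - lr * 0.5 * (grad + grad.T)   # primal descent, keep symmetric
        y = y + lr_dual * r                   # dual (sub)gradient ascent
    return X, y

Each iteration touches X only through matrix-vector products, elementwise traces, and rank-one probe updates, so the per-step cost is O(N^2) up to the number of probes and CG iterations, which is how a quadratic complexity claim of this kind can be realized; a complete implementation would also need a scalable safeguard to keep the iterates positive definite.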
