Primal-Dual Stochastic Subgradient Method For Log-Determinant Optimization
Songwei Wu, Hang Yu, Justin Dauwels
SPS
The log-determinant optimization problem with general matrix constraints arises in many applications, but the log-determinant term hampers the scalability of existing methods. This paper proposes a highly efficient stochastic method with time complexity O(N^2), whereas existing methods require O(N^3). To achieve quadratic complexity, the proposed algorithm leverages an efficient stochastic gradient of the augmented Lagrangian and a subgradient descent method. Convergence is analyzed both theoretically and empirically. The resulting primal-dual stochastic subgradient method matches the accuracy of existing methods while requiring only O(N^2) operations.
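The abstract does not spell out how the stochastic gradient is formed, so the sketch below illustrates one standard way (not necessarily the paper's construction) to estimate the gradient of the log-determinant term at O(N^2) cost per probe: since d/dX log det(X) = X^{-1} for symmetric positive definite X, and E[z z^T] = I for standard Gaussian probes z, one can estimate X^{-1} as an average of (X^{-1} z) z^T, computing each solve X^{-1} z with conjugate gradient, which needs only matrix-vector products. The function names and probe count are illustrative choices, not from the paper.

```python
import numpy as np

def cg_solve(A, b, tol=1e-10, max_iter=200):
    """Conjugate gradient solve A x = b for symmetric positive definite A.

    Each iteration costs one matrix-vector product, i.e. O(N^2)."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def stochastic_logdet_grad(X, num_probes=10, rng=None):
    """Unbiased stochastic estimate of d/dX log det(X) = X^{-1}.

    Uses Gaussian probes z with E[z z^T] = I, so that
    X^{-1} = E[(X^{-1} z) z^T]; each probe costs O(N^2) via CG,
    avoiding the O(N^3) explicit inverse."""
    rng = np.random.default_rng() if rng is None else rng
    n = X.shape[0]
    G = np.zeros_like(X)
    for _ in range(num_probes):
        z = rng.standard_normal(n)
        y = cg_solve(X, z)          # y = X^{-1} z via matrix-vector products
        G += np.outer(y, z)
    G /= num_probes
    return 0.5 * (G + G.T)          # symmetrize to reduce variance
```

Averaging over more probes trades extra O(N^2) work for lower gradient variance; a small number of probes per iteration is typically enough for a stochastic subgradient scheme to converge in expectation.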