Revisit Of Estimate Sequence For Accelerated Gradient Method

Bingcong Li, Mario Coutino, Georgios B. Giannakis

Length: 13:59
04 May 2020

In this paper, we revisit the problem of minimizing a convex function $f(\mathbf{x})$ with Lipschitz continuous gradient via accelerated gradient methods (AGM). To do so, we consider the so-called estimate sequence (ES), a useful analysis tool for establishing the convergence of AGM. We develop a generalized ES that supports a gradient that is Lipschitz continuous w.r.t. \textit{any} norm, given the importance of non-Euclidean norms in optimization. Traditionally, ES consists of a sequence of quadratic functions that serve as surrogates of $f(\mathbf{x})$. However, such quadratic functions preclude supporting Lipschitz continuous gradients defined w.r.t. non-Euclidean norms; hence, an extension of this powerful tool to the non-Euclidean setting is much needed. This extension is accomplished through a \textit{simple} yet nontrivial modification of the standard ES. Further, our analysis provides insight into how acceleration is achieved and lends interpretability to the parameters involved in ES. Finally, numerical tests demonstrate the convergence benefits of taking non-Euclidean norms into account.
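The paper's generalized, non-Euclidean ES is not spelled out on this page, but the classical Euclidean AGM it builds on can be sketched in a few lines. Below is a minimal sketch of Nesterov-style constant-step acceleration on a hypothetical least-squares instance; the objective, problem sizes, and the helper name agm are illustrative assumptions, not the paper's method.

```python
import numpy as np

def agm(grad, L, x0, iters=200):
    """Constant-step accelerated gradient sketch for an L-smooth convex f:
    a gradient step from an extrapolated point, then momentum extrapolation."""
    x = y = x0.copy()
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                         # gradient step at the extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t**2)) # momentum weight update
        y = x_next + ((t - 1.0) / t_next) * (x_next - x) # extrapolation (momentum)
        x, t = x_next, t_next
    return x

# Hypothetical test problem: f(x) = 0.5 * ||A x - b||_2^2
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient (spectral norm squared)
x_hat = agm(grad, L, np.zeros(10))
```

In the ES view, each iterate is paired with a quadratic surrogate whose minimizer drives the extrapolation step; the paper's contribution is replacing these quadratics so the same mechanism works when smoothness is measured in a non-Euclidean norm.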
