Length: 00:13:44
10 Jun 2021

Learning a suitable graph is an important precursor to many graph signal processing (GSP) pipelines, such as graph spectral signal compression and denoising. Previous graph learning algorithms either i) make no assumptions on connectivity beyond graph sparsity, or ii) make simple assumptions on graph edges, such as positive edges only. In this paper, given an empirical covariance matrix computed from data as input, we consider a structural assumption on the precision matrix directly: the first K eigenvectors of the precision matrix P are pre-selected or optimized based on domain-specific criteria such as computational constraints, and the remaining eigenvectors are then learned from data. We first prove that the set of real positive semi-definite matrices with the first K eigenvectors fixed to {u_k} in a defined Hilbert space is a convex cone. We then construct an efficient projection operator, inspired by the Gram-Schmidt procedure, to project a given matrix P onto the defined convex cone. Finally, we design an efficient hybrid graphical lasso/projection algorithm to compute the most suitable precision matrix P given an empirical covariance matrix. Experimental results show that, given the first K eigenvectors as a prior, our algorithm noticeably outperforms competing graph learning schemes on a variety of graph comparison metrics.
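The abstract does not spell out the projection operator, but a minimal numpy sketch can illustrate one way such a projection could work: project a symmetric matrix, in Frobenius norm, onto the cone of PSD matrices whose first K eigenvectors are fixed to given orthonormal vectors. The function name project_to_cone, the QR-based basis completion (a numerical stand-in for Gram-Schmidt), and the eigenvalue clipping below are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def project_to_cone(A, U):
    """Project symmetric A (Frobenius norm) onto the cone of PSD matrices
    whose first K eigenvectors are the columns of U (assumed orthonormal).
    Hypothetical sketch; not necessarily the paper's exact operator."""
    n, K = U.shape
    # Complete U to a full orthonormal basis of R^n; the trailing columns B
    # span the orthogonal complement (QR acts as Gram-Schmidt here).
    Q, _ = np.linalg.qr(np.hstack([U, np.eye(n)]))
    B = Q[:, K:]
    # Eigenvalues along the fixed directions, clipped at zero to stay PSD.
    lam = np.clip(np.einsum('ik,ij,jk->k', U, A, U), 0.0, None)
    # Project the block acting on the complement onto the PSD cone by
    # eigendecomposing and clipping negative eigenvalues.
    M = B.T @ A @ B
    w, V = np.linalg.eigh((M + M.T) / 2)
    M_psd = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    return (U * lam) @ U.T + B @ M_psd @ B.T

# Toy usage: fix the normalized constant vector as the first eigenvector
# (a common graph-Laplacian prior) and project a noisy symmetric matrix.
rng = np.random.default_rng(0)
n = 5
u1 = np.ones((n, 1)) / np.sqrt(n)
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
P = project_to_cone(A, u1)
print(np.allclose(P @ u1, (u1.T @ P @ u1) * u1))  # True: u1 is an eigenvector of P
```

Because the Frobenius inner product separates over the fixed-eigenvector directions and their orthogonal complement, the projection decomposes into a per-direction clip and a standard PSD projection on the complement block, which is what the sketch exploits.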

Chairs:
Elvin Isufi
