28 Oct 2020

Modern regularizers for image restoration are mostly nonquadratic and nonsmooth. They have been intensely researched, and their capacity for promoting sparsity has been successfully exploited. In particular, nonquadratic regularizers are known to perform better than classical quadratic regularizers. However, in this work, we propose a quadratic regularizer of the form x^T Q x whose restoration capacity is superior to total-variation and Hessian regularization. The catch is that, unlike classical regularization (e.g., Tikhonov), the matrix Q is data-driven: it is computed from the observed image via a kernel (affinity) matrix. For linear restoration problems with a quadratic data-fidelity term (e.g., super-resolution and deconvolution), the overall optimization reduces to solving a linear system, which can be done efficiently using conjugate gradient. The attractive aspect is that we avoid the inner iterations required by total-variation and Hessian regularization. In a sense, the proposed regularizer combines the computational efficiency of quadratic regularizers with the restoration (image modeling) power of nonquadratic regularizers.
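The abstract does not spell out how Q is built or how the linear system is solved, so the following is only a minimal sketch of the general idea under stated assumptions: the affinity is taken as a Gaussian kernel on the observed intensities over a small spatial window, Q is taken as the resulting graph Laplacian D - W, and the data term is a Gaussian-blur deconvolution so that the normal equations (A^T A + lam * Q) x = A^T y can be solved with conjugate gradient. The function names (kernel_laplacian, restore) and parameters (h, radius, lam, blur_sigma) are illustrative and not from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.sparse import lil_matrix, diags
from scipy.sparse.linalg import cg, LinearOperator


def kernel_laplacian(y, h=0.1, radius=2):
    """Assumed construction: Q = D - W from Gaussian affinities on the observed image y.

    w_pq = exp(-(y_p - y_q)^2 / h^2) for pixels q in a (2*radius+1)^2 spatial window
    around p. Q is symmetric positive semidefinite, so x^T Q x is a valid quadratic
    penalty that discourages differences between pixels the observed image deems similar.
    """
    H, W = y.shape
    n = H * W
    Wmat = lil_matrix((n, n))
    for i in range(H):
        for j in range(W):
            p = i * W + j
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    ii, jj = i + di, j + dj
                    if (di, dj) != (0, 0) and 0 <= ii < H and 0 <= jj < W:
                        Wmat[p, ii * W + jj] = np.exp(-(y[i, j] - y[ii, jj]) ** 2 / h ** 2)
    Wmat = Wmat.tocsr()
    deg = np.asarray(Wmat.sum(axis=1)).ravel()
    return diags(deg) - Wmat


def restore(y, Q, lam=0.05, blur_sigma=1.5, maxiter=200):
    """Deconvolution sketch: solve (A^T A + lam * Q) x = A^T y with conjugate gradient.

    A is a Gaussian blur; with a symmetric kernel it is treated as self-adjoint here,
    so A^T A x is approximated by A(A(x)). The normal-equation matrix is never formed
    explicitly; CG only needs matrix-vector products.
    """
    shape, n = y.shape, y.size

    def A(x):
        return gaussian_filter(x.reshape(shape), blur_sigma).ravel()

    normal_eq = LinearOperator((n, n), matvec=lambda x: A(A(x)) + lam * (Q @ x))
    x, _ = cg(normal_eq, A(y.ravel()), maxiter=maxiter)
    return x.reshape(shape)


if __name__ == "__main__":
    # Toy example: blur and noise a small synthetic image, then restore it.
    rng = np.random.default_rng(0)
    clean = np.zeros((32, 32))
    clean[8:24, 8:24] = 1.0
    y = gaussian_filter(clean, 1.5) + 0.02 * rng.standard_normal(clean.shape)
    Q = kernel_laplacian(y, h=0.1, radius=2)   # data-driven quadratic penalty
    x_hat = restore(y, Q, lam=0.05, blur_sigma=1.5)
    print("relative error:", np.linalg.norm(x_hat - clean) / np.linalg.norm(clean))
```

Note the contrast with total-variation or Hessian regularization: because the penalty is quadratic, a single outer conjugate-gradient solve suffices, with no inner proximal or reweighting iterations.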
