
Dose-Blind Denoising with Deep Learning in Cardiac SPECT

Junchi Liu, Yongyi Yang, Miles Wernick, Hendrik Pretorius, Michael King

Length: 00:11:47
17 Oct 2022

Knowledge distillation enables us to transfer knowledge from a large, complex neural network into a smaller and faster one, improving the accuracy of the smaller network. However, directly transferring knowledge between the enormous feature maps extracted from convolutional layers is not straightforward. In this work, we propose an efficient mutual information-based approach for transferring knowledge between feature maps extracted from different networks. The proposed method employs an efficient Neural Bag-of-Features formulation to estimate the joint and marginal probabilities, and the whole pipeline is optimized in an end-to-end manner. The effectiveness of the proposed method is demonstrated using a lightweight, fully convolutional neural network architecture, which aims toward high-resolution analysis and targets photonic neural network accelerators.
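As a rough illustration of the idea described above, the sketch below shows one plausible way to couple a teacher and a student through a mutual-information loss over Neural Bag-of-Features memberships: spatial feature vectors are soft-assigned to a learnable codebook, the resulting codeword histograms are used to form a batch-wise joint distribution, and the negative mutual information is minimized. This is a minimal sketch under assumed choices (PyTorch, the hypothetical names NeuralBoF and mi_distillation_loss, 64 codewords, a naive batch-wise joint estimate), not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralBoF(nn.Module):
    """Soft-quantize spatial feature vectors against a learnable codebook,
    producing a per-image codeword membership histogram."""
    def __init__(self, feat_dim, n_codewords=64):
        super().__init__()
        self.codebook = nn.Parameter(torch.randn(n_codewords, feat_dim))
        self.scale = nn.Parameter(torch.ones(1))  # softness of the assignment

    def forward(self, fmap):
        # fmap: (B, C, H, W) -> (B, H*W, C) spatial feature vectors
        b = fmap.shape[0]
        x = fmap.flatten(2).transpose(1, 2)
        # distance of every spatial vector to every codeword -> soft memberships
        dist = torch.cdist(x, self.codebook.unsqueeze(0).expand(b, -1, -1))
        memb = F.softmax(-self.scale * dist, dim=-1)   # (B, H*W, K)
        return memb.mean(dim=1)                        # (B, K) histogram

def mi_distillation_loss(p_teacher, p_student, eps=1e-8):
    """Negative mutual information between teacher and student codeword
    distributions, with the joint estimated naively over the batch."""
    joint = (p_teacher.unsqueeze(2) * p_student.unsqueeze(1)).mean(dim=0)
    joint = joint / joint.sum().clamp_min(eps)
    p_t = joint.sum(dim=1, keepdim=True)   # teacher marginal
    p_s = joint.sum(dim=0, keepdim=True)   # student marginal
    mi = (joint * (joint.clamp_min(eps).log()
                   - (p_t * p_s).clamp_min(eps).log())).sum()
    return -mi  # maximizing MI aligns the two representations

if __name__ == "__main__":
    # Toy feature maps standing in for teacher/student conv outputs.
    t_feat = torch.randn(8, 512, 16, 16)   # large teacher network
    s_feat = torch.randn(8, 128, 16, 16)   # lightweight student network
    bof_t, bof_s = NeuralBoF(512), NeuralBoF(128)
    loss = mi_distillation_loss(bof_t(t_feat), bof_s(s_feat))
    loss.backward()
    print(float(loss))
```

Because everything from the codebook to the assignment softness is differentiable, the distillation loss can simply be added to the student's task loss and the whole pipeline trained end-to-end, as the abstract describes.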
