
Knowledge Distillation with Active Exploration and Self-attention based Inter-Class Variation Transfer For Image Segmentation

Yifan Zhang (Shenzhen University); Shaojie Li (Shenzhen University); Xuan Yang (Shenzhen University)

06 Jun 2023

Knowledge distillation (KD) aims to transfer the knowledge of a large deep neural network into a small network without loss of validity. This paper proposes a novel approach that combines active exploration and passive transfer (AEPT) with self-attention-based inter-class feature variation (AIFV) distillation for cardiac image segmentation. AEPT encourages the student model to learn features left undiscovered by the teacher model, with the aim of finding new features that outperform the teacher: we focus on regions where the teacher performs poorly and encourage the student to explore complementary features that differ from the teacher's. To improve the student's ability to distinguish between classes, the student also learns the self-attention-based feature variation between classes (AIFV). Extensive experiments on medical and public image datasets demonstrate that, by combining the two distillation strategies, our approach enables the student model to learn better representations and outperforms state-of-the-art methods.
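As a rough illustration of how the two kinds of distillation terms described above could be combined, the sketch below pairs a soft-label distillation loss that is masked where the teacher misclassifies (so the student is pushed toward the ground truth rather than the teacher in those regions) with a term that matches the student's and teacher's inter-class feature similarities. The concrete formulations here (KL soft-label distillation, error masking, cosine similarity between per-class mean features as a stand-in for the self-attention-based AIFV term) are assumptions for illustration, not the paper's exact losses.

```python
# Illustrative sketch only: the loss definitions below are assumed, not the
# paper's AEPT/AIFV formulations. PyTorch is assumed as the framework.
import torch
import torch.nn.functional as F


def exploration_kd_loss(student_logits, teacher_logits, labels, T=2.0):
    """Soft-label distillation, with an 'exploration' term on pixels the
    teacher misclassifies (assumed formulation). Logits: (B, C, H, W),
    labels: (B, H, W) long tensor at the same resolution."""
    p_t = F.softmax(teacher_logits / T, dim=1)
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    kl = F.kl_div(log_p_s, p_t, reduction="none").sum(dim=1)        # (B, H, W)

    # Where the teacher is wrong, train the student against the ground truth
    # instead of imitating the teacher.
    wrong = (teacher_logits.argmax(dim=1) != labels).float()
    ce = F.cross_entropy(student_logits, labels, reduction="none")  # (B, H, W)

    distill = ((1.0 - wrong) * kl * T * T).mean()
    explore = (wrong * ce).mean()
    return distill + explore


def inter_class_variation_loss(student_feats, teacher_feats, labels, num_classes):
    """Inter-class variation transfer: match pairwise cosine similarities
    between per-class mean features of student and teacher (assumed stand-in
    for the self-attention-based AIFV term). Features: (B, C, H, W)."""
    def class_means(feats):
        b, c, h, w = feats.shape
        flat = feats.permute(0, 2, 3, 1).reshape(-1, c)             # (BHW, C)
        lab = labels.reshape(-1)
        means = []
        for k in range(num_classes):
            mask = lab == k
            means.append(flat[mask].mean(dim=0) if mask.any() else flat.new_zeros(c))
        return torch.stack(means)                                    # (K, C)

    ms = F.normalize(class_means(student_feats), dim=1)
    mt = F.normalize(class_means(teacher_feats), dim=1)
    # Match the inter-class similarity structure of the two networks.
    return F.mse_loss(ms @ ms.t(), mt @ mt.t())
```

In a training loop, the two terms would typically be weighted and added to the usual segmentation loss; the feature maps passed to the second term are assumed to already be at the label resolution (or upsampled to it).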
