
ENLIGHTENING THE STUDENT IN KNOWLEDGE DISTILLATION

Yujie Zheng (Ningbo University); Chong Wang (Ningbo University); Yi Chen (Ningbo University); Jiangbo Qian (Ningbo University); Jun Wang (China University of Mining and Technology); JIAFEI WU (SenseTime Research)

07 Jun 2023

Knowledge distillation is a common method of model compression, which uses large models (teacher networks) to guide the training of small models (student networks). However, the student may have a hard time absorbing the knowledge from a sophisticated teacher due to the capacity and confidence gaps between them. To address this issue, a new knowledge distillation and refinement (KDrefine) framework is proposed to enlighten the student by expanding and refining its network structure. In addition, a confidence refinement strategy is used to generate adaptive softened logits for efficient distillation. Experiments show that the proposed framework outperforms state-of-the-art methods on both CIFAR-100 and Tiny-ImageNet datasets. The code is available at https://github.com/YujieZheng99/KDrefine.
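
For context, the sketch below shows the standard temperature-scaled distillation loss (Hinton-style soft targets plus hard-label cross-entropy) that this line of work builds on; it is a minimal illustration only and does not reproduce KDrefine's structure expansion or confidence refinement. The temperature T and weight alpha are illustrative values, not the paper's settings.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Combine a soft-target KL term with hard-label cross-entropy.

    T and alpha are placeholder hyperparameters for illustration.
    """
    # Soften both output distributions with temperature T before comparing.
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    # KL divergence between softened distributions, rescaled by T^2 as usual.
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce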
