
BCKD: BLOCK-CORRELATION KNOWLEDGE DISTILLATION

Qi Wang, Lu Liu, Wenxin Yu, Shiyu Chen, Jun Gong, Peng Chen

Lecture 11 Oct 2023

In this paper, we propose Block-Correlation Knowledge Distillation (BCKD), a novel and efficient knowledge distillation method. Unlike classical approaches, BCKD uses a simple multilayer perceptron (MLP) together with the classifier of the pre-trained teacher to learn the correlations between adjacent blocks of the model. In recent years, the performance of several distillation methods has been limited by the feature-map size or by the scarcity of samples in small-scale datasets. BCKD resolves these issues and achieves superior performance without introducing additional overhead. We validate our method on the CIFAR-100 and CIFAR-10 datasets, and the experimental results demonstrate its effectiveness and superiority.
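Below is a minimal PyTorch sketch of the idea as the abstract describes it, under the assumption that "block correlation" means fusing features from adjacent student blocks with a small MLP and scoring the fused vector with the frozen teacher classifier, matched to the teacher's predictions via a temperature-scaled KL divergence. The names (BlockCorrelationHead, bckd_loss), the pooling and fusion choices, and the loss weighting are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch only: block split, MLP shape, and loss are assumptions
# based on the abstract, not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BlockCorrelationHead(nn.Module):
    """Pools two adjacent block feature maps and fuses them with a small MLP."""

    def __init__(self, dim_a, dim_b, teacher_dim):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(dim_a + dim_b, teacher_dim),
            nn.ReLU(inplace=True),
            nn.Linear(teacher_dim, teacher_dim),
        )

    def forward(self, feat_a, feat_b):
        # Global-average-pool each feature map to a vector, then fuse.
        a = F.adaptive_avg_pool2d(feat_a, 1).flatten(1)
        b = F.adaptive_avg_pool2d(feat_b, 1).flatten(1)
        return self.mlp(torch.cat([a, b], dim=1))


def bckd_loss(student_feats, teacher_logits, heads, teacher_classifier, T=4.0):
    """KL loss between the teacher's predictions and logits obtained by
    passing fused adjacent-block features through the frozen teacher classifier.

    student_feats: list of feature maps from consecutive student blocks.
    heads: nn.ModuleList of BlockCorrelationHead, one per adjacent pair.
    teacher_classifier: the pre-trained teacher's final linear layer (frozen).
    """
    soft_t = F.softmax(teacher_logits / T, dim=1)
    loss = 0.0
    for head, (fa, fb) in zip(heads, zip(student_feats[:-1], student_feats[1:])):
        fused = head(fa, fb)                # (B, teacher_dim)
        logits = teacher_classifier(fused)  # teacher classifier stays frozen
        loss = loss + F.kl_div(
            F.log_softmax(logits / T, dim=1), soft_t, reduction="batchmean"
        ) * (T * T)
    return loss / len(heads)
```

In a training loop, this loss would be added to the usual cross-entropy (and any vanilla logit-distillation term); only the MLP heads introduce extra parameters, and they can be discarded at inference time.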