Clean Sample Guided Self-Knowledge Distillation For Image Classification

Jiyue Wang (South China University of Technology); Yanxiong Li (South China University of Technology); Qianhua He (South China University of Technology); Wei Xie (South China University of Technology)

07 Jun 2023

For two-stage knowledge distillation, combining with Data Augmentation (DA) is straightforward and effective. For online Self-knowledge Distillation (SD), however, DA is not always beneficial because no trustworthy teacher model is available. To address this issue, this paper proposes an SD method named Clean sample guided Self-knowledge Distillation (CleanSD), in which the original clean sample is used as a guide when the model is trained on augmented samples. CleanSD is implemented with two DA techniques: Mixup (label-mixing) and Cutout (label-preserving). Results on CIFAR-100 demonstrate that the error rates obtained by the proposed CleanSD are reduced by 2.59%, 1.39%, and 0.47-1.20% compared to those obtained by the baseline, the vanilla DA techniques, and other peer SD methods, respectively. In addition, the effectiveness and robustness of CleanSD are verified across multiple DA methods and datasets.
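The abstract leaves the training objective to the paper itself, but the core idea (the model's own prediction on the clean sample serves as a soft target while the model trains on the augmented sample) can be sketched. Below is a minimal PyTorch-style sketch, assuming a distillation temperature, a loss-mixing weight `alpha`, and soft (possibly Mixup-mixed) label vectors; these names and values are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def cleansd_loss(model, x_clean, x_aug, y_aug, temperature=4.0, alpha=0.5):
    """Hedged sketch of one clean-sample-guided self-distillation step.

    x_clean: original image batch; x_aug: its augmented version (e.g. from
    Mixup or Cutout); y_aug: soft label vectors for x_aug (mixed for Mixup).
    `temperature` and `alpha` are hypothetical hyperparameters.
    """
    # Guide signal: the model's own prediction on the clean sample,
    # detached from the graph so no gradient flows through the guide.
    with torch.no_grad():
        p_clean = F.softmax(model(x_clean) / temperature, dim=1)

    logits_aug = model(x_aug)

    # Supervised loss on the augmented sample; soft labels cover Mixup.
    ce = torch.sum(-y_aug * F.log_softmax(logits_aug, dim=1), dim=1).mean()

    # Distillation loss: pull the augmented-sample prediction toward the
    # clean-sample guide (standard temperature-scaled KL divergence).
    kd = F.kl_div(
        F.log_softmax(logits_aug / temperature, dim=1),
        p_clean,
        reduction="batchmean",
    ) * temperature ** 2

    return (1.0 - alpha) * ce + alpha * kd
```

In this reading, detaching the clean-sample output is the key design choice: the clean prediction acts as a fixed guide, so gradients flow only through the augmented-sample branch.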
