
Self-Supervised Learning with Explorative Knowledge Distillation

Tongtong Su (Nankai University); Jinsong Zhang (Nankai University); Wang Gang (Nankai University); Liu Xiaoguang (Nankai University)

06 Jun 2023

Previous paradigms have combined self-supervised learning (SSL) with knowledge distillation to compress a self-supervised teacher model into a smaller student. In this work, we devise a self-supervised explorative distillation (SSED) algorithm to improve the representation quality of lightweight models. We introduce a heterogeneous teacher so that the student learns the richest possible feature representation, achieving the goal of capturing the discriminative feature information contained in the network itself. SSED encourages the student to learn more diversified and complete representations for both the original class recognition task and the self-supervised learning task. Extensive experiments show that SSED effectively improves accuracy on both large and small models and surpasses current top-performing SSL methods. In particular, the linear-evaluation result of our ResNet-18, trained with a ResNet-50 teacher, achieves 65.5% ImageNet top-1 accuracy, which is 1.4% and 4.9% higher than OSS and DisCo, respectively. Code is available at https://github.com/nanxiaotong/SSED.
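To illustrate the general setup the abstract describes, the sketch below shows a generic self-supervised distillation step in PyTorch: a ResNet-18 student is trained with a contrastive SSL objective while a frozen ResNet-50 teacher supplies features for a distillation term. This is not the official SSED implementation; the projection head, InfoNCE loss, cosine-distillation term, and loss weight `alpha` are illustrative assumptions, since the abstract does not specify the exact objectives.

```python
# Minimal sketch of self-supervised feature distillation (assumptions noted above;
# not the authors' released SSED code).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class StudentWrapper(nn.Module):
    """ResNet-18 student with a projection head mapping 512-d features to the teacher's 2048-d space."""

    def __init__(self, feat_dim=512, teacher_dim=2048):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()            # keep pooled 512-d features
        self.encoder = backbone
        self.proj = nn.Linear(feat_dim, teacher_dim)

    def forward(self, x):
        f = self.encoder(x)
        return f, self.proj(f)


def info_nce(z1, z2, temperature=0.2):
    """SimCLR-style contrastive loss between two augmented views (assumed SSL task)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature         # (B, B) similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


def distill_step(student, teacher, view1, view2, alpha=1.0):
    """One training step: SSL loss on the student plus feature distillation from the frozen teacher."""
    with torch.no_grad():                      # frozen self-supervised teacher
        t_feat = teacher(view1)
    s_feat1, s_proj1 = student(view1)
    s_feat2, _ = student(view2)
    ssl_loss = info_nce(s_feat1, s_feat2)
    kd_loss = 1 - F.cosine_similarity(s_proj1, t_feat, dim=1).mean()
    return ssl_loss + alpha * kd_loss


if __name__ == "__main__":
    teacher = models.resnet50(weights=None)
    teacher.fc = nn.Identity()                 # 2048-d teacher features
    teacher.eval()
    student = StudentWrapper()
    v1, v2 = torch.randn(4, 3, 224, 224), torch.randn(4, 3, 224, 224)
    loss = distill_step(student, teacher, v1, v2)
    loss.backward()
    print(f"combined loss: {loss.item():.4f}")
```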
