Adaptable Ensemble Distillation

Yankai Wang, Dawei Yang, Wei Zhang, Zhe Jiang, Wenqiang Zhang

Length: 00:06:37
09 Jun 2021

Online knowledge distillation (OKD), which simultaneously trains several peer networks to construct a powerful teacher on the fly, has drawn much attention in recent years. OKD is designed to simplify the training procedure of conventional offline distillation. However, the ensemble strategy of existing OKD methods is inflexible and relies heavily on random initializations. In this paper, we propose Adaptable Ensemble Distillation (AED), which inherits the merits of existing OKD methods while overcoming their major drawbacks. The novelty of our AED lies in three aspects: (1) an individual-regulated mechanism is proposed to flexibly regulate each individual model, yielding an online ensemble with strong adaptability; (2) a diversity-aroused loss is designed to explicitly diversify the individual models, which enhances the robustness of the ensemble; (3) an empirical distillation technique is adopted to directly promote knowledge transfer within the OKD framework. Extensive experiments show that our proposed AED consistently outperforms existing state-of-the-art OKD methods on various datasets.
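For readers unfamiliar with the OKD setup the abstract builds on, the sketch below shows a generic online-distillation training step in PyTorch: peer networks are trained jointly, and each peer is distilled from an ensemble prediction averaged on the fly. The `okd_step` helper, the temperature, and the loss weights are illustrative assumptions; this is not the authors' AED, which additionally regulates individual models and adds a diversity-aroused loss.

```python
# Minimal sketch of a generic online knowledge distillation (OKD) step.
# NOT the authors' AED implementation; names and weights are assumptions.
import torch
import torch.nn.functional as F

def okd_step(peers, optimizer, x, y, temperature=3.0, kd_weight=1.0):
    """One joint training step for a list of peer networks."""
    logits = [net(x) for net in peers]                   # per-peer predictions
    ensemble = torch.stack(logits).mean(dim=0).detach()  # on-the-fly teacher

    loss = 0.0
    for z in logits:
        ce = F.cross_entropy(z, y)                       # supervised loss
        kd = F.kl_div(                                   # distill from the ensemble
            F.log_softmax(z / temperature, dim=1),
            F.softmax(ensemble / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2
        loss = loss + ce + kd_weight * kd

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this generic form, every peer shares the same supervised and distillation objectives; AED's contribution, per the abstract, is to regulate the individual peers, explicitly encourage diversity among them, and strengthen the transfer from the ensemble back to each peer.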

Chairs:
C.-C. Jay Kuo
