Sample-aware Knowledge Distillation for Long-tailed Learning
Shanshan Zheng (Xiamen University); Yachao Zhang (Tsinghua University); Yanyun Qu (Xiamen University); Hongyi Huang (Xiamen University)
Image classification in long-tailed scenarios has attracted increasing attention because the class distribution of long-tailed data more closely resembles that of real-world image data. To address the imbalance at the sample level, we propose a simple but effective method, named Sample-aware Knowledge Distillation (SAKD), which consists of a Selective Knowledge Distillation (SKD) module and a Stable Feature Center Learning (SFCL) module. The former performs knowledge distillation at the sample level by selecting samples: whether a sample needs to be distilled, and to what extent, is determined by evaluating the teacher network's prediction for that sample. The latter obtains a stable feature center that is free from perturbation by hard samples, which further improves the classification boundary. We conduct extensive experiments on several long-tailed benchmark datasets, and the results demonstrate that SAKD is effective. In addition, our SFCL module can be combined with other methods and also improves their performance.
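To illustrate the sample-level selection idea described above, the following is a minimal sketch (not the paper's actual formulation) of a per-sample weighted distillation loss in PyTorch. It assumes, purely for illustration, that each sample's distillation weight is the teacher's predicted probability on the ground-truth class and that samples the teacher misclassifies receive zero weight; the function name `selective_kd_loss` and the weighting rule are hypothetical.

```python
import torch
import torch.nn.functional as F


def selective_kd_loss(student_logits, teacher_logits, targets, temperature=4.0):
    """Hypothetical sketch of sample-level selective distillation.

    Each sample is weighted by the teacher's confidence on its true class,
    and samples the teacher gets wrong are excluded, so only samples the
    teacher handles well transfer knowledge to the student.
    """
    teacher_probs = F.softmax(teacher_logits, dim=1)
    # Teacher confidence on the ground-truth class, used as the per-sample weight.
    true_class_conf = teacher_probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    # Zero weight for samples the teacher misclassifies.
    correct = (teacher_logits.argmax(dim=1) == targets).float()
    weights = true_class_conf * correct

    # Standard temperature-scaled KL distillation term, kept per sample.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    kl_per_sample = F.kl_div(log_p_student, p_teacher, reduction="none").sum(dim=1)

    return (weights * kl_per_sample).mean() * (temperature ** 2)
```

In a training loop, this term would simply be added to the usual classification loss on the student's logits; the exact selection criterion and weighting scheme used by SAKD are defined in the paper itself.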