EXPLORING EFFECTIVE KNOWLEDGE DISTILLATION FOR TINY OBJECT DETECTION

Haotian Liu, Qing Liu, Yang Liu, Yixiong Liang, Guoying Zhao

Poster · 11 Oct 2023

Detecting tiny objects is a long-standing and critical problem in object detection, with broad real-world applications such as autonomous driving, surveillance, and medical diagnosis. Recent methods for tiny object detection often incur extra computational cost at inference because they introduce higher-resolution feature maps or additional network modules. This sacrifices inference speed for better detection accuracy and may severely limit their applicability to real-world scenarios. This paper therefore turns to knowledge distillation to improve the representation learning of a small model, targeting both superior detection accuracy and fast inference. Masked scale-aware feature distillation and local attention distillation are proposed to address the critical issues in distilling knowledge for tiny objects. Experimental results on two tiny object detection benchmarks indicate that our method brings noticeable performance gains to different detectors while keeping their original inference speeds. Our method also shows competitive performance compared to state-of-the-art methods for tiny object detection. Our code is available at https://github.com/haotianll/TinyKD.
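The abstract does not spell out the distillation losses; for intuition only, below is a minimal PyTorch sketch of a mask-restricted feature-mimicking loss, the generic idea behind masked feature distillation (the student imitates the teacher's features only inside masked regions, e.g. around ground-truth boxes, so tiny objects are not drowned out by background). The class name, mask construction, and normalization here are illustrative assumptions, not the paper's actual formulation; see the linked repository for the real method.

```python
import torch
import torch.nn as nn


class MaskedFeatureDistillationLoss(nn.Module):
    """Hypothetical sketch: masked MSE between teacher and student feature maps.

    The binary mask restricts the loss to selected regions (e.g. areas
    covering ground-truth boxes), a common way to keep small foreground
    objects from being dominated by the much larger background.
    """

    def __init__(self, weight: float = 1.0):
        super().__init__()
        self.weight = weight

    def forward(self, student_feat, teacher_feat, mask):
        # student_feat, teacher_feat: (N, C, H, W); mask: (N, 1, H, W) in {0, 1}
        diff = (student_feat - teacher_feat.detach()) ** 2  # no gradient to teacher
        masked = diff * mask  # zero out background positions
        # Normalize by the number of active (masked-in) feature elements.
        denom = mask.sum() * student_feat.size(1) + 1e-6
        return self.weight * masked.sum() / denom


if __name__ == "__main__":
    # Toy example: distill one feature level with a box-derived mask.
    loss_fn = MaskedFeatureDistillationLoss(weight=1.0)
    s = torch.randn(2, 256, 64, 64, requires_grad=True)  # student features
    t = torch.randn(2, 256, 64, 64)                      # teacher features
    m = torch.zeros(2, 1, 64, 64)
    m[:, :, 10:20, 10:20] = 1.0  # pretend a tiny object lies here
    loss = loss_fn(s, t, m)
    loss.backward()
    print(float(loss))
```

In practice such a term would be added to the detector's training loss, with the mask derived per feature-pyramid level; the scale-aware weighting and local attention components described in the paper are not captured by this sketch.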
