  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:09:07
08 Jun 2021

Object detection models have achieved increasingly strong performance through more complex architecture designs, but their heavy computation limits wider deployment on devices with insufficient computing power. To this end, we propose a novel Object-Oriented Relational Distillation (OORD) method that drives small detection models toward the performance of large detection models while keeping their efficiency unchanged. We introduce the distillation of relative relation knowledge from teacher (large) models to student (small) models, which guides the small models to learn better soft feature representations. OORD consists of two parts: Object Extraction (OE) and Relation Distillation (RD). OE extracts foreground features to avoid interference from background features, and RD distills the relative relations between the foreground features through graph convolution. Experiments conducted on various kinds of detection models show the effectiveness of OORD, which improves the performance of the small model by nearly 10% without additional inference-time cost.
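The two stages described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the pooling in `extract_objects`, the softmax affinity graph, the single graph-convolution step, and the MSE matching loss are all assumptions standing in for the details in the full paper.

```python
import numpy as np

def extract_objects(feature_map, boxes):
    # Object Extraction (OE): pool foreground features from object boxes,
    # discarding background. Hypothetical pooling: crop each box from the
    # (H, W, C) feature map and mean-pool it to a C-dim vector.
    objs = []
    for (x1, y1, x2, y2) in boxes:
        objs.append(feature_map[y1:y2, x1:x2].mean(axis=(0, 1)))
    return np.stack(objs)  # shape (N, C), one row per foreground object

def relation_graph(obj_feats):
    # Build a relation (affinity) matrix between object features via a
    # row-wise softmax over pairwise dot products (an assumed choice).
    sim = obj_feats @ obj_feats.T
    exp = np.exp(sim - sim.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)  # row-stochastic adjacency

def graph_conv(obj_feats, adj, weight):
    # One graph-convolution step: aggregate neighbour features through the
    # relation matrix, project with a learnable weight, apply ReLU.
    return np.maximum(adj @ obj_feats @ weight, 0.0)

def relation_distill_loss(teacher_feats, student_feats, weight):
    # Relation Distillation (RD): pass teacher and student object features
    # through the graph convolution and match the resulting relational
    # embeddings (MSE surrogate; the paper's exact loss may differ).
    t_rel = graph_conv(teacher_feats, relation_graph(teacher_feats), weight)
    s_rel = graph_conv(student_feats, relation_graph(student_feats), weight)
    return float(np.mean((t_rel - s_rel) ** 2))
```

In training, this distillation term would be added to the student's usual detection loss; only the student runs at inference, which is why the method adds no inference-time cost.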

Chairs:
Karl Ni
