Length: 07:23
28 Oct 2020

Attention mechanisms have been widely used in deep convolutional neural networks across fields such as object detection and instance segmentation. However, many attention mechanisms are computationally expensive, so in this paper we incorporate a lightweight attention mechanism, the attended residual module, into our object detection backbone to achieve a better accuracy-efficiency trade-off. In addition, to address the sample imbalance problem at the region level, we use a cascade region proposal network (RPN) module to generate higher-quality anchors, yielding higher average recall (AR). Furthermore, we replace the non-local attention module at the feature fusion level with the criss-cross attention module to reduce computation and improve performance. Together, these components significantly improve detection performance, and our method achieves 43.6 AP on COCO test-dev.
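The computational saving of criss-cross over non-local attention comes from restricting each position's attention to its own row and column, reducing the affinity cost from O((HW)^2) to O(HW(H+W)). Below is a minimal PyTorch sketch of a criss-cross attention layer in that spirit; it is an illustrative reconstruction, not the paper's released code: the reduction factor and 1x1 projection layout are assumptions, and it omits the original CCNet detail of masking the duplicated self position in the joint softmax.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CrissCrossAttention(nn.Module):
    """Each position attends only to pixels in its own row and column,
    so the affinity cost is O(H*W*(H+W)) instead of the O((H*W)^2)
    cost of full non-local attention. Illustrative sketch only."""

    def __init__(self, channels, reduction=8):  # reduction is an assumed hyperparameter
        super().__init__()
        inter = max(channels // reduction, 1)
        self.query = nn.Conv2d(channels, inter, kernel_size=1)
        self.key = nn.Conv2d(channels, inter, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q, k, v = self.query(x), self.key(x), self.value(x)

        # Row affinities: every pixel vs. all pixels in its row -> (b, h, w, w)
        q_row = q.permute(0, 2, 3, 1).reshape(b * h, w, -1)
        k_row = k.permute(0, 2, 1, 3).reshape(b * h, -1, w)
        e_row = torch.bmm(q_row, k_row).view(b, h, w, w)

        # Column affinities: every pixel vs. its column -> (b, h, w, h)
        q_col = q.permute(0, 3, 2, 1).reshape(b * w, h, -1)
        k_col = k.permute(0, 3, 1, 2).reshape(b * w, -1, h)
        e_col = torch.bmm(q_col, k_col).view(b, w, h, h).permute(0, 2, 1, 3)

        # Joint softmax over the criss-cross neighborhood (row + column).
        att = F.softmax(torch.cat([e_row, e_col], dim=3), dim=3)
        att_row, att_col = att[..., :w], att[..., w:]

        # Aggregate values along rows and columns with the attention weights.
        v_row = v.permute(0, 2, 3, 1).reshape(b * h, w, c)
        out_row = torch.bmm(att_row.reshape(b * h, w, w), v_row)
        out_row = out_row.view(b, h, w, c).permute(0, 3, 1, 2)

        v_col = v.permute(0, 3, 2, 1).reshape(b * w, h, c)
        out_col = torch.bmm(att_col.permute(0, 2, 1, 3).reshape(b * w, h, h), v_col)
        out_col = out_col.view(b, w, h, c).permute(0, 3, 2, 1)

        return self.gamma * (out_row + out_col) + x

Because gamma starts at zero, the layer is initially an identity mapping, which is the usual way such attention modules are retrofitted in place of a non-local block without disturbing a pretrained backbone.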
