RANSP: Ranking Attention Network for Saliency Prediction on Omnidirectional Images

Dandan Zhu, Yongqing Chen, Tian Han, Defang Zhao, Yucheng Zhu, Qiangqiang Zhou, Guangtao Zhai, Xiaokang Yang

Length: 05:19
08 Jul 2020

Various convolutional neural network (CNN)-based methods have shown the ability to boost the performance of saliency prediction on omnidirectional images (ODIs). However, these methods achieve sub-optimal accuracy because not all of the features extracted by the CNN model are useful for the final fine-grained saliency prediction; redundant features have a negative impact on the result. To tackle this problem, we propose a novel Ranking Attention Network for Saliency Prediction (RANSP) of head fixations on ODIs. Specifically, the part-guided attention (PA) module and the channel-wise feature (CF) extraction module are integrated in a unified framework and trained in an end-to-end manner for fine-grained saliency prediction. To better utilize the channel-wise feature maps, we further propose a new Ranking Attention Module (RAM), which automatically ranks and selects these maps based on their scores for fine-grained saliency prediction. Extensive experiments demonstrate the effectiveness of our method for saliency prediction on ODIs.
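The abstract's description of the Ranking Attention Module (scoring channel-wise feature maps, ranking them, and selecting a subset) could be sketched roughly as follows. This is a minimal illustration assuming a PyTorch-style implementation; the scoring head, the top-k selection rule, and all layer choices here are assumptions for illustration, not the paper's actual RAM design.

```python
# Minimal sketch of a ranking-attention-style channel selection module.
# All design details (pooling + FC scoring head, hard top-k mask) are
# assumptions; the paper's actual RAM may differ.
import torch
import torch.nn as nn


class RankingAttentionSketch(nn.Module):
    """Scores channel-wise feature maps, ranks them, and keeps the top-k."""

    def __init__(self, channels: int, top_k: int):
        super().__init__()
        self.top_k = top_k
        # Hypothetical scoring head: global average pooling followed by a
        # fully connected layer producing one score per channel.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.score = nn.Sequential(
            nn.Linear(channels, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        scores = self.score(self.pool(x).view(b, c))       # (B, C) per-channel scores
        # Rank channels by score and build a binary mask over the top-k.
        topk_idx = scores.topk(self.top_k, dim=1).indices  # (B, k)
        mask = torch.zeros_like(scores).scatter_(1, topk_idx, 1.0)
        # Re-weight features: selected channels keep a score-weighted
        # response, the remaining channels are suppressed.
        return x * (scores * mask).view(b, c, 1, 1)


if __name__ == "__main__":
    ram = RankingAttentionSketch(channels=64, top_k=16)
    feats = torch.randn(2, 64, 32, 32)
    print(ram(feats).shape)  # torch.Size([2, 64, 32, 32])
```

In this sketch, ranking is realized by a hard top-k mask over the learned channel scores; a real implementation might instead use a soft or differentiable selection so that gradients reach all channels during training.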
