Webly Supervised Deep Attentive Quantization

Jinpeng Wang, Bin Chen, Tao Dai, Shutao Xia

10 Jun 2021

Learning to hash has been widely applied in large-scale image retrieval. Although current deep hashing methods yield state-of-the-art performance, their heavy dependence on ground-truth labels makes them difficult to deploy in practical applications such as social media. To solve this problem, we propose a novel method termed Webly Supervised Deep Attentive Quantization (WSDAQ), in which deep quantization is trained on web images associated with user-provided weak tags, without consulting any ground-truth labels. Specifically, we design a tag processing module that leverages the semantic information of tags to better supervise quantization learning. In addition, we propose an end-to-end trainable Attentive Product Quantization Module (APQM) to quantize deep image features. Furthermore, we train the model with a noise-contrastive estimation loss from the perspective of contrastive learning. Experiments validate that WSDAQ outperforms state-of-the-art baselines for compact coding trained on weakly tagged web images.
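
The abstract names two trainable ingredients: an attentive product quantization head and a noise-contrastive estimation (NCE) objective. Below is a minimal PyTorch sketch of how such a pairing can be wired up; it assumes soft codeword assignment and an InfoNCE-style loss, and all module, parameter, and function names (e.g. `AttentiveProductQuantizer`, `info_nce`) are illustrative, not taken from the paper.

```python
# Hypothetical sketch: attentive product quantization + NCE-style contrastive loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveProductQuantizer(nn.Module):
    """Split a feature into M sub-vectors, soft-assign each to a small codebook,
    and re-weight the sub-quantizers with a learned attention over codebooks."""

    def __init__(self, dim=512, num_books=4, num_words=256, temperature=10.0):
        super().__init__()
        assert dim % num_books == 0
        self.M, self.D = num_books, dim // num_books
        self.codebooks = nn.Parameter(torch.randn(num_books, num_words, self.D))
        self.attn = nn.Sequential(nn.Linear(dim, num_books), nn.Softmax(dim=-1))
        self.tau = temperature

    def forward(self, x):                                   # x: (B, dim)
        B = x.size(0)
        sub = x.view(B, self.M, self.D)                     # (B, M, D)
        # Soft codeword assignment via scaled cosine similarity (differentiable,
        # so the codebooks train end to end).
        sim = torch.einsum('bmd,mkd->bmk',
                           F.normalize(sub, dim=-1),
                           F.normalize(self.codebooks, dim=-1))
        probs = F.softmax(self.tau * sim, dim=-1)           # (B, M, K)
        quantized = torch.einsum('bmk,mkd->bmd', probs, self.codebooks)
        # Attention weights over the M sub-quantizers, broadcast to sub-vectors.
        weights = self.attn(x).unsqueeze(-1)                # (B, M, 1)
        return (weights * quantized).reshape(B, -1)         # (B, dim)


def info_nce(query, key, tau=0.07):
    """NCE-style loss: matching (query_i, key_i) pairs are positives;
    the other keys in the batch serve as negatives."""
    q = F.normalize(query, dim=-1)
    k = F.normalize(key, dim=-1)
    logits = q @ k.t() / tau                                # (B, B) similarity matrix
    targets = torch.arange(q.size(0), device=q.device)
    return F.cross_entropy(logits, targets)
```

In a webly supervised setup one would, for example, feed quantized image features as queries and tag-derived embeddings as keys into `info_nce`, so that weak tags supervise the quantization without any ground-truth labels; the exact pairing in WSDAQ may differ.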

Chairs:
Soohyun Bae
