
SEMANTIC CENTRALIZED CONTRASTIVE LEARNING FOR UNSUPERVISED HASHING

Fengming Liang (Beijing University of Posts and Telecommunications); Changlin Fan (Beijing University of Posts and Telecommunications); Bo Xiao (Beijing University of Posts and Telecommunications); Kongming Liang (Beijing University of Posts and Telecommunications)

07 Jun 2023

Contrastive learning has shown its potential in many unsupervised tasks, including hashing. However, the representations obtained by contrastive learning generally fail to produce notable margins between semantic classes, so samples from different semantic classes near the boundary are likely to collide into the same hash code. In this paper, we propose a novel Semantic Centralized Contrastive Hashing (SCCH) method that pulls the learned features closer to their semantic centers and makes them more applicable to hashing. Specifically, we propose a semantic centralization strategy that pulls strongly augmented samples towards weakly augmented ones, since weakly augmented samples lie closer to the semantic centers than strongly augmented ones. Moreover, applying quantization directly after contrastive learning would damage the learned similarity relationships; we therefore provide a solution that eliminates the mismatch of similarity metrics between contrastive learning and hash mapping. Extensive experiments on three benchmark datasets demonstrate that SCCH outperforms existing state-of-the-art methods.
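The semantic centralization idea described in the abstract can be illustrated with a minimal sketch, assuming a standard InfoNCE-style contrastive formulation in which the weakly augmented branch serves as a stop-gradient target for the strongly augmented branch. The function name, temperature value, and embedding shapes below are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (not the authors' code) of a semantic-centralization contrastive loss:
# strongly augmented embeddings are pulled toward their weakly augmented counterparts,
# which are treated as closer to the semantic centers and detached from the gradient.
import torch
import torch.nn.functional as F


def semantic_centralized_loss(z_strong: torch.Tensor,
                              z_weak: torch.Tensor,
                              temperature: float = 0.2) -> torch.Tensor:
    """InfoNCE-style loss: each strong view is attracted to the (stop-gradient)
    weak view of the same image and repelled from weak views of other images."""
    z_strong = F.normalize(z_strong, dim=1)
    z_weak = F.normalize(z_weak, dim=1).detach()  # weak branch acts as a stable target

    logits = z_strong @ z_weak.t() / temperature              # (B, B) similarity matrix
    targets = torch.arange(z_strong.size(0), device=z_strong.device)
    return F.cross_entropy(logits, targets)


# Usage: embeddings from the two augmentation branches of the same batch.
if __name__ == "__main__":
    z_s = torch.randn(8, 64)   # strongly augmented batch embeddings
    z_w = torch.randn(8, 64)   # weakly augmented batch embeddings
    print(semantic_centralized_loss(z_s, z_w).item())
```

Detaching the weak branch is one plausible way to realize "pulling the strong towards the weak" rather than letting the two views meet halfway; the paper's exact loss and its treatment of the quantization-metric mismatch are not reproduced here.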
