UNSUPERVISED DEEP HASHING WITH DEEP SEMANTIC DISTILLATION

Chuang Zhao, Hefei Ling, Yuxuan Shi, Bo Gu, Shijie Lu, Ping Li, Qiang Cao

Poster 11 Oct 2023

Many existing unsupervised hashing methods aim to preserve as much semantic information as possible by reconstructing the input data, but this causes the hash codes to retain a great deal of redundant information. Moreover, previous works usually rely on local structures to guide hash learning, which can mislead the hashing model because local structures contain considerable noise. In this paper, we propose a novel Deep Semantic Distillation Hashing (DSDH) method to address these problems. Specifically, to ensure that the hashing model preserves discriminative information rather than background noise, we feed randomly masked images into the feature extractor and apply the empirical Maximum Mean Discrepancy (MMD) to match their output feature distribution with that of the original images. In addition, to avoid the misleading effect of noisy local structures, we constrain the similarity distributions of the feature space and the Hamming space to be consistent, thereby transferring knowledge from the feature space to the Hamming space. Experiments on three benchmarks demonstrate the superiority of DSDH.
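To make the two objectives concrete, the following is a minimal PyTorch sketch of the losses the abstract describes: a biased empirical MMD between masked-image and original-image features, and a KL-based consistency term between the similarity distributions of the feature space and a relaxed Hamming space. The Gaussian-kernel bandwidth, the softmax temperature, the tanh relaxation of the binary codes, and all function names are illustrative assumptions, not the authors' exact formulation.

    import torch
    import torch.nn.functional as F

    def gaussian_kernel(x, y, sigma=1.0):
        # Pairwise Gaussian (RBF) kernel between rows of x and rows of y.
        sq_dist = torch.cdist(x, y) ** 2
        return torch.exp(-sq_dist / (2 * sigma ** 2))

    def mmd_loss(feat_masked, feat_orig, sigma=1.0):
        # Biased empirical estimate of squared MMD between the feature
        # distributions of masked images and original images.
        k_xx = gaussian_kernel(feat_masked, feat_masked, sigma).mean()
        k_yy = gaussian_kernel(feat_orig, feat_orig, sigma).mean()
        k_xy = gaussian_kernel(feat_masked, feat_orig, sigma).mean()
        return k_xx + k_yy - 2 * k_xy

    def similarity_distribution(z, temperature=0.5):
        # Row-wise softmax over pairwise cosine similarities; self-pairs
        # are masked out so each row is a distribution over the other samples.
        z = F.normalize(z, dim=1)
        sim = z @ z.t()
        mask = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
        sim = sim.masked_fill(mask, float("-inf"))
        return torch.softmax(sim / temperature, dim=1)

    def consistency_loss(features, codes, temperature=0.5, eps=1e-8):
        # KL divergence from the Hamming-space similarity distribution
        # (tanh-relaxed codes) to the feature-space one, transferring
        # feature-space neighborhood knowledge into the codes.
        p = similarity_distribution(features, temperature)
        q = similarity_distribution(torch.tanh(codes), temperature)
        return (p * (torch.log(p + eps) - torch.log(q + eps))).sum(dim=1).mean()

In training, both terms would be computed per mini-batch from the encoder's features and the pre-binarization code logits, then summed with weighting coefficients; the weights and the masking ratio are hyperparameters the abstract does not specify.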
