SELF-DISTILLATION HASHING FOR EFFICIENT HAMMING SPACE RETRIEVAL
Hongjia Zhai (Zhejiang University); Hai Li (Zhejiang University); Hanzhi Zhang (Zhejiang University); Hujun Bao (Zhejiang University); Guofeng Zhang (Zhejiang University)
Deep hashing-based approaches have become popular solutions for large-scale image retrieval tasks due to their high computational efficiency and low storage cost.
Some methods leverage a large teacher network to improve the retrieval performance of a small student network through knowledge distillation, which incurs high computational and time costs. In this paper, we propose Self-Distillation Hashing (SeDH), which improves image retrieval performance without introducing a complex teacher model and significantly reduces the overall computational cost.
Specifically, we generate soft targets by ensembling the logits of similar images within the mini-batch. The ensembled soft targets model the relations between different image samples and serve as additional supervision for classification.
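As a minimal PyTorch sketch of one plausible reading of this step (assuming "similar images" means same-class peers in the batch; the function names and temperature value are illustrative assumptions, not the exact formulation):

```python
import torch
import torch.nn.functional as F

def ensemble_soft_targets(logits, labels, temperature=4.0):
    """Build a soft target for each sample by averaging the
    temperature-softened predictions of other same-class samples
    in the mini-batch.

    logits: (B, C) classifier outputs; labels: (B,) integer class ids.
    """
    probs = F.softmax(logits.detach() / temperature, dim=1)   # (B, C)
    # same[i, j] = 1 if samples i and j share a label (self excluded)
    same = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
    same.fill_diagonal_(0.0)
    counts = same.sum(dim=1, keepdim=True)
    # Average the peers' softened predictions to form the soft target
    soft_targets = same @ probs / counts.clamp(min=1.0)
    # Assumption: samples with no same-class peer fall back to their
    # own softened prediction
    soft_targets = torch.where(counts > 0, soft_targets, probs)
    return soft_targets

def distillation_loss(logits, soft_targets, temperature=4.0):
    """KL divergence between student predictions and ensembled targets."""
    log_p = F.log_softmax(logits / temperature, dim=1)
    return F.kl_div(log_p, soft_targets, reduction="batchmean") * temperature ** 2
```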
In addition, to learn more compact features and more accurate inter-sample similarities, we propose a similarity-preserving loss on the learned hashing features that aligns the softened similarity distribution with the pairwise soft similarity.
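As a minimal sketch of one plausible instantiation of this loss (assuming a temperature-scaled softmax over cosine similarities of the hash features, matched via KL divergence to a label-derived pairwise similarity distribution; `tau` and the row-wise normalization are illustrative assumptions):

```python
import torch
import torch.nn.functional as F

def similarity_preserving_loss(features, labels, tau=0.1):
    """Align the feature-based similarity distribution with a
    label-based soft similarity distribution.

    features: (B, K) continuous hash features (pre-binarization).
    labels: (B,) integer class ids.
    """
    z = F.normalize(features, dim=1)
    sim = z @ z.t() / tau                       # (B, B) softened cosine similarities
    eye = torch.eye(len(labels), dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(eye, float("-inf"))   # exclude self-similarity
    log_q = F.log_softmax(sim, dim=1)           # predicted similarity distribution

    # Target: uniform mass over same-class peers (pairwise soft similarity)
    same = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
    same = same.masked_fill(eye, 0.0)
    p = same / same.sum(dim=1, keepdim=True).clamp(min=1.0)

    return F.kl_div(log_q, p, reduction="batchmean")
```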
Extensive experiments demonstrate that our approach yields state-of-the-art performance on deep supervised hashing retrieval benchmarks.