Progressive-Granularity Retrieval via Hierarchical Feature Alignment for Person Re-Identification
Zhaopeng Dou, Zhongdao Wang, Yali Li, Shengjin Wang
SPS
Person re-identification (re-ID) aims to match pedestrian images across non-overlapping cameras. It is a challenging task because of the feature misalignment caused by occlusion. In this paper, inspired by the coarse-to-fine nature of human perception, we propose a novel Progressive-Granularity Retrieval (PGR) method to tackle this issue. Specifically, (i) we define instance-level, part-level, and pixel-level features for an image. PGR learns these features with a single feature extractor to capture hierarchical clues in the image. (ii) These features are inherently related but differ in perceptual granularity, and they provide complementary information. For each type of feature, we propose a corresponding similarity metric to achieve hierarchical feature alignment. (iii) In training, the model is learned end-to-end. In inference, a progressive retrieval strategy is introduced to efficiently aggregate the complementary information provided by these features. Extensive experiments on three benchmarks covering both occluded and holistic-body re-ID tasks show the effectiveness of the proposed method. In particular, our method outperforms the state of the art by 4.5% in Rank-1 accuracy on the challenging Occluded-Duke dataset.
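The progressive retrieval strategy described above can be sketched as a coarse-to-fine cascade: rank the full gallery with instance-level features, keep a shortlist, then re-rank it with part-level and finally pixel-level features. The sketch below is a minimal illustration of this cascade, not the paper's implementation; in particular, plain cosine similarity stands in for the paper's granularity-specific alignment metrics, and the feature shapes and shortlist sizes are assumptions.

```python
import numpy as np

def cosine_sim(q, g):
    # q: (d,) query feature; g: (n, d) gallery features -> (n,) similarities
    q = q / np.linalg.norm(q)
    g = g / np.linalg.norm(g, axis=1, keepdims=True)
    return g @ q

def progressive_retrieval(query_feats, gallery_feats, shortlist=(100, 20)):
    """Coarse-to-fine retrieval sketch.

    query_feats / gallery_feats: (instance, part, pixel) feature tuples,
    each gallery entry an (n, d) array. Cosine similarity is a stand-in
    for the per-granularity metrics used in the paper.
    """
    inst_q, part_q, pix_q = query_feats
    inst_g, part_g, pix_g = gallery_feats
    idx = np.arange(inst_g.shape[0])

    # Stage 1: instance-level ranking over the whole gallery (coarsest).
    s = cosine_sim(inst_q, inst_g)
    idx = idx[np.argsort(-s)[: min(shortlist[0], len(idx))]]

    # Stage 2: part-level re-ranking of the shortlist.
    s = cosine_sim(part_q, part_g[idx])
    idx = idx[np.argsort(-s)[: min(shortlist[1], len(idx))]]

    # Stage 3: pixel-level re-ranking of the final candidates (finest).
    s = cosine_sim(pix_q, pix_g[idx])
    return idx[np.argsort(-s)]
```

Because the later, finer-grained (and typically costlier) comparisons only run on the shortlist, the cascade aggregates the complementary granularities while keeping inference efficient.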