FUSION OF SALIENCY MAP AND DEEP FEATURE-BASED CORRELATION FILTER FOR ENHANCING TRACKING PERFORMANCES
Hyemin Lee, Daijin Kim
This paper proposes the fusion of a saliency map and a deep feature-based correlation filter to enhance tracking accuracy by incorporating spatial attention and foreground information into the tracking process. The saliency map enables the tracker to focus on the salient object region. The target foreground is roughly segmented from the background using a pixel-wise likelihood map derived from a color model and a shape model. Because only visual information in the foreground region is used, the model is prevented from learning the background, making the tracker robust to background changes. The proposed saliency map can be easily combined with any tracking method by applying it as a pixel-wise weight to the target response map. The saliency map is combined with a correlation filter-based tracker, and we demonstrate that it improves tracking performance. Experiments on public benchmark datasets validate the proposed method. It achieves strong results compared with existing state-of-the-art trackers and improves tracking performance when combined with two versions of correlation filter-based methods.
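To make the fusion step concrete, the sketch below illustrates one plausible way to build a saliency map from a color-based foreground likelihood and a shape prior, and to apply it as a pixel-wise weight on a correlation-filter response map. This is a minimal illustration, not the authors' implementation: the function names, the Gaussian shape prior, and the blending parameter alpha are all assumptions made for the example.

```python
# Minimal sketch (assumed, not the authors' code) of fusing a saliency map
# with a correlation-filter response map by pixel-wise weighting.
import numpy as np


def color_likelihood(patch, fg_hist, bg_hist, n_bins=16):
    """Pixel-wise foreground likelihood from foreground/background color histograms."""
    # Quantize RGB values into histogram bins.
    idx = (patch // (256 // n_bins)).astype(int)
    fg = fg_hist[idx[..., 0], idx[..., 1], idx[..., 2]]
    bg = bg_hist[idx[..., 0], idx[..., 1], idx[..., 2]]
    return fg / (fg + bg + 1e-8)


def shape_prior(shape, sigma=0.5):
    """Centered Gaussian prior standing in for the shape model (an assumption)."""
    h, w = shape
    ys = np.linspace(-1.0, 1.0, h)[:, None]
    xs = np.linspace(-1.0, 1.0, w)[None, :]
    return np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma ** 2))


def fuse_response(cf_response, saliency, alpha=0.7):
    """Weight the correlation-filter response by the saliency map.

    alpha blends the weighted response with the raw one so a noisy
    saliency estimate cannot completely suppress the filter output.
    """
    saliency = (saliency - saliency.min()) / (np.ptp(saliency) + 1e-8)
    return alpha * cf_response * saliency + (1.0 - alpha) * cf_response


# Example usage with random stand-ins for a search-window patch, the color
# histograms, and the correlation-filter response.
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(64, 64, 3))
fg_hist = rng.random((16, 16, 16))
bg_hist = rng.random((16, 16, 16))

saliency = color_likelihood(patch, fg_hist, bg_hist) * shape_prior((64, 64))
cf_response = rng.random((64, 64))
fused = fuse_response(cf_response, saliency)
target_pos = np.unravel_index(np.argmax(fused), fused.shape)
```

Because the saliency map only reweights the response map, this style of fusion can be attached to different correlation filter-based trackers without modifying their filter-learning stage, which is the sense in which the abstract describes it as easily combinable.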