VISUAL TRACKING VIA TEMPORALLY-REGULARIZED CONTEXT-AWARE CORRELATION FILTERS
Jiawen Liao, Chun Qi, Jianzhong Cao, He Bian
SPS
The classical discriminative correlation filter (DCF) model suffers from boundary effects. Several modified DCF models mitigate this drawback by enlarging the search region, and remarkable performance improvements have been reported in related papers. However, model deterioration under occlusion and other challenging scenarios is still not well addressed. In this work, we propose a novel Temporally-regularized Context-aware Correlation Filters (TCCF) model to represent the target appearance more robustly. We take advantage of the enlarged search region to obtain more negative samples so that the filter is sufficiently trained, and a temporal regularizer, which restricts the variation of the filter between frames, is seamlessly integrated into the original formulation. Our model is derived from a new discriminative learning loss; a closed-form solution for multidimensional features is provided and solved efficiently with the Alternating Direction Method of Multipliers (ADMM). Extensive experiments on the standard OTB-2015, TempleColor-128, and VOT-2016 benchmarks show that the proposed approach performs favorably against many state-of-the-art methods while running in real time at 28 fps on a single CPU.
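To illustrate the temporal-regularization idea, the sketch below trains a single-channel correlation filter in the Fourier domain with an extra penalty tying the filter to the previous frame's filter. This is a simplified stand-in, not the paper's multi-channel context-aware ADMM solver; the function name `train_temporal_cf` and the weights `lam` (ridge term) and `mu` (temporal term) are illustrative assumptions.

```python
import numpy as np

def train_temporal_cf(x, y, w_prev, lam=1e-2, mu=0.1):
    """Single-channel sketch: minimize the ridge-regression CF loss
    ||x * w - y||^2 + lam*||w||^2 + mu*||w - w_prev||^2,
    where * is circular correlation. In the Fourier domain the loss
    decouples per frequency and has a closed-form solution. (The paper's
    multi-channel, context-aware formulation is not reproduced here.)"""
    X = np.fft.fft(x)            # training sample spectrum
    Y = np.fft.fft(y)            # desired (e.g. Gaussian) response spectrum
    W_prev = np.fft.fft(w_prev)  # previous frame's filter spectrum
    # Per-frequency closed form:
    # W = (conj(X)*Y + mu*W_prev) / (|X|^2 + lam + mu)
    W = (np.conj(X) * Y + mu * W_prev) / (np.abs(X) ** 2 + lam + mu)
    return np.real(np.fft.ifft(W))
```

With `mu = 0` this reduces to the ordinary ridge-regression correlation filter; as `mu` grows, the new filter is pulled toward `w_prev`, which is the mechanism that restricts frame-to-frame filter variation.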