Abstract:
Most existing sparse-representation-based trackers describe the object of interest with only a single feature and tend to be unstable on challenging videos. To address this issue, we propose a particle filter tracker based on multiple-feature joint sparse representation. The main idea of our algorithm is to partition each particle region into multiple overlapping image fragments. Every local fragment of a candidate is sparsely represented as a linear combination of the atoms of a dynamically updated template dictionary, and is reconstructed only from the dictionary fragments located at the same position. The weights of the particles are determined by their reconstruction errors to realize particle filter tracking. Our method simultaneously enforces structural sparsity and accounts for the interactions among particles through mixed-norm regularization. We further extend the sparse representation module of our tracker to a multiple-kernel joint sparse representation module, which is solved efficiently by a kernelizable accelerated proximal gradient (KAPG) method. Both qualitative and quantitative evaluations demonstrate that the proposed algorithm is competitive with state-of-the-art trackers on challenging benchmark video sequences with occlusion, rotation, shifting, and illumination changes.
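To make the mixed-norm regularization mentioned above concrete: a common choice for enforcing joint (structural) sparsity across features or particles is the ℓ2,1 norm, whose proximal operator is row-wise group soft-thresholding; this is the building block a proximal gradient method such as KAPG iterates. The sketch below is illustrative only, assuming an ℓ2,1 regularizer on a coefficient matrix whose rows group coefficients across tasks; it is not the paper's actual solver, and the function name `prox_l21` is our own.

```python
import numpy as np

def prox_l21(X, lam):
    """Row-wise group soft-thresholding: the proximal operator of
    lam * sum_i ||X[i, :]||_2 (the l2,1 mixed norm). Rows whose l2 norm
    falls below lam are zeroed jointly, enforcing structural sparsity;
    the remaining rows are shrunk toward zero."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)      # per-row l2 norms
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return scale * X

# Example: the first row (norm 5) survives and is shrunk by a factor 0.8;
# the second row (norm 0.1 < lam) is eliminated as a whole group.
X = np.array([[3.0, 4.0],
              [0.1, 0.0]])
print(prox_l21(X, 1.0))
```

Inside an accelerated proximal gradient loop, this operator is applied after each gradient step on the (smooth) reconstruction-error term, so whole coefficient groups are selected or discarded together rather than entry by entry.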