ISSN 1000-1239 CN 11-1777/TP

• Graphics and Images •


### Visual Tracking Algorithm Based on Adaptive Spatial Regularization

Tan Jianhao, Zhang Siyuan

1. (College of Electrical and Information Engineering, Hunan University, Changsha 410082) (National Engineering Laboratory of Robot Visual Perception and Control Technology (Hunan University), Changsha 410082) (tanjianhao@hnu.edu.cn)
• Online: 2021-02-01
• Supported by:
This work was supported by the National Natural Science Foundation of China (61433016) and the Science and Technology Innovation Program of Hunan Province (2017XK2102).

Abstract: In correlation filter based visual tracking, generating the training sample set by cyclic shifts greatly reduces the computational cost, but it also introduces boundary effects: the resulting synthetic samples weaken the discriminative ability of the classifier. To address this problem, a visual tracking algorithm based on adaptive spatial regularization is proposed. An adaptive spatial regularization term is introduced into the classic filtering model; by establishing the correlation of the regularization weights between adjacent frames, the weights can be adjusted adaptively. This reduces the risk of overfitting to unrealistic samples and thereby mitigates the boundary effect. A scale estimation strategy with an adaptive aspect ratio is adopted, which accurately tracks scale changes of the target. In addition, an update strategy based on the similarity of color histograms avoids updating the model when tracking is unreliable, suppressing model drift and improving both tracking accuracy and speed. Experiments show that the success rate and precision of the proposed algorithm on UAV123, OTB2013, and OTB2015 are higher than those of all compared algorithms, and that the algorithm maintains a high success rate even in various complex scenes. In particular, under motion blur and in-plane rotation, its success rate scores are 9.72% and 9.03% higher, respectively, than those of the second best algorithm, demonstrating good adaptability.
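The color-histogram-gated update strategy described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names, the choice of the Bhattacharyya coefficient as the similarity measure, and the 0.8 threshold are all assumptions made for the sketch.

```python
import numpy as np

def color_histogram(patch, bins=16):
    """Per-channel color histogram of an H x W x C uint8 patch,
    L1-normalized into a probability vector."""
    hists = [np.histogram(patch[..., c], bins=bins, range=(0, 256))[0]
             for c in range(patch.shape[-1])]
    h = np.concatenate(hists).astype(np.float64)
    return h / h.sum()

def histogram_similarity(h1, h2):
    """Bhattacharyya coefficient (an assumed similarity measure):
    1.0 for identical distributions, 0.0 for disjoint ones."""
    return float(np.sum(np.sqrt(h1 * h2)))

def should_update(prev_patch, curr_patch, threshold=0.8):
    """Gate the filter update: skip it when the tracked appearance has
    changed too much, which usually signals an unreliable result.
    The threshold value is an assumption, not taken from the paper."""
    sim = histogram_similarity(color_histogram(prev_patch),
                               color_histogram(curr_patch))
    return sim >= threshold
```

A tracker would call `should_update` with the target patch from the last confidently tracked frame and the patch at the newly estimated position, updating the correlation filter only when it returns true; this is what suppresses model drift during occlusion or tracking failure.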