• Acta Optica Sinica
  • Vol. 38, Issue 10, 1015002 (2018)
Changzhen Xiong1,*, Manqiang Che1, Runling Wang2, and Yan Lu1
Author Affiliations
  • 1 Beijing Key Laboratory of Urban Intelligent Control, Beijing 100144, China
  • 2 College of Sciences, North China University of Technology, Beijing 100144, China
    DOI: 10.3788/AOS201838.1015002

    Abstract

    To improve the real-time performance and robustness of convolutional features for visual tracking, a real-time tracking method with dual-model adaptive switching is proposed, based on an analysis of how features from different convolutional layers represent the object. The method uses the feature energy ratio between the object region and the search region to evaluate features drawn from two convolutional layers, and the convolution channels whose energy ratios exceed a given threshold are selected to train two correlation filter classifiers. The object position is then predicted by adaptively switching between the two classifiers according to the peak-to-sidelobe ratio of the response map. Finally, a sparse model update strategy is applied to update the classifiers. The proposed algorithm is tested on a standard dataset. The experimental results show that its average distance precision is 89.3%, close to that of continuous convolution object tracking, while its average tracking speed of 25.8 frame/s is 25 times that of the continuous convolution object tracking algorithm. The overall performance of the proposed algorithm surpasses the other tracking methods compared in the experiment.
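    The two quantities the abstract relies on can be sketched in NumPy: a per-channel energy ratio between the object region and the search region for feature selection, and the peak-to-sidelobe ratio (PSR) of the response map for switching. This is a minimal illustration under stated assumptions, not the authors' implementation; the function names, the threshold default, and the sidelobe margin are hypothetical.

    ```python
    import numpy as np

    def select_channels(feat, obj_mask, thresh=0.5):
        """Sketch of energy-ratio channel selection (thresh is illustrative).

        feat:     (H, W, C) feature map extracted over the search region.
        obj_mask: (H, W) boolean mask marking the object region.
        Returns indices of channels whose object/search energy ratio > thresh.
        """
        energy = feat ** 2                            # per-location channel energy
        obj_energy = energy[obj_mask].sum(axis=0)     # (C,) energy inside the object
        total_energy = energy.reshape(-1, feat.shape[2]).sum(axis=0) + 1e-12
        ratio = obj_energy / total_energy             # object-to-search energy ratio
        return np.where(ratio > thresh)[0]

    def psr(response, sidelobe_margin=5):
        """Peak-to-sidelobe ratio of a correlation response map."""
        r0, c0 = np.unravel_index(np.argmax(response), response.shape)
        peak = response[r0, c0]
        mask = np.ones_like(response, dtype=bool)     # exclude a window around the peak
        mask[max(0, r0 - sidelobe_margin):r0 + sidelobe_margin + 1,
             max(0, c0 - sidelobe_margin):c0 + sidelobe_margin + 1] = False
        side = response[mask]
        return (peak - side.mean()) / (side.std() + 1e-12)

    # Adaptive switching (hypothetical classifiers): predict with the filter
    # whose response map has the higher PSR.
    # resp_a, resp_b = classifier_a(frame), classifier_b(frame)
    # resp = resp_a if psr(resp_a) >= psr(resp_b) else resp_b
    ```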
    Changzhen Xiong, Manqiang Che, Runling Wang, Yan Lu. Robust Real-Time Visual Tracking via Dual Model Adaptive Switching[J]. Acta Optica Sinica, 2018, 38(10): 1015002