• Acta Optica Sinica
  • Vol. 38, Issue 11, 1115003 (2018)
Daqian Liu1,*, Wanjun Liu2, and Bowen Fei3
Author Affiliations
  • 1 School of Electronic and Information Engineering, Liaoning Technical University, Huludao, Liaoning 125105, China
  • 2 School of Software, Liaoning Technical University, Huludao, Liaoning 125105, China
  • 3 School of Business and Management, Liaoning Technical University, Huludao, Liaoning 125105, China
    DOI: 10.3788/AOS201838.1115003
    Daqian Liu, Wanjun Liu, Bowen Fei. Target Tracking Method Based on Location-Classification-Matching Model[J]. Acta Optica Sinica, 2018, 38(11): 1115003

    Abstract

    Recently, convolutional neural network frameworks have been successfully applied to target tracking and have achieved robust tracking results. Building on this idea, a target tracking method based on a location-classification-matching model is proposed. First, in the location model, the candidate target regions of the current frame are predicted from the location information of the previous frame. Second, trained deep features are used for inter-class screening of the candidate regions, and N sub-optimal target regions are selected. Finally, conventional color features are used to perform intra-class optimization matching on the sub-optimal target regions to determine the final tracking target. Meanwhile, the location and classification networks are updated separately, and the established target model is updated online and in real time to ensure that it describes the target accurately. Experiments on the OTB50 and OTB100 benchmark datasets show that the proposed tracking method achieves better tracking robustness under fast motion, similar-object interference, and complex backgrounds.
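
    The three-stage pipeline described in the abstract can be illustrated with a minimal Python sketch. The helpers predict_candidates, deep_score_fn, and color_hist_fn below are hypothetical stand-ins for the paper's location model, trained deep-feature classifier, and conventional color features; the sampling parameters are illustrative and not taken from the paper.

```python
import numpy as np

def predict_candidates(prev_box, num_candidates=256, shift_std=10.0):
    """Location step (sketch): sample candidate boxes around the target
    location from the previous frame with small Gaussian perturbations."""
    cx, cy, w, h = prev_box
    candidates = []
    for _ in range(num_candidates):
        dx, dy = np.random.normal(0.0, shift_std, size=2)
        sw, sh = np.random.normal(1.0, 0.05, size=2)
        candidates.append((cx + dx, cy + dy, w * sw, h * sh))
    return candidates

def track_frame(frame, prev_box, deep_score_fn, color_hist_fn,
                target_hist, n_best=5):
    """One tracking step of the location-classification-matching idea.

    deep_score_fn(frame, box) -> float and color_hist_fn(frame, box) -> array
    are assumed, hypothetical feature functions.
    """
    # 1. Location: predict candidate regions from the previous location.
    candidates = predict_candidates(prev_box)

    # 2. Classification: inter-class screening with deep features;
    #    keep the N top-scoring (sub-optimal) target regions.
    scores = np.array([deep_score_fn(frame, box) for box in candidates])
    top_idx = np.argsort(scores)[::-1][:n_best]

    # 3. Matching: intra-class optimization with color features;
    #    choose the region whose color histogram is closest to the target model.
    best_box, best_sim = prev_box, -np.inf
    for i in top_idx:
        sim = -np.linalg.norm(color_hist_fn(frame, candidates[i]) - target_hist)
        if sim > best_sim:
            best_box, best_sim = candidates[i], sim
    return best_box
```

    In this sketch the online update of the location and classification models is omitted; in the paper the two networks are updated separately and the target model is refreshed online and in real time.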