Multi-Task Learning Tracking Method Based on the Similarity of Dynamic Samples
Laser & Optoelectronics Progress, Vol. 58, Issue 16, 1615002 (2021)
Zaifeng Shi1,*, Cheng Sun1,**, Qingjie Cao2, Zhe Wang1, and Qiangqiang Fan1
Author Affiliations
  • 1School of Microelectronics, Tianjin University, Tianjin 300072, China
  • 2School of Mathematical Sciences, Tianjin Normal University, Tianjin 300072, China
DOI: 10.3788/LOP202158.1615002
Citation: Zaifeng Shi, Cheng Sun, Qingjie Cao, Zhe Wang, Qiangqiang Fan. Multi-Task Learning Tracking Method Based on the Similarity of Dynamic Samples[J]. Laser & Optoelectronics Progress, 2021, 58(16): 1615002

    Abstract

To address the drift that occurs when noisy samples interfere with online-updating tracking methods, a method suited to long-term tracking is proposed that combines a multi-task learning training mode with a loss-detection step in the tracking process. The method continuously collects the target's appearance during tracking to build a dynamic sample set and detects target loss from sample similarity, reducing the tracker's learning of noisy samples; a dynamic threshold adapts the detection to different targets. To let the tracker build a complete model of the target's appearance, short-term and long-term memory subtasks are trained jointly. After the target is lost, the redetection step proposes regions based on the target's regional outline features and scale information to improve redetection quality. Evaluated on the object-tracking datasets OTB-2015 and VOT-2016, the tracker achieves an accuracy of 90.8% and a success rate of 68.1%. Experimental results show that the proposed method tracks targets effectively in complex scenes, such as those with occlusion.
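The similarity-based loss detection described above can be sketched roughly as follows. This is a minimal illustration only, not the paper's actual implementation: the feature vectors, set capacity, cosine-similarity measure, and the ratio-of-running-mean threshold rule are all placeholder assumptions standing in for the method's learned appearance features and dynamic threshold.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two appearance-feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

class LossDetector:
    """Dynamic sample set with similarity-based target-loss detection.

    Recent target appearances are stored; a new frame is compared against
    them, and the frame is flagged as 'target lost' when its best similarity
    falls below a threshold that adapts to this target's own history.
    """

    def __init__(self, capacity=10, ratio=0.6):
        self.samples = []      # dynamic sample set of appearance features
        self.sim_history = []  # similarities of frames accepted so far
        self.capacity = capacity
        self.ratio = ratio     # fraction of the running mean defining the threshold

    def update(self, feat):
        """Process one frame's feature; return True if the target is lost."""
        feat = np.asarray(feat, dtype=float)
        if not self.samples:           # first frame initializes the set
            self.samples.append(feat)
            return False
        sim = max(cosine_sim(feat, s) for s in self.samples)
        # Dynamic threshold: a fixed fraction of this target's mean similarity.
        threshold = self.ratio * np.mean(self.sim_history) if self.sim_history else 0.0
        if sim < threshold:
            return True                # likely noisy sample: do not learn from it
        self.sim_history.append(sim)   # accept the sample into the set
        self.samples.append(feat)
        if len(self.samples) > self.capacity:
            self.samples.pop(0)        # keep only the most recent appearances
        return False
```

Skipping the model update on flagged frames is what keeps noisy samples out of the online learner; redetection would then take over until the target is found again.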