• Laser & Optoelectronics Progress
  • Vol. 56, Issue 22, 221503 (2019)
Yueyang Yu1,2,3,4,5,*, Zelin Shi1,2,3,4,5, and Yunpeng Liu2,3,4,5
Author Affiliations
  • 1School of Information Science and Technology, University of Science and Technology of China, Hefei, Anhui 230026, China
  • 2Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, Liaoning 110016, China
  • 3Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, Liaoning 110016, China
  • 4Key Laboratory of Opto-Electronic Information Processing, Chinese Academy of Sciences, Shenyang, Liaoning 110016, China
  • 5Key Laboratory of Image Understanding and Computer Vision, Shenyang, Liaoning 110016, China
    DOI: 10.3788/LOP56.221503
    Yueyang Yu, Zelin Shi, Yunpeng Liu. Foreground-Aware Based Spatiotemporal Correlation Filter Tracking Algorithm[J]. Laser & Optoelectronics Progress, 2019, 56(22): 221503

    Abstract

    In this study, we propose a foreground-aware spatiotemporal correlation filter tracking algorithm, built on the spatially regularized discriminative correlation filter (SRDCF), to address long-term tracking failures caused by background clutter, occlusion, and out-of-view objects. First, a foreground-aware correlation filtering algorithm is proposed to accurately distinguish the object foreground from the background. The foreground-aware filter is then incorporated into a temporal regularization term, which constrains the spatiotemporally regularized filter to a low-dimensional discriminative manifold. The filter is solved with the alternating direction method of multipliers (ADMM), allowing the tracker to run in real time with traditional hand-crafted features. Finally, an activation threshold for the object re-detector is determined, and a candidate-region method combined with correlation filtering is used for re-detection, enabling long-term tracking. Experiments on the OTB2013 benchmark with traditional and convolutional features show that the average tracking success rates are 5.6% and 7% higher, respectively, than those of SRDCF. The proposed approach is therefore robust to background blur, rotation, occlusion, and out-of-view objects.
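    As a rough illustration of the objective described above (a sketch based on the standard SRDCF loss with an added temporal term; the spatial weight $w$, the regularization constants $\lambda$ and $\mu$, and the exact form of the foreground-aware filter $\tilde{f}_{t-1}$ are assumptions and may differ from the paper), the spatiotemporally regularized filter at frame $t$ can be written as
    \[
    \min_{f_t}\;\frac{1}{2}\Bigl\|\sum_{d=1}^{D} x_t^{d} * f_t^{d} - y\Bigr\|^{2}
    +\frac{\lambda}{2}\sum_{d=1}^{D}\bigl\|w\odot f_t^{d}\bigr\|^{2}
    +\frac{\mu}{2}\bigl\|f_t-\tilde{f}_{t-1}\bigr\|^{2},
    \]
    where $x_t$ denotes the multi-channel training sample, $y$ the desired Gaussian response, $w$ the SRDCF spatial regularization weight, and $\tilde{f}_{t-1}$ the foreground-aware filter from the previous frame. Splitting the data and spatial terms from the temporal term via an auxiliary variable gives ADMM subproblems that can each be updated efficiently in the Fourier domain, which is consistent with the real-time operation reported for hand-crafted features.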