• Journal of Infrared and Millimeter Waves
  • Vol. 35, Issue 4, 496 (2016)
ZHENG Chao1,2,3,*, CHEN Jie4, YANG Xing1,2,3, YIN Song-Feng1,2,3, and FENG Yun-Song1,2,3
Author Affiliations
  • 1[in Chinese]
  • 2[in Chinese]
  • 3[in Chinese]
  • 4[in Chinese]
    DOI: 10.11972/j.issn.1001-9014.2016.04.019
    ZHENG Chao, CHEN Jie, YANG Xing, YIN Song-Feng, FENG Yun-Song. Adaptive fusion tracking based on optimized co-training framework[J]. Journal of Infrared and Millimeter Waves, 2016, 35(4): 496

    Abstract

    Analytical fusion tracking algorithms based on visible and infrared images often lack robustness in complex environments. To address this, a novel adaptive analytical fusion tracking algorithm based on an optimized co-training framework is proposed. First, the most discriminative weak classifiers are selected from weak-classifier pools built on the infrared and visible images respectively, using weighted multiple-instance-learning (MIL) boosting, which alleviates the loss of discriminative capacity caused by mislabeled positive samples. Then, the classifiers' sample bags are updated according to the co-training criterion, aided by an adaptive prior-knowledge import strategy. Finally, the efficiency of the proposed algorithm is analyzed with an error model. Comparative experiments on multiple tracking sequences show how each component of the proposed algorithm contributes to tracking robustness, and demonstrate that it is more robust than state-of-the-art trackers based on a single image source or on other fusion schemes.
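    The weak-classifier selection step described above can be illustrated with a minimal Python sketch in the spirit of weighted MIL boosting. The feature responses, instance-weighting scheme, and greedy bag log-likelihood criterion below are assumptions for illustration, not the paper's exact formulation.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def select_weak_classifiers(pos_feats, neg_feats, pos_weights, n_select):
        """Greedy weighted-MIL selection of weak classifiers (illustrative sketch).

        pos_feats   : (n_pos, n_features) candidate weak-classifier responses
                      on instances of the positive bag (around the target)
        neg_feats   : (n_neg, n_features) responses on negative instances,
                      each treated as its own negative bag
        pos_weights : (n_pos,) instance weights, e.g. decreasing with distance
                      from the estimated target centre (assumed scheme)
        n_select    : number of weak classifiers to keep
        """
        n_features = pos_feats.shape[1]
        H_pos = np.zeros(pos_feats.shape[0])   # running strong-classifier response
        H_neg = np.zeros(neg_feats.shape[0])
        selected = []

        for _ in range(n_select):
            best_k, best_ll = None, -np.inf
            for k in range(n_features):
                if k in selected:
                    continue
                p_pos = sigmoid(H_pos + pos_feats[:, k])
                p_neg = sigmoid(H_neg + neg_feats[:, k])
                # weighted Noisy-OR probability of the positive bag
                bag_p = 1.0 - np.prod((1.0 - p_pos) ** pos_weights)
                # bag log-likelihood: positive bag should score high,
                # every negative instance should score low
                ll = np.log(bag_p + 1e-12) + np.sum(np.log(1.0 - p_neg + 1e-12))
                if ll > best_ll:
                    best_ll, best_k = ll, k
            selected.append(best_k)
            H_pos += pos_feats[:, best_k]
            H_neg += neg_feats[:, best_k]

        return selected
    ```

    In a co-training fusion scheme of the kind the abstract describes, a selection routine like this would be run once per modality (infrared and visible), and the two resulting classifiers would then label each other's new samples when the bags are updated.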