Author Affiliations
1 College of Electronic Countermeasures, National University of Defense Technology, Hefei, Anhui 230037, China
2 State Key Laboratory of Pulsed Power Laser Technology, Hefei, Anhui 230037, China
3 South-West Electron and Telecom Technology Institute, Chengdu, Sichuan 610041, China
Fig. 1. Tracking drift using several classic tracking methods. (a) Algorithm in Ref. [9]; (b) algorithm in Ref. [14]; (c) algorithm in Ref. [15]; (d) algorithm in Ref. [16]
Fig. 2. Infrared image dataset (examples). (a) Bicycle; (b) bus; (c) car; (d) motorbike; (e) pedestrian
Fig. 3. Training loss versus number of iterations
Fig. 4. Test accuracy versus number of iterations
Fig. 5. Decision-level fusion tracking for infrared and visible spectra based on deep learning
Fig. 6. Process of decision-level fusion tracking
Fig. 7. Tracking results (frame sequence numbers 1, 15, 32, 55, 129, 143, 162). (a) Infrared tracking; (b) visible tracking; (c) fusion tracking of infrared and visible
Fig. 8. Comparison of overlap score between dual-band fusion and single-band tracking
Fig. 9. Comparison of centre location error between dual-band fusion and single-band tracking
| mAP | AP (Bicycle) | AP (Bus) | AP (Car) | AP (Motorbike) | AP (Person) |
|---|---|---|---|---|---|
| 0.823 | 0.788 | 0.896 | 0.864 | 0.806 | 0.758 |

Table 1. mAP of five classes on infrared test datasets
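By the usual definition, the mAP in Table 1 is the arithmetic mean of the per-class AP values. A minimal sketch of that check, using the figures from the table (agreement with the reported 0.823 is within rounding):

```python
# Per-class average precision (AP) values from Table 1.
ap = {
    "bicycle": 0.788,
    "bus": 0.896,
    "car": 0.864,
    "motorbike": 0.806,
    "person": 0.758,
}

# mAP is the arithmetic mean of the per-class APs.
map_score = sum(ap.values()) / len(ap)
print(f"mAP = {map_score:.4f}")  # within 0.001 of the reported 0.823
```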
| Performance | Visible tracking | Infrared tracking | Fusion tracking |
|---|---|---|---|
| Average overlap score /% | 61.8 | 66.4 | 74.0 |
| Average centre location error /pixel | 14.6 | 8.8 | 4.0 |
| Target loss rate | 0.23 | 0.11 | 0 |

Table 2. Performance comparison between dual-band fusion tracking and single-band tracking
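The overlap score and centre location error compared in Figs. 8 and 9 and in Table 2 are the standard tracking metrics: intersection-over-union between the predicted and ground-truth bounding boxes, and the Euclidean pixel distance between their centres. A minimal sketch, assuming boxes are given as (x, y, w, h) tuples (this representation is an assumption, not stated in the source):

```python
import math

def overlap_score(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Width and height of the intersection rectangle (zero if disjoint).
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def centre_location_error(box_a, box_b):
    """Euclidean distance in pixels between the two box centres."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    return math.hypot((ax + aw / 2) - (bx + bw / 2),
                      (ay + ah / 2) - (by + bh / 2))

pred = (10, 10, 20, 20)   # hypothetical predicted box
gt = (15, 10, 20, 20)     # hypothetical ground-truth box
print(overlap_score(pred, gt))          # → 0.6
print(centre_location_error(pred, gt))  # → 5.0
```

In Table 2 the overlap score is averaged over the sequence and reported as a percentage; a frame is typically counted toward the target loss rate when the overlap falls to zero.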