• Opto-Electronic Engineering
  • Vol. 46, Issue 7, 180420 (2019)
Chang Xin1,2, Chen Xiaodong1,2,*, Zhang Jiachen1,2, Wang Yi1,2, and Cai Huaiyu1,2
Author Affiliations
  • 1[in Chinese]
  • 2[in Chinese]
    DOI: 10.12086/oee.2019.180420
    Chang Xin, Chen Xiaodong, Zhang Jiachen, Wang Yi, Cai Huaiyu. An object detection and tracking algorithm based on LiDAR and camera information fusion[J]. Opto-Electronic Engineering, 2019, 46(7): 180420

    Abstract

    As an important part of an intelligent vehicle, the environmental perception system detects the vehicle's surroundings through sensors mounted on the vehicle. To ensure the accuracy and stability of this system, the vehicle's sensors must detect and track objects in the passable area. This paper proposes an object detection and tracking algorithm based on the fusion of LiDAR and camera information. The algorithm clusters the LiDAR point cloud to detect objects in the passable area and projects them onto the image to determine the objects to track. Once the objects are determined, they are tracked in the image sequence using color information. Because image-based tracking is easily affected by illumination changes, shadows, and background interference, the algorithm uses the LiDAR point cloud to correct the tracking results. The algorithm is verified on the KITTI dataset; experiments show that the average region overlap of the proposed detection and tracking algorithm is 83.10% and the tracking success rate is 80.57%. Compared with the particle filter algorithm, the average region overlap is increased by 29.47% and the tracking success rate by 19.96%.
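    The detection-to-tracking handoff described above depends on projecting clustered LiDAR points into the camera image. The sketch below illustrates the standard KITTI projection chain (pixel = P2 · R0_rect · Tr_velo_to_cam · point) as one possible realization; it is not the authors' implementation, and the matrix names (following KITTI calibration files) and the 0.1 m depth cutoff are assumptions.

        import numpy as np

        def project_lidar_to_image(points_xyz, Tr_velo_to_cam, R0_rect, P2):
            """Project LiDAR points (N, 3) into pixel coordinates.

            Tr_velo_to_cam : (3, 4) LiDAR -> camera extrinsic matrix
            R0_rect        : (3, 3) rectifying rotation
            P2             : (3, 4) camera projection matrix
            """
            n = points_xyz.shape[0]
            # Homogeneous LiDAR coordinates, shape (N, 4)
            pts_h = np.hstack([points_xyz, np.ones((n, 1))])
            # LiDAR frame -> rectified camera frame, shape (3, N)
            cam = R0_rect @ (Tr_velo_to_cam @ pts_h.T)
            # Keep only points in front of the camera (assumed 0.1 m cutoff)
            in_front = cam[2, :] > 0.1
            cam = cam[:, in_front]
            # Rectified camera frame -> image plane
            cam_h = np.vstack([cam, np.ones((1, cam.shape[1]))])
            pix = P2 @ cam_h
            pix = pix[:2, :] / pix[2:3, :]  # perspective division by depth
            return pix.T, in_front

    The reported evaluation metrics, region overlap and tracking success rate, can be computed per frame as sketched below. The abstract does not state the exact definitions, so the overlap is assumed here to be intersection-over-union of predicted and ground-truth boxes, and the 0.5 success threshold is likewise an assumed value.

        def region_overlap(box_a, box_b):
            """Assumed IoU between two axis-aligned boxes (x1, y1, x2, y2)."""
            ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
            ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
            inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
            area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
            area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
            union = area_a + area_b - inter
            return inter / union if union > 0 else 0.0

        def tracking_success_rate(pred_boxes, gt_boxes, threshold=0.5):
            """Fraction of frames whose overlap exceeds the assumed threshold."""
            overlaps = [region_overlap(p, g) for p, g in zip(pred_boxes, gt_boxes)]
            return sum(o >= threshold for o in overlaps) / len(overlaps)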