• Laser & Optoelectronics Progress
  • Vol. 59, Issue 2, 0215003 (2022)
Wensong Song, Zonghua Zhang*, Nan Gao, and Zhaozong Meng
Author Affiliations
  • School of Mechanical Engineering, Hebei University of Technology, Tianjin 300130, China
    DOI: 10.3788/LOP202259.0215003
    Wensong Song, Zonghua Zhang, Nan Gao, Zhaozong Meng. Spatial Pose Calibration Method for Lidar and Camera Based on Intensity Information[J]. Laser & Optoelectronics Progress, 2022, 59(2): 0215003

    Abstract

    A fused lidar-camera system can perceive both the geometric dimensions and the color information of an environment and has been widely used in many fields. To fuse the two kinds of information accurately, we propose an extrinsic calibration method for a lidar and a camera based on natural feature points. First, after self-correction of the lidar, a gray-scale image is generated by central projection of the point cloud using the intensity information of the lidar data. Then, the scale-invariant feature transform (SIFT) algorithm is used to extract and match feature points between the projected gray-scale image and the camera image. Finally, a calibration model is established from the matched corresponding points, and the data are optimized to calibrate the extrinsic parameters between the three-dimensional lidar system and the camera system. Experimental results show that the reprojection error from the point cloud to image pixels calculated by this method is 2.3 pixels, which verifies the effectiveness and accuracy of the proposed pose calibration method.
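    The abstract does not include an implementation, but the pipeline it describes (intensity-based gray-scale projection, SIFT matching, extrinsic estimation, reprojection-error check) can be sketched with OpenCV. The Python snippet below is a minimal illustration under assumed inputs: the file names, the intrinsic matrix K, and the RANSAC PnP solve are hypothetical stand-ins, not the authors' exact data or optimization model.

```python
import numpy as np
import cv2

# Assumed inputs (hypothetical): an Nx4 lidar scan [x, y, z, intensity]
# and the camera intrinsic matrix K; file names are placeholders.
points = np.load("lidar_scan.npy")
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 480.0],
              [0.0, 0.0, 1.0]])
camera_img = cv2.imread("camera.png", cv2.IMREAD_GRAYSCALE)

# Step 1: central projection of the point cloud into a gray-scale image,
# using lidar intensity as the pixel value (a simple pinhole model is
# assumed here; the paper's projection may differ in detail).
xyz, intensity = points[:, :3], points[:, 3]
in_front = xyz[:, 2] > 0
xyz, intensity = xyz[in_front], intensity[in_front]
uv = (K @ xyz.T).T
uv = (uv[:, :2] / uv[:, 2:3]).astype(int)

h, w = camera_img.shape
lidar_img = np.zeros((h, w), dtype=np.uint8)
valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
norm_int = cv2.normalize(intensity[valid], None, 0, 255, cv2.NORM_MINMAX)
lidar_img[uv[valid, 1], uv[valid, 0]] = norm_int.ravel().astype(np.uint8)

# Step 2: SIFT feature extraction and matching between the projected
# intensity image and the camera image, filtered with Lowe's ratio test.
sift = cv2.SIFT_create()
kp_l, des_l = sift.detectAndCompute(lidar_img, None)
kp_c, des_c = sift.detectAndCompute(camera_img, None)
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_l, des_c, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]

# Step 3: recover the 3D lidar point behind each matched lidar-image
# keypoint (exact-pixel lookup is a simplification) and estimate the
# extrinsics with a RANSAC PnP solve, a stand-in for the paper's
# calibration model and optimization.
pix_to_xyz = {(int(u), int(v)): p for (u, v), p in zip(uv[valid], xyz[valid])}
obj_pts, img_pts = [], []
for m in good:
    u, v = map(int, np.round(kp_l[m.queryIdx].pt))
    if (u, v) in pix_to_xyz:
        obj_pts.append(pix_to_xyz[(u, v)])
        img_pts.append(kp_c[m.trainIdx].pt)
obj_pts = np.array(obj_pts, dtype=np.float64)
img_pts = np.array(img_pts, dtype=np.float64)

ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj_pts, img_pts, K, None)

# Step 4: mean reprojection error of the estimated extrinsics
# (the paper reports 2.3 pixels for its method).
idx = inliers.ravel()
proj, _ = cv2.projectPoints(obj_pts[idx], rvec, tvec, K, None)
err = np.linalg.norm(proj.reshape(-1, 2) - img_pts[idx], axis=1).mean()
print(f"mean reprojection error: {err:.2f} px")
```

    Note that the virtual camera used for the intensity projection and the mapping from keypoints back to 3D points are simplifications; the paper's own calibration model and data optimization are not reproduced here.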