Laser & Optoelectronics Progress, Vol. 57, Issue 3, 030102 (2020)
Shaojie Chen1,2,*, Zhencai Zhu3, Yonghe Zhang1, Ming Guo1,**, and Shuai Zhi1,***
Author Affiliations
  • 1Innovation Academy for Microsatellite, Chinese Academy of Sciences, Shanghai 201203, China
  • 2University of Chinese Academy of Sciences, Beijing 100049, China
  • 3Innovation Academy for Microsatellites of CAS Key Laboratory, Shanghai 201203, China
    DOI: 10.3788/LOP57.030102
    Shaojie Chen, Zhencai Zhu, Yonghe Zhang, Ming Guo, Shuai Zhi. Extrinsic Calibration for Lidar and Stereo Vision Using 3D Feature Points[J]. Laser & Optoelectronics Progress, 2020, 57(3): 030102

    Abstract

    Lidar and stereo cameras are important environmental sensors for autonomous driving. Calibrating the extrinsic parameters between the two sensors is a prerequisite for fusing their data, yet conventional calibration procedures are complex. This paper proposes a calibration method based on matching pairs of 3D feature points. Two rectangular planks serve as targets: the 3D point clouds of the board edges are extracted in the stereo-vision and lidar coordinate systems, and the corner coordinates are computed from these edges. The Kabsch algorithm then solves for the rigid transformation between the paired feature points, and a clustering step removes outliers from the repeated measurements before the remaining estimates are averaged. In experiments, the method runs on an Nvidia Jetson TX2 embedded development board and produces accurate registration parameters, verifying its feasibility. The registration procedure is simple to execute, supports automatic repeated measurements, and compares favorably with similar methods.
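    The abstract gives no implementation details, so the following is only a minimal Python/NumPy sketch of the Kabsch step it describes: given already-paired 3D corner coordinates from the lidar and stereo frames, recover the rigid transform by SVD of the cross-covariance matrix. The function name `kabsch` and the assumption that correspondences are established beforehand are illustrative, not from the paper.

```python
import numpy as np

def kabsch(P, Q):
    """Estimate the rigid transform (R, t) mapping point set P onto Q.

    P, Q: (N, 3) arrays of paired 3D feature points, e.g. board corners
    in the lidar frame (P) and the stereo-camera frame (Q).
    Returns R (3x3 proper rotation) and t (3-vector) with Q ≈ R @ p + t.
    """
    # Center both point sets on their centroids.
    p_mean = P.mean(axis=0)
    q_mean = Q.mean(axis=0)
    P_c = P - p_mean
    Q_c = Q - q_mean

    # SVD of the 3x3 cross-covariance matrix.
    H = P_c.T @ Q_c
    U, _, Vt = np.linalg.svd(H)

    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    # Translation aligns the centroids under the recovered rotation.
    t = q_mean - R @ p_mean
    return R, t
```

    For the outlier-removal step the abstract only states that a clustering method prunes the repeated measurements before averaging; one simple realization (our assumption, not necessarily the authors' choice) would be to cluster the estimated translation vectors from the repeated trials, keep the dominant cluster, and average the surviving (R, t) estimates, with rotations averaged via quaternions rather than element-wise.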