Chinese Journal of Lasers, Vol. 48, Issue 23, 2304004 (2021)
Fan Yang1,2, Bin Liu1,2,*, Lu Chu1,2, Yan Chi3, Fangfang Han1,2, Na Liu1,2, and Baofeng Zhang1,2
Author Affiliations
  • 1School of Electrical and Electronic Engineering, Tianjin University of Technology, Tianjin 300384, China
  • 2Tianjin Key Laboratory for Control Theory & Applications in Complicated Systems, Tianjin 300384, China
  • 3School of Economics and Management, Tianjin Chengjian University, Tianjin 300384, China
    DOI: 10.3788/CJL202148.2304004
    Fan Yang, Bin Liu, Lu Chu, Yan Chi, Fangfang Han, Na Liu, Baofeng Zhang. Binocular Measurement Method Using Grid Structured Light[J]. Chinese Journal of Lasers, 2021, 48(23): 2304004

    Abstract

    Objective By projecting structured light onto the object surface, feature points for binocular stereo matching can be easily extracted to realize three-dimensional (3D) measurement. Accurate extraction of the feature points is critical to the precision of 3D measurement. Encoded structured light patterns are usually used to guarantee the extraction accuracy of the feature points. In this case, the encoding scheme requires a specially designed structured light pattern, and a corresponding decoding method must be implemented. However, grid structured light has distinct intersection features, which means that encoding and decoding are not essential and the extraction of the feature points can be simplified. In this paper, we propose a binocular 3D measurement method that uses the intersections of the grid light pattern as the feature points for stereo matching. An image processing algorithm for extracting the intersections is presented, and stereo matching is then easily accomplished by sorting the feature points according to the grid topology. Displacement measurement experiments were carried out to verify the robustness and accuracy of the proposed method. We hope that our work provides a binocular 3D measurement method with the advantages of high precision, strong robustness, easy deployment, and low cost.
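    Once corresponding grid intersections have been matched between the left and right views, the 3D coordinates follow from standard binocular triangulation. The sketch below is a minimal illustration assuming OpenCV and calibrated projection matrices; the function and variable names are ours, not from the paper.

    ```python
    import numpy as np
    import cv2

    def triangulate(pts_left, pts_right, P_left, P_right):
        """Recover 3D points from matched pixel coordinates.

        pts_left, pts_right: (N, 2) arrays of matched feature points.
        P_left, P_right: (3, 4) projection matrices from stereo calibration.
        """
        pts_l = np.asarray(pts_left, dtype=np.float64).T   # shape (2, N)
        pts_r = np.asarray(pts_right, dtype=np.float64).T
        pts_4d = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)
        return (pts_4d[:3] / pts_4d[3]).T                  # (N, 3) Euclidean points
    ```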

    Methods A novel feature point extraction algorithm was developed to extract the intersections of the grid structured light pattern in the left and right views. The algorithm follows a coarse-to-fine process. First, the regions containing the feature points were located by an improved corner extraction method. Then, the fine pixel coordinates of the feature points were obtained by calculating the intersections of the grid lines crossing within these regions. Furthermore, the maximum measurable depth of the system was analyzed. According to the working characteristics of the system, a method for determining the topological relationship of the feature points was presented, which remains valid even when some points are missing in the left or right view due to occlusion.
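    As a rough illustration of the coarse stage only (the paper's improved corner extraction is not reproduced here), the sketch below uses OpenCV's Shi-Tomasi detector to lock candidate windows around the grid intersections; the window size and detector thresholds are assumptions, not the authors' parameters.

    ```python
    import cv2
    import numpy as np

    def coarse_regions(gray, max_points=200, half_win=15):
        """Return (window origin, image patch) pairs around candidate intersections."""
        corners = cv2.goodFeaturesToTrack(gray, maxCorners=max_points,
                                          qualityLevel=0.05, minDistance=10)
        if corners is None:
            return []
        regions = []
        for cx, cy in corners.reshape(-1, 2):
            x, y = int(round(cx)), int(round(cy))
            x0, y0 = max(x - half_win, 0), max(y - half_win, 0)
            x1 = min(x + half_win, gray.shape[1])
            y1 = min(y + half_win, gray.shape[0])
            regions.append(((x0, y0), gray[y0:y1, x0:x1]))  # origin + local window
        return regions
    ```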

    Results and Discussions To locate the feature point regions, corner extraction was first performed on the whole image. In the presence of noise, the common Harris corner extraction algorithm may fail (Fig. 7), whereas the Shi-Tomasi algorithm produces better results (Fig. 8). However, false or missing corners still occur due to image noise (Fig. 8). A density clustering algorithm was therefore applied to solve this problem. Noise simulation experiments proved that the algorithm eliminates falsely extracted points and groups the corner points accurately (Fig. 9). The corner points grouped by density clustering then determine the region of each feature point [Figs. 10(b) and 10(f)]. The light stripe centers of the horizontal and vertical lines were extracted within each region [Figs. 10(c) and 10(g)], the equations of the crossing lines were fitted through these center points, and the pixel coordinates of each target feature point were calculated as their intersection [Figs. 10(d) and 10(h)]. Furthermore, some feature points may be missing due to occlusion, which complicates stereo matching. The idea of region growing was introduced to determine the relative ordering of the feature points, which effectively avoids the sequential numbering errors caused by missing feature points (Fig. 16). To verify the robustness and accuracy of the proposed method, displacement measurement experiments were carried out. The measurement results were compared with those of a grating ruler with an accuracy of 1 μm; the maximum relative error is 2.20% (Fig. 21).
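    The paper does not specify a particular density clustering implementation; the sketch below uses DBSCAN as a stand-in to show the idea: corner detections that pile up around one grid intersection form a cluster, while isolated false detections are labeled as noise and discarded. The eps and min_samples values are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.cluster import DBSCAN

    def group_corners(corner_xy, eps=8.0, min_samples=3):
        """corner_xy: (N, 2) pixel coordinates of raw Shi-Tomasi corners.

        Returns one representative point (cluster mean) per grid intersection.
        """
        corner_xy = np.asarray(corner_xy, dtype=np.float64)
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(corner_xy)
        groups = {}
        for label, pt in zip(labels, corner_xy):
            if label == -1:            # noise label: isolated false extraction
                continue
            groups.setdefault(label, []).append(pt)
        return [np.mean(pts, axis=0) for pts in groups.values()]
    ```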

    Conclusions A binocular 3D measurement method using grid structured light was proposed in this paper. The feature point extraction algorithm and an effective stereo matching method were studied. First, the grid corner points were extracted using the Shi-Tomasi algorithm. Then, a density clustering algorithm was applied to eliminate falsely extracted points and group the corner points accurately. Each group of corner points defines a region, within which the light stripe centers of the horizontal and vertical lines were extracted. The equations of the crossing lines were fitted through these center points, and the pixel coordinates of each target feature point were calculated as their intersection. Furthermore, a method for determining the topological relationship of the feature points was studied, which remains valid even when some points are missing in the left or right view due to occlusion. In verification experiments, the measurement results of the proposed method were compared with those of a grating ruler with an accuracy of 1 μm; the maximum relative error is 2.20%. In addition, 3D shape measurements of sheet metal parts with different deformations were performed using the proposed method.
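    As an illustration of the fine localization step described above, the sketch below fits a line through each set of stripe center points and takes the feature point as the intersection of the two fitted lines. The total-least-squares fit and the helper names are our assumptions, not necessarily the authors' exact formulation.

    ```python
    import numpy as np

    def fit_line(points):
        """Total-least-squares fit; returns (a, b, c) with a*x + b*y + c = 0."""
        pts = np.asarray(points, dtype=np.float64)
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)
        direction = vt[0]                   # principal direction of the stripe
        a, b = -direction[1], direction[0]  # normal to the line
        c = -(a * centroid[0] + b * centroid[1])
        return a, b, c

    def intersect(line1, line2):
        """Intersection of two lines given as (a, b, c) coefficients."""
        a1, b1, c1 = line1
        a2, b2, c2 = line2
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            return None                     # (near-)parallel lines: no intersection
        x = (b1 * c2 - b2 * c1) / det
        y = (a2 * c1 - a1 * c2) / det
        return x, y
    ```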
