• Infrared and Laser Engineering
  • Vol. 50, Issue 12, 20210130 (2021)
Jing Wang1, Liang Wei1, Wenhao Xiang2, Guiyang Zhang3, and Ju Huo1,*
Author Affiliations
  • 1School of Electrical Engineering and Automation, Harbin Institute of Technology, Harbin 150001, China
  • 2Systems Engineering Research Institute, Beijing 100094, China
  • 3School of Astronautics, Harbin Institute of Technology, Harbin 150001, China
DOI: 10.3788/IRLA20210130
Jing Wang, Liang Wei, Wenhao Xiang, Guiyang Zhang, Ju Huo. High-precision camera calibration method considering projected circular edge blur and eccentricity error[J]. Infrared and Laser Engineering, 2021, 50(12): 20210130
Fig. 1. Ideal step edge model
Fig. 2. Grayscale distribution of projected ellipse
Fig. 3. Two-dimensional representation of edge gray distribution model
Fig. 4. Spatial circular projection model
Fig. 5. Projection ellipse fitting algorithm
Fig. 6. Simulative image
Fig. 7. Error distribution with white noise ($\sigma^2 = 0.002$)
Fig. 8. Edge and center localization based on improved Zernike moments
Fig. 9. Image acquisition equipment of the stereo camera system
Fig. 10. Extraction and coding of circular mark points
Fig. 11. Target image acquisition and center extraction
Fig. 12. Stereo vision system calibration results
Fig. 13. Calibration rod
| Locate method | No noise: mean error/pixel | No noise: variance/pixel² | White noise ($\sigma^2$=0.002): mean error/pixel | White noise ($\sigma^2$=0.002): variance/pixel² | White noise ($\sigma^2$=0.004): mean error/pixel | White noise ($\sigma^2$=0.004): variance/pixel² |
|---|---|---|---|---|---|---|
| Gray moment | 0.0666 | 0.0172 | 0.0807 | 0.0232 | 0.1218 | 0.1979 |
| Fit method | 0.0727 | 0.0363 | 0.0821 | 0.0443 | 0.3277 | 0.1843 |
| Zernike moment | 0.0501 | 0.0198 | 0.0867 | 0.0307 | 0.1064 | 0.2512 |
| Proposed method | 0.0469 | 0.0130 | 0.0564 | 0.0312 | 0.0687 | 0.0866 |
Table 1. Edge location accuracy of simulative image
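Tables 1 and 2 evaluate the locators on the simulative image of Fig. 6 under white noise with variance 0.002 and 0.004. A minimal sketch of that noise setting, assuming zero-mean additive Gaussian noise on an intensity image normalized to [0, 1]; the 512×512 size and the synthetic ellipse below are illustrative stand-ins, not the authors' test image:

```python
import numpy as np

def add_white_noise(image, variance, rng=None):
    """Add zero-mean Gaussian white noise with the given variance to a [0, 1] image."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = image + rng.normal(0.0, np.sqrt(variance), size=image.shape)
    return np.clip(noisy, 0.0, 1.0)  # keep intensities in the valid range

# Illustrative bright ellipse on a dark background (stand-in for the simulative image)
h, w = 512, 512
y, x = np.mgrid[0:h, 0:w]
image = ((((x - 256) / 120.0) ** 2 + ((y - 256) / 80.0) ** 2) <= 1.0).astype(float)

noisy_002 = add_white_noise(image, 0.002)  # noise level used in Tables 1-2 and Fig. 7
noisy_004 = add_white_noise(image, 0.004)
```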
| Locate method | No noise: fitting center | No noise: error/pixel | White noise ($\sigma^2$=0.002): fitting center | White noise ($\sigma^2$=0.002): error/pixel | White noise ($\sigma^2$=0.004): fitting center | White noise ($\sigma^2$=0.004): error/pixel |
|---|---|---|---|---|---|---|
| Gray moment | (256.034, 256.025) | 0.0422 | (256.024, 255.951) | 0.0546 | (256.031, 256.056) | 0.0640 |
| Fit method | (255.994, 256.041) | 0.0417 | (255.967, 256.165) | 0.1683 | (256.226, 256.186) | 0.2926 |
| Zernike moment | (255.964, 256.022) | 0.0414 | (255.959, 256.014) | 0.0430 | (255.95, 256.012) | 0.0511 |
| Proposed method | (256.027, 255.991) | 0.0289 | (256.025, 255.979) | 0.0326 | (256.031, 255.986) | 0.0339 |
Table 2. Center location accuracy of simulative image
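The Error/pixel values in Table 2 are consistent with the Euclidean distance between the fitted center and the true center of the simulative ellipse, which the data place at (256, 256) (an inference from the table, not stated in this excerpt). A quick check:

```python
import math

TRUE_CENTER = (256.0, 256.0)  # nominal center of the simulative ellipse, inferred from Table 2

def center_error(fitted, true=TRUE_CENTER):
    """Euclidean distance between a fitted center and the nominal center, in pixels."""
    return math.hypot(fitted[0] - true[0], fitted[1] - true[1])

# Two entries from Table 2: gray moment under no noise and under sigma^2 = 0.002
print(round(center_error((256.034, 256.025)), 4))  # 0.0422
print(round(center_error((256.024, 255.951)), 4))  # 0.0546
```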
Left camera's intrinsics: $f_x = 3051.9034$, $f_y = 3051.3490$, $u_0 = 836.1690$, $v_0 = 910.3853$, $k_1 = -0.1769$, $k_2 = 0.1894$, $s = 0$
Right camera's intrinsics: $f_x = 3085.8341$, $f_y = 3087.4186$, $u_0 = 812.2274$, $v_0 = 859.9304$, $k_1 = -0.1820$, $k_2 = 0.3759$, $s = 0$
Extrinsics: $R = \left[ \begin{array}{ccc} 0.9653 & 0.0062 & -0.2611 \\ -0.0402 & 0.9913 & -0.1251 \\ 0.2581 & 0.1312 & 0.9572 \end{array} \right]$, $T = \left[ \begin{array}{ccc} -363.0015 & 14.6250 & 86.0826 \end{array} \right]$
Table 3. Camera intrinsic parameters and binocular pose relationship
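A minimal sketch of how the Table 3 values assemble into stereo projection matrices, assuming the standard pinhole intrinsic matrix $K = \left[ \begin{array}{ccc} f_x & s & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{array} \right]$, that $(R, T)$ maps left-camera coordinates to right-camera coordinates, and that $T$ is in millimetres; the radial distortion coefficients $k_1$, $k_2$ are listed but not applied here:

```python
import numpy as np

def intrinsic_matrix(fx, fy, u0, v0, s=0.0):
    """Pinhole intrinsic matrix (assumed convention)."""
    return np.array([[fx,  s,   u0],
                     [0.0, fy,  v0],
                     [0.0, 0.0, 1.0]])

# Intrinsics from Table 3 (radial distortion k1, k2 omitted in this sketch)
K_left = intrinsic_matrix(3051.9034, 3051.3490, 836.1690, 910.3853)
K_right = intrinsic_matrix(3085.8341, 3087.4186, 812.2274, 859.9304)

# Extrinsics from Table 3, assumed to map left-camera coordinates to right-camera coordinates
R = np.array([[ 0.9653, 0.0062, -0.2611],
              [-0.0402, 0.9913, -0.1251],
              [ 0.2581, 0.1312,  0.9572]])
T = np.array([-363.0015, 14.6250, 86.0826])  # assumed to be in millimetres

# Projection matrices with the left camera as the world frame (for linear triangulation)
P_left = K_left @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K_right @ np.hstack([R, T.reshape(3, 1)])

print(f"approximate baseline: {np.linalg.norm(T):.1f} mm")
```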
| Method | Fitting center A | Fitting center B | Length/mm | Error/mm |
|---|---|---|---|---|
| Gray moment | (317.588, −237.967, 1250.752) | (−223.332, 9.369, 1172.507) | 599.9098 | 0.2642 |
| Fit method | (317.607, −237.573, 1250.357) | (−223.792, 9.302, 1173.957) | 599.9142 | 0.2598 |
| Zernike moment | (317.624, −237.219, 1251.199) | (−223.726, 9.496, 1173.864) | 599.9240 | 0.2500 |
| Hough transform | (317.701, −237.186, 1250.309) | (−223.912, 9.357, 1174.357) | 599.9147 | 0.2593 |
| Proposed method | (317.486, −238.568, 1249.487) | (−223.325, 9.329, 1171.598) | 599.9968 | 0.1772 |
Table 4. Calibration rod measurement accuracy
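The Length/mm column in Table 4 matches the Euclidean distance between the two reconstructed centers A and B; how the Error/mm column is computed is not given in this excerpt. A quick check of the proposed method's row:

```python
import numpy as np

def rod_length(center_a, center_b):
    """Euclidean distance between the two reconstructed rod-end centers, in mm."""
    return float(np.linalg.norm(np.asarray(center_a) - np.asarray(center_b)))

# Proposed method row of Table 4
A = (317.486, -238.568, 1249.487)
B = (-223.325, 9.329, 1171.598)
print(round(rod_length(A, B), 4))  # 599.9968
```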