Laser & Optoelectronics Progress, Vol. 58, Issue 14, 1415001 (2021)
Hongzhi Du, Teng Zhang, Yanbiao Sun, Linghui Yang, and Jigui Zhu*
Author Affiliations
State Key Laboratory of Precision Measuring Technology and Instruments, Tianjin University, Tianjin 300072, China
DOI: 10.3788/LOP202158.1415001
Hongzhi Du, Teng Zhang, Yanbiao Sun, Linghui Yang, Jigui Zhu. Stereo Matching Method Based on Gated Recurrent Unit Networks[J]. Laser & Optoelectronics Progress, 2021, 58(14): 1415001
Fig. 1. Proposed network structure
Fig. 2. GRU structure
Fig. 3. Stacked GRU structure
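Figs. 2 and 3 show the GRU cell and its stacked arrangement. As a rough illustration only, the sketch below implements a standard convolutional GRU cell and a two-cell stack in PyTorch; the class names, channel counts, and the assumption that the recurrence runs over disparity-indexed cost slices are ours, not taken from the paper.

```python
# Minimal ConvGRU sketch (assumed formulation, not the authors' code).
import torch
import torch.nn as nn

class ConvGRUCell(nn.Module):
    def __init__(self, in_ch, hidden_ch, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        # update gate z and reset gate r computed jointly, candidate state separately
        self.conv_zr = nn.Conv2d(in_ch + hidden_ch, 2 * hidden_ch, kernel_size, padding=pad)
        self.conv_h = nn.Conv2d(in_ch + hidden_ch, hidden_ch, kernel_size, padding=pad)

    def forward(self, x, h):
        zr = torch.sigmoid(self.conv_zr(torch.cat([x, h], dim=1)))
        z, r = zr.chunk(2, dim=1)
        h_tilde = torch.tanh(self.conv_h(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_tilde

class StackedConvGRU(nn.Module):
    """Two ConvGRU cells stacked; the first cell's output feeds the second."""
    def __init__(self, in_ch, hidden_ch):
        super().__init__()
        self.cell1 = ConvGRUCell(in_ch, hidden_ch)
        self.cell2 = ConvGRUCell(hidden_ch, hidden_ch)

    def forward(self, x_seq, h1, h2):
        # x_seq: per-step inputs, e.g. cost-volume slices along the disparity axis
        outputs = []
        for x in x_seq:
            h1 = self.cell1(x, h1)
            h2 = self.cell2(h1, h2)
            outputs.append(h2)
        return outputs, h1, h2
```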
Fig. 4. Result of the disparity estimation. (a) Left images; (b) stacked GRU; (c) proposed method; (d) ground truth
Fig. 5. Result of the disparity estimation on KITTI2015 test dataset. (a) Left image; (b) PSMNet; (c) proposed method
Operation | Layer setting | Output size
input | - | 1/4H×1/4W×64
GRU_1 | K=3×3, C=32 | 1/4H×1/4W×32
GRU_2 | K=3×3, C=32 | 1/4H×1/4W×32
Conv_1 | K=3×3, C=48, S=2 | 1/8H×1/8W×48
Conv_2 | K=3×3, C=64, S=2 | 1/16H×1/16W×64
GRU_3 | K=3×3, C=64 | 1/16H×1/16W×64
Deconv_1 | K=4×4, C=48, S=2 | 1/8H×1/8W×48
add(Conv_1) | K=3×3, C=48, S=1 | 1/8H×1/8W×48
Deconv_2 | K=4×4, C=32, S=2 | 1/4H×1/4W×32
add(Conv_2) | K=3×3, C=32, S=1 | 1/4H×1/4W×32
Conv_3 | K=3×3, C=8, S=1 | 1/4H×1/4W×8
Conv_4 | K=3×3, C=1, S=1 | 1/4H×1/4W×1
    Table 1. Parameters of recurrent aggregation module
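The sketch below is a loose PyTorch rendering of the layer settings in Table 1, reusing the ConvGRUCell sketched after Fig. 3. The padding and stride choices, activation placement, and which feature maps the two add(...) skip connections combine are assumptions read off the output sizes, not the authors' implementation; each call processes one 1/4-resolution cost slice and carries the three GRU hidden states across disparity steps.

```python
# Recurrent aggregation module, roughly following Table 1 (assumptions noted above).
import torch
import torch.nn as nn

class RecurrentAggregation(nn.Module):
    def __init__(self):
        super().__init__()
        self.gru1 = ConvGRUCell(64, 32)                                      # GRU_1: K=3x3, C=32
        self.gru2 = ConvGRUCell(32, 32)                                      # GRU_2: K=3x3, C=32
        self.conv1 = nn.Conv2d(32, 48, 3, stride=2, padding=1)               # Conv_1: down to 1/8
        self.conv2 = nn.Conv2d(48, 64, 3, stride=2, padding=1)               # Conv_2: down to 1/16
        self.gru3 = ConvGRUCell(64, 64)                                      # GRU_3: K=3x3, C=64
        self.deconv1 = nn.ConvTranspose2d(64, 48, 4, stride=2, padding=1)    # Deconv_1: up to 1/8
        self.fuse1 = nn.Conv2d(48, 48, 3, padding=1)                         # add(Conv_1) then 3x3 conv
        self.deconv2 = nn.ConvTranspose2d(48, 32, 4, stride=2, padding=1)    # Deconv_2: up to 1/4
        self.fuse2 = nn.Conv2d(32, 32, 3, padding=1)                         # second skip then 3x3 conv
        self.conv3 = nn.Conv2d(32, 8, 3, padding=1)                          # Conv_3
        self.conv4 = nn.Conv2d(8, 1, 3, padding=1)                           # Conv_4: per-slice cost

    def forward(self, cost_slice, h1, h2, h3):
        h1 = self.gru1(cost_slice, h1)                  # 1/4H x 1/4W x 32
        h2 = self.gru2(h1, h2)                          # 1/4H x 1/4W x 32
        c1 = torch.relu(self.conv1(h2))                 # 1/8H x 1/8W x 48
        c2 = torch.relu(self.conv2(c1))                 # 1/16H x 1/16W x 64
        h3 = self.gru3(c2, h3)                          # 1/16H x 1/16W x 64
        d1 = self.fuse1(self.deconv1(h3) + c1)          # skip connection with Conv_1 features
        d2 = self.fuse2(self.deconv2(d1) + h2)          # second skip, assumed to reuse the 1/4-resolution features
        out = self.conv4(torch.relu(self.conv3(d2)))    # 1/4H x 1/4W x 1
        return out, h1, h2, h3
```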
Module | E_ep /pixel | R1 /% | R3 /%
Stacked GRU | 1.71 | 6.29 | 7.79
Proposed method | 1.21 | 2.01 | 4.99
    Table 2. Comparison of different cost aggregation modules
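For reference, Tables 2 and 3 report the end-point error E_ep (mean absolute disparity error in pixels) and the percentages of pixels whose error exceeds 1 pixel (R1) and 3 pixels (R3). A minimal helper computing these quantities under the standard definitions is sketched below; the validity mask and the max_disp cutoff are generic assumptions, not the paper's evaluation script.

```python
# Standard stereo error metrics (generic sketch, not the authors' evaluation code).
import torch

def disparity_metrics(pred, gt, max_disp=192):
    # evaluate only where the ground-truth disparity is valid
    mask = (gt > 0) & (gt < max_disp)
    abs_err = (pred[mask] - gt[mask]).abs()
    epe = abs_err.mean().item()                        # E_ep in pixels
    r1 = (abs_err > 1).float().mean().item() * 100     # % of pixels off by more than 1 px
    r3 = (abs_err > 3).float().mean().item() * 100     # % of pixels off by more than 3 px
    return epe, r1, r3
```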
Method | E_ep /pixel | Memory /GB | t_run /ms
DispNetC[3] | 1.68 | 1.62 | 18.7
PSMNet[7] | 1.09 | 4.65 | 399.3
GANet[8] | 0.84 | 6.65 | 2251.1
Proposed method | 1.22 | 2.57 | 326.8
    Table 3. Performance evaluation of different methods on Scene Flow test dataset
Method | D1-bg (All) /% | D1-fg (All) /% | D1-all (All) /% | D1-bg (Noc) /% | D1-fg (Noc) /% | D1-all (Noc) /%
DispNetC[3] | 4.32 | 4.41 | 4.34 | 4.11 | 3.72 | 4.05
MADNet[18] | 3.75 | 9.20 | 4.66 | 3.45 | 8.41 | 4.27
CRL[4] | 2.48 | 3.59 | 2.67 | 2.32 | 3.12 | 2.45
FADNet[5] | 2.68 | 3.50 | 2.82 | 2.49 | 3.07 | 2.59
GC-Net[6] | 2.21 | 6.16 | 2.87 | 2.02 | 5.58 | 2.61
PSMNet[7] | 1.86 | 4.62 | 2.32 | 1.71 | 4.31 | 2.14
Proposed method | 2.20 | 4.85 | 2.64 | 1.82 | 4.09 | 2.20
Table 4. Evaluation of different methods on KITTI2015 test dataset
Method | >2 pixel (Noc) | >2 pixel (All) | >3 pixel (Noc) | >3 pixel (All) | >4 pixel (Noc) | >4 pixel (All) | >5 pixel (Noc) | >5 pixel (All) | Avg. error (Noc) | Avg. error (All)
DispNetC | 7.38 | 8.11 | 4.11 | 4.65 | 2.77 | 3.20 | 2.05 | 2.39 | 0.9 | 1.0
FADNet | 3.98 | 4.63 | 2.42 | 2.86 | 1.73 | 2.06 | 1.34 | 1.62 | 0.6 | 0.7
GC-Net | 2.71 | 3.46 | 1.77 | 2.30 | 1.36 | 1.77 | 1.12 | 1.46 | 0.6 | 0.7
PSMNet | 2.44 | 3.01 | 1.49 | 1.89 | 1.12 | 1.42 | 0.90 | 1.15 | 0.5 | 0.6
GANet | 1.89 | 2.50 | 1.19 | 1.60 | 0.91 | 1.23 | 0.76 | 1.02 | 0.4 | 0.5
Proposed method | 2.39 | 3.03 | 1.48 | 1.91 | 1.10 | 1.43 | 0.87 | 1.14 | 0.5 | 0.5
Table 5. Evaluation of different methods on KITTI2012 test dataset (unit: %)