• Laser & Optoelectronics Progress
  • Vol. 58, Issue 14, 1415001 (2021)
Hongzhi Du, Teng Zhang, Yanbiao Sun, Linghui Yang, and Jigui Zhu*
Author Affiliations
  • State Key Laboratory of Precision Measuring Technology and Instruments, Tianjin University, Tianjin 300072, China
    DOI: 10.3788/LOP202158.1415001
    Hongzhi Du, Teng Zhang, Yanbiao Sun, Linghui Yang, Jigui Zhu. Stereo Matching Method Based on Gated Recurrent Unit Networks[J]. Laser & Optoelectronics Progress, 2021, 58(14): 1415001

    Abstract

    Deep-learning stereo matching methods based on three-dimensional convolutional neural networks (3D CNNs) are fundamental for obtaining accurate disparity results. The main concern with this approach is the high demand for computational resources required to achieve high accuracy. To perform stereo matching at a low computational cost, a method based on a gated recurrent unit network is proposed herein. The proposed method performs cost aggregation by replacing the 3D convolution with a gated recurrent unit structure, reducing the network's computational resource requirements by exploiting the characteristics of the recurrent structure. To ensure high disparity estimation accuracy in weakly textured and occluded regions, the proposed method includes an encoder-decoder architecture that further enlarges the receptive field in the 3D matching-cost space and effectively aggregates contextual information from multiscale matching costs using residual connections. The proposed method was evaluated on the KITTI2015 and Scene Flow datasets. Experimental results demonstrate that its accuracy is close to that of 3D convolutional stereo matching methods while reducing video memory consumption by 45% and running time by 18%, greatly alleviating the computational burden of stereo matching.
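    The core idea described in the abstract, sweeping a recurrent cell across the disparity dimension of the cost volume instead of stacking 3D convolutions, can be illustrated with a minimal sketch. This is not the authors' implementation: the class and function names, feature dimensions, and random placeholder weights below are all assumptions chosen for illustration; a real network would learn these weights and operate on full H×W×D cost volumes.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class GRUCell:
        """Minimal GRU cell (hypothetical sketch, not the paper's network).

        Each input x is one flattened cost-volume slice; the hidden state h
        carries aggregated cost information across disparity levels.
        """
        def __init__(self, dim, rng):
            # Illustrative random weights mapping concatenated [h, x] to gates.
            self.Wz = rng.standard_normal((dim, 2 * dim)) * 0.1  # update gate
            self.Wr = rng.standard_normal((dim, 2 * dim)) * 0.1  # reset gate
            self.Wh = rng.standard_normal((dim, 2 * dim)) * 0.1  # candidate state

        def step(self, h, x):
            hx = np.concatenate([h, x])
            z = sigmoid(self.Wz @ hx)                       # how much to update
            r = sigmoid(self.Wr @ hx)                       # how much history to keep
            h_tilde = np.tanh(self.Wh @ np.concatenate([r * h, x]))
            return (1 - z) * h + z * h_tilde                # blended hidden state

    def aggregate_costs(cost_slices, cell):
        """Sweep one GRU cell over disparity-level cost slices.

        Reusing a single recurrent cell is what lets this scheme trade the
        memory footprint of stacked 3D convolutions for sequential steps.
        """
        h = np.zeros(cost_slices.shape[1])
        for x in cost_slices:                               # one slice per disparity level
            h = cell.step(h, x)
        return h

    rng = np.random.default_rng(0)
    dim, num_disparities = 8, 16                            # toy sizes for illustration
    cell = GRUCell(dim, rng)
    cost_volume = rng.standard_normal((num_disparities, dim))
    aggregated = aggregate_costs(cost_volume, cell)
    print(aggregated.shape)                                 # one aggregated feature vector
    ```

    Note that only one set of cell weights is held in memory regardless of the number of disparity levels, which is the intuition behind the memory savings the abstract reports relative to a 3D-convolution cost-aggregation stack.
    
    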