• Laser & Optoelectronics Progress
  • Vol. 59, Issue 16, 1633001 (2022)
Qibo Chen1,2, Baozhen Ge1,2,*, Yunpeng Li1,2, and Jianing Quan1,2
Author Affiliations
  • 1School of Precision Instrument and Opto-Electronics Engineering, Tianjin University, Tianjin 300072, China
  • 2Key Laboratory of Opto-Electronics Information Technology, Ministry of Education, Tianjin University, Tianjin 300072, China
    DOI: 10.3788/LOP202259.1633001
    Qibo Chen, Baozhen Ge, Yunpeng Li, Jianing Quan. Stereo Matching Algorithm Based on Multi-Attention Mechanism[J]. Laser & Optoelectronics Progress, 2022, 59(16): 1633001

    Abstract

    Existing stereo matching algorithms perform poorly in weakly textured, shadowed, and other ill-posed regions. To make full use of scene context information and improve disparity accuracy, this study proposes an effective multi-attention stereo matching algorithm (MAnet). In the feature extraction stage, multiple attention mechanisms, including location-channel attention and multi-head criss-cross attention (MCA), are used to recalibrate the feature channels and selectively aggregate contextual information over arbitrary ranges, providing more discriminative features for matching cost computation. Extending MCA to 3D convolution enlarges the network's receptive field so that a more accurate matching cost can be aggregated. The loss function places extra weight on errors that exceed a threshold, strengthening the network's ability to learn challenging regions. The algorithm is validated experimentally on the KITTI datasets; on the KITTI 2015 test set, MAnet achieves an error rate of 2.06%. The experimental results show that, compared with the benchmark algorithm, MAnet improves disparity accuracy and matching performance in ill-posed regions.
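    The abstract mentions weighting the loss for pixels whose disparity error exceeds a threshold. The sketch below is a minimal PyTorch illustration of that idea, not the paper's actual loss: the function name, the 3-pixel threshold, and the weight value are assumptions chosen for demonstration.

    # Hedged sketch: smooth L1 disparity loss with extra weight on pixels whose
    # absolute error exceeds a threshold, so hard (ill-posed) regions contribute
    # more to the gradient. Names and constants are illustrative, not from the paper.
    import torch
    import torch.nn.functional as F

    def weighted_smooth_l1(pred_disp: torch.Tensor,
                           gt_disp: torch.Tensor,
                           valid_mask: torch.Tensor,
                           error_thresh: float = 3.0,   # assumed threshold (px)
                           hard_weight: float = 2.0) -> torch.Tensor:  # assumed weight
        """pred_disp, gt_disp: (B, H, W) disparity maps; valid_mask: (B, H, W) bool."""
        abs_err = (pred_disp - gt_disp).abs()
        # Weight 1 inside the error threshold, hard_weight outside it.
        weights = torch.where(abs_err > error_thresh,
                              torch.full_like(abs_err, hard_weight),
                              torch.ones_like(abs_err))
        per_pixel = F.smooth_l1_loss(pred_disp, gt_disp, reduction="none")
        return (weights * per_pixel)[valid_mask].mean()

    # Usage example with random tensors standing in for network output and labels.
    if __name__ == "__main__":
        pred = torch.rand(2, 256, 512) * 192
        gt = torch.rand(2, 256, 512) * 192
        valid = gt > 0
        print(weighted_smooth_l1(pred, gt, valid).item())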