• Acta Optica Sinica
  • Vol. 37, Issue 12, 1215003 (2017)
Shouchuan Wu1, Haitao Zhao1,*, and Shaoyuan Sun2
Author Affiliations
  • 1 School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China
  • 2 School of Information Science and Technology, Donghua University, Shanghai 201620, China
    DOI: 10.3788/AOS201737.1215003
    Shouchuan Wu, Haitao Zhao, Shaoyuan Sun. Depth Estimation from Monocular Infrared Video Based on Bi-Recursive Convolutional Neural Network[J]. Acta Optica Sinica, 2017, 37(12): 1215003

    Abstract

    For depth estimation from monocular infrared video, a method based on a bi-recursive convolutional neural network (BrCNN) is proposed, which considers both the uniqueness of each single frame and the continuity of the entire infrared video. BrCNN introduces the sequence-information transfer mechanism of the recurrent neural network (RNN) on top of the single-frame features extracted by the convolutional neural network (CNN). BrCNN thus possesses both the feature extraction ability of a CNN, which automatically extracts the local features of each frame in the infrared video, and the sequence-modeling ability of an RNN, which automatically extracts the sequence information contained in each frame and recursively propagates it across frames. By introducing the bi-recursive sequence-information transfer mechanism into depth estimation from monocular infrared video, the features extracted from each frame carry contextual information from the surrounding frames. Experimental results show that BrCNN extracts more expressive features and estimates depth from infrared video more precisely than a traditional CNN, which estimates depth from the features of a single frame alone.
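    The idea described in the abstract, per-frame convolutional features fused with forward and backward recursive passes over the video before depth prediction, can be illustrated with a minimal NumPy sketch. This is not the paper's actual architecture: the convolution kernel, the simple additive recursion `h_t = f_t + alpha * h_{t-1}`, and the untrained linear depth head are all illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def conv_features(frame, kernel):
        # Valid 2-D convolution: a toy stand-in for the CNN feature extractor.
        H, W = frame.shape
        k = kernel.shape[0]
        out = np.zeros((H - k + 1, W - k + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(frame[i:i + k, j:j + k] * kernel)
        return out

    def birecurrent_pass(feats, alpha=0.5):
        # Forward recursion: each frame accumulates context from earlier frames.
        fwd = [feats[0]]
        for f in feats[1:]:
            fwd.append(f + alpha * fwd[-1])
        # Backward recursion: each frame accumulates context from later frames.
        bwd = [feats[-1]]
        for f in reversed(feats[:-1]):
            bwd.append(f + alpha * bwd[-1])
        bwd = bwd[::-1]
        # Fuse both directions so every frame's features carry bidirectional context.
        return [0.5 * (a + b) for a, b in zip(fwd, bwd)]

    # Toy "infrared video": 4 frames of 8x8 intensities (random placeholder data).
    video = [rng.random((8, 8)) for _ in range(4)]
    kernel = np.ones((3, 3)) / 9.0  # hypothetical averaging filter

    feats = [conv_features(fr, kernel) for fr in video]       # per-frame CNN features
    ctx = birecurrent_pass(feats)                             # bi-recursive context fusion
    depth_maps = [1.0 + 2.0 * c for c in ctx]                 # untrained linear depth head
    ```

    In the actual method the recursion would operate on learned feature maps and the depth head would be trained end-to-end; the sketch only shows how each frame's prediction comes to depend on both past and future frames.
    
    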