Laser & Optoelectronics Progress, Vol. 59, Issue 10, 1010009 (2022)
Sen Xiang1,2,*, Nanting Huang1,2, Huiping Deng1,2, and Jin Wu1,2
Author Affiliations
  • 1School of Information Science and Engineering, Wuhan University of Science and Technology, Wuhan 430081, Hubei, China
  • 2Engineering Research Center for Metallurgical Automation and Measurement Technology, Ministry of Education, Wuhan University of Science and Technology, Wuhan 430081, Hubei, China
    DOI: 10.3788/LOP202259.1010009
    Sen Xiang, Nanting Huang, Huiping Deng, Jin Wu. Estimation of Light Field Depth Based on Multi-Level Network Optimization[J]. Laser & Optoelectronics Progress, 2022, 59(10): 1010009

    Abstract

    This study proposes a depth estimation method based on progressive optimization of a multistage neural network to estimate light-field depth accurately and robustly. A four-level deep neural network extracts features from sub-aperture images along the horizontal, vertical, diagonal, and anti-diagonal directions and estimates the depth map of the central viewpoint. Each subnetwork uses an encoder-decoder structure with skip connections to extract both global and local features. A progressive optimization structure and training strategy is adopted across the subnetworks: the depth map produced by one subnetwork is fed into the next subnetwork to guide its depth estimation. Experimental results demonstrate that the proposed method generates high-quality scene depth maps, particularly at object boundaries. Moreover, the method is robust to input images of different resolutions and infers depth efficiently, making it well suited to practical applications.
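
    To make the cascaded structure described above concrete, the following is a minimal PyTorch sketch of a four-level network in which each encoder-decoder subnetwork processes one directional stack of sub-aperture images and the depth map from the previous level guides the next. All names (SubNet, MultiLevelDepthNet, views_per_direction) and the channel/layer sizes are illustrative assumptions, not the authors' implementation.

    ```python
    import torch
    import torch.nn as nn


    class SubNet(nn.Module):
        """Encoder-decoder subnetwork with a skip connection (assumed layout).

        Input: sub-aperture images stacked along one direction (horizontal,
        vertical, diagonal, or anti-diagonal), optionally concatenated with
        the depth map predicted by the previous subnetwork.
        """

        def __init__(self, in_channels):
            super().__init__()
            self.enc1 = nn.Sequential(nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU())
            self.enc2 = nn.Sequential(nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
            self.dec1 = nn.Sequential(nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU())
            self.out = nn.Conv2d(64, 1, 3, padding=1)  # 64 = 32 (decoder) + 32 (skip)

        def forward(self, x):
            f1 = self.enc1(x)              # local features at full resolution
            f2 = self.enc2(f1)             # global features at half resolution
            d = self.dec1(f2)              # upsample back to full resolution
            d = torch.cat([d, f1], dim=1)  # skip connection restores local detail
            return self.out(d)             # single-channel depth map


    class MultiLevelDepthNet(nn.Module):
        """Four cascaded subnetworks, one per angular direction.

        The depth map from level k is concatenated with the next direction's
        sub-aperture stack to guide the estimate at level k+1.
        """

        def __init__(self, views_per_direction=9):
            super().__init__()
            v = views_per_direction
            # Level 1 sees only images; levels 2-4 also receive the previous depth map.
            self.levels = nn.ModuleList([SubNet(v)] + [SubNet(v + 1) for _ in range(3)])

        def forward(self, stacks):
            # stacks: list of 4 tensors [B, views_per_direction, H, W], ordered
            # horizontal, vertical, diagonal, anti-diagonal.
            depth = self.levels[0](stacks[0])
            for level, stack in zip(self.levels[1:], stacks[1:]):
                depth = level(torch.cat([stack, depth], dim=1))
            return depth  # central-viewpoint depth map


    if __name__ == "__main__":
        net = MultiLevelDepthNet(views_per_direction=9)
        stacks = [torch.randn(1, 9, 64, 64) for _ in range(4)]
        print(net(stacks).shape)  # torch.Size([1, 1, 64, 64])
    ```

    In this sketch the progressive optimization appears only in the forward cascade; the paper additionally trains the levels in a progressive fashion, which would correspond to optimizing each level's loss before (or together with) the next.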