Photonics Research, Vol. 9, Issue 6, 1084 (2021)
Shijie Feng1,2,3,*, Chao Zuo1,2,4,*, Liang Zhang1,2, Wei Yin1,2, and Qian Chen2,5,*
Author Affiliations
  • 1Smart Computational Imaging (SCI) Laboratory, Nanjing University of Science and Technology, Nanjing 210094, China
  • 2Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, Nanjing 210094, China
  • 3e-mail: shijiefeng@njust.edu.cn
  • 4e-mail: zuochao@njust.edu.cn
  • 5e-mail: chenqian@njust.edu.cn
    DOI: 10.1364/PRJ.420944
    Shijie Feng, Chao Zuo, Liang Zhang, Wei Yin, Qian Chen. Generalized framework for non-sinusoidal fringe analysis using deep learning[J]. Photonics Research, 2021, 9(6): 1084

    Abstract

    Phase retrieval from fringe images is essential to many optical metrology applications. In the field of fringe projection profilometry, the phase is often obtained with systematic errors if the fringe pattern is not a perfect sinusoid. Several factors can account for non-sinusoidal fringe patterns, such as the non-linear input–output response (e.g., the gamma effect) of digital projectors, the residual harmonics in binary defocusing projection, and the image saturation due to intense reflection. Traditionally, these problems are handled separately with different well-designed methods, which can be seen as “one-to-one” strategies. Inspired by recent successful artificial intelligence-based optical imaging applications, we propose a “one-to-many” deep learning technique that can analyze non-sinusoidal fringe images resulting from different non-sinusoidal factors and even the coupling of these factors. We show, for the first time to the best of our knowledge, that a trained deep neural network can effectively suppress the phase errors due to various kinds of non-sinusoidal patterns. Our work paves the way toward robust and powerful learning-based fringe analysis approaches.
    $$I_n(x,y) = A(x,y) + B(x,y)\cos[\phi(x,y) - \delta_n],$$

    $$\phi(x,y) = \arctan\frac{\sum_{n=0}^{N-1} I_n(x,y)\sin\left(\frac{2\pi n}{N}\right)}{\sum_{n=0}^{N-1} I_n(x,y)\cos\left(\frac{2\pi n}{N}\right)}.$$
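As a concrete illustration of the standard N-step phase-shifting computation above (a minimal NumPy sketch, not the authors' code; the function name is hypothetical):

```python
import numpy as np

def phase_shifting(images):
    """Recover the wrapped phase from N phase-shifted fringe images.

    images: array of shape (N, H, W); the n-th image carries the
    phase shift delta_n = 2*pi*n/N.
    Returns the wrapped phase in (-pi, pi].
    """
    N = images.shape[0]
    n = np.arange(N).reshape(-1, 1, 1)
    # Sine- and cosine-weighted sums over the N images (the M and D terms)
    M = np.sum(images * np.sin(2 * np.pi * n / N), axis=0)
    D = np.sum(images * np.cos(2 * np.pi * n / N), axis=0)
    # Four-quadrant arctangent of M/D gives the wrapped phase
    return np.arctan2(M, D)
```

For an ideal sinusoid the weighted sums reduce to M = (NB/2)sin φ and D = (NB/2)cos φ, so `arctan2` recovers φ exactly; the systematic errors discussed in the paper appear only when the fringes deviate from this model.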

    $$I_n^H(x,y) = A^H(x,y) + \sum_{j=1}^{p} B_j^H(x,y)\cos\{j[\phi(x,y) - \delta_n]\},$$

    $$\Delta I_n(x,y) = I_n(x,y) - I_n^H(x,y).$$

    $$\Delta\phi(x,y) = \sum_{n=0}^{N-1} \frac{\partial \phi(x,y)}{\partial I_n(x,y)}\,\Delta I_n(x,y).$$

    $$\Delta\phi = \frac{4}{B^2 N^2}\sum_{n=0}^{N-1}\left\{\left[\sum_{m=0}^{N-1} I_m \sin\frac{2\pi(n-m)}{N}\right]\left[\sum_{j=0}^{p} B_j \cos j\!\left(\phi - \frac{2\pi n}{N}\right)\right]\right\}.$$
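To make the harmonic-induced phase error tangible, one can simulate it numerically (a sketch with hypothetical amplitudes, not data from the paper): for four-step phase shifting, a third harmonic (j = 3 ≡ −1 mod 4) leaks into the weighted sums and distorts the recovered phase.

```python
import numpy as np

N = 4                                                    # four-step phase shifting
phi = np.linspace(-np.pi, np.pi, 1000, endpoint=False)   # ground-truth phase

def fringe(n):
    """Non-sinusoidal fringe: fundamental plus an assumed 10% third harmonic."""
    delta = 2 * np.pi * n / N
    return 0.5 + 0.4 * np.cos(phi - delta) + 0.04 * np.cos(3 * (phi - delta))

I = np.stack([fringe(n) for n in range(N)])
n = np.arange(N).reshape(-1, 1)
M = np.sum(I * np.sin(2 * np.pi * n / N), axis=0)
D = np.sum(I * np.cos(2 * np.pi * n / N), axis=0)
# Systematic phase error of the standard algorithm, wrapped to (-pi, pi]
phase_error = np.angle(np.exp(1j * (np.arctan2(M, D) - phi)))
print(f"peak phase error: {np.max(np.abs(phase_error)):.4f} rad")
```

Setting the harmonic amplitude to zero makes the error vanish to machine precision, which is exactly the sensitivity the error formula above predicts.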

    $$M(x,y) = \sum_{n=0}^{N-1} I_n(x,y)\sin\!\left(\frac{2\pi n}{N}\right),$$

    $$D(x,y) = \sum_{n=0}^{N-1} I_n(x,y)\cos\!\left(\frac{2\pi n}{N}\right).$$

    $$\mathrm{Loss}(\theta) = \frac{1}{HW}\left[\left\|Y_M(\theta) - G_M\right\|^2 + \left\|Y_D(\theta) - G_D\right\|^2\right],$$
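The training loss above is a mean squared error over the network's two H×W output maps. A minimal NumPy sketch (the function name and array arguments are illustrative; the paper trains a deep network producing the predictions Y_M and Y_D against ground truths G_M and G_D):

```python
import numpy as np

def fringe_loss(Y_M, Y_D, G_M, G_D):
    """Loss = (1/(H*W)) * (||Y_M - G_M||^2 + ||Y_D - G_D||^2),
    where ||.|| is the Frobenius norm over the H x W maps."""
    H, W = G_M.shape
    return (np.sum((Y_M - G_M) ** 2) + np.sum((Y_D - G_D) ** 2)) / (H * W)
```

Training on the numerator M and denominator D, rather than on the wrapped phase directly, sidesteps the 2π discontinuities that would otherwise make the regression target ill-behaved.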
