• Laser & Optoelectronics Progress
  • Vol. 60, Issue 24, 2410012 (2023)
Shengnan Qin1,2,3 and Yanting Lu1,2,*
Author Affiliations
  • 1Nanjing Institute of Astronomical Optics & Technology, Chinese Academy of Sciences, Nanjing 210042, Jiangsu, China
  • 2CAS Key Laboratory of Astronomical Optics & Technology, Nanjing Institute of Astronomical Optics & Technology, Nanjing 210042, Jiangsu, China
  • 3University of Chinese Academy of Sciences, Beijing 100049, China
    DOI: 10.3788/LOP230937
    Shengnan Qin, Yanting Lu. Curved Texture Flattening Algorithm Based on the Light Field Camera[J]. Laser & Optoelectronics Progress, 2023, 60(24): 2410012

    Abstract

    Texture patterns distributed on the curved surfaces of objects often need to be extracted and flattened for comprehensive display and subsequent use. We therefore propose a curved texture flattening scheme based on the light field camera, specifically the focused light field camera, which provides a high-resolution texture image and a corresponding depth map without additional registration. A curved texture flattening algorithm is designed for this scheme: it divides the curved texture image into multiple overlapping local texture images, corrects the distortion of each local texture image based on the normal vector of its fitted plane, and finally stitches the corrected local texture images into a completely flattened texture image. Simulated and real experiments show that the proposed algorithm effectively flattens a variety of texture patterns distributed on different curved surfaces, and that it is robust to variations in texture image quality and to errors in the depth measurements.
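
    To illustrate the per-patch step of this pipeline, below is a minimal Python/NumPy sketch, not the authors' implementation. It assumes the depth map has already been back-projected into per-pixel 3D points; the function names (fit_plane_normal, rotation_to_z, flatten_patch) are illustrative, and the overlap-blending stitching stage described in the abstract is omitted.

        import numpy as np

        def fit_plane_normal(points):
            """Total-least-squares plane fit via SVD.
            points: (N, 3) 3D points back-projected from the depth map."""
            centroid = points.mean(axis=0)
            _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
            normal = vt[-1]              # direction of smallest variance
            if normal[2] < 0:            # orient the normal consistently
                normal = -normal         # (toward the camera, by assumption)
            return normal, centroid

        def rotation_to_z(normal):
            """Rotation aligning `normal` with the optical (z) axis,
            built from Rodrigues' formula."""
            z = np.array([0.0, 0.0, 1.0])
            v = np.cross(normal, z)
            s, c = np.linalg.norm(v), np.dot(normal, z)
            if s < 1e-12:                # already fronto-parallel
                return np.eye(3)
            vx = np.array([[0.0, -v[2], v[1]],
                           [v[2], 0.0, -v[0]],
                           [-v[1], v[0], 0.0]])
            return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s**2)

        def flatten_patch(points, colors, pixels_per_unit=200):
            """Rotate one local patch so its fitted plane is fronto-parallel,
            then rasterize the rotated points into a small 2D texture tile.
            colors: (N, 3) texture samples paired with `points`."""
            normal, centroid = fit_plane_normal(points)
            rotated = (points - centroid) @ rotation_to_z(normal).T
            uv = rotated[:, :2]          # in-plane coordinates after rotation
            uv = (uv - uv.min(axis=0)) * pixels_per_unit
            h = int(np.ceil(uv[:, 1].max())) + 1
            w = int(np.ceil(uv[:, 0].max())) + 1
            tile = np.zeros((h, w, 3), dtype=colors.dtype)
            ij = np.round(uv).astype(int)
            tile[ij[:, 1], ij[:, 0]] = colors  # nearest-neighbor splat; a full
            return tile                        # implementation would inpaint
                                               # holes and blend overlaps

    The SVD fit is a total-least-squares plane estimate, which is why modest depth-measurement noise perturbs the fitted normal only mildly; this is one plausible source of the robustness reported in the experiments.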