Laser & Optoelectronics Progress, Vol. 59, Issue 16, 1615003 (2022)
Yafeng Zhang, Cuixiang Liu*, Jie Ma, and Yating Su
Author Affiliations
  • School of Electronic and Information Engineering, Hebei University of Technology, Tianjin 300401, China
    DOI: 10.3788/LOP202259.1615003
    Citation: Yafeng Zhang, Cuixiang Liu, Jie Ma, Yating Su. Three-dimensional Human Pose Reconstruction Based on Multifeature Point Matching[J]. Laser & Optoelectronics Progress, 2022, 59(16): 1615003

    Abstract

    When reconstructing three-dimensional human pose from a single-view image, the lack of depth information and the diversity of human poses cause orientation errors and poor handling of pose details when mapping two-dimensional pose to three-dimensional pose. To address this, we propose a three-dimensional pose reconstruction method based on multifeature point matching, which combines the distribution of bones, the vertices of a stereo human model, and an optimized model-deformation strategy. The core of the proposed method is to match and fit multiple two-dimensional human feature points to the three-dimensional feature points of the human model by optimizing an energy function, thereby reconstructing the three-dimensional pose. Furthermore, we establish orientation constraints from selected joint points to reduce the impact of missing depth information on the reconstructed pose, and we use multiple head feature points to adjust the head pose, reducing the impact of pose diversity on head-pose reconstruction. Experimental results on the public pose datasets MPI-INF-3DHP and LSP show that the proposed method effectively alleviates pose ambiguity and poor handling of pose details, and accurately reconstructs three-dimensional human poses for common actions.
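    To make the energy-optimization idea concrete, below is a minimal, self-contained sketch, not the authors' implementation: it fits the pose parameters of a toy 3D point model to detected 2D feature points by minimizing a reprojection term plus a simple orientation-constraint term, echoing the 2D-3D matching and joint-based orientation constraints described in the abstract. The model, camera intrinsics, point layout, and torso-normal penalty are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def project(pts, f=1000.0, cx=320.0, cy=240.0):
    """Pinhole projection of Nx3 camera-frame points to Nx2 pixel coords."""
    z = np.clip(pts[:, 2], 1e-6, None)
    return np.stack([f * pts[:, 0] / z + cx, f * pts[:, 1] / z + cy], axis=1)

def toy_model(theta, template):
    """Stand-in for the 3D human model: yaw rotation plus depth translation."""
    yaw, tz = theta
    R = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(yaw), 0.0, np.cos(yaw)]])
    return template @ R.T + np.array([0.0, 0.0, tz])

def energy(theta, template, kps_2d, torso_idx=(0, 1, 2), w_orient=0.1):
    """Reprojection energy plus a simple orientation-constraint term."""
    pts = toy_model(theta, template)
    # Data term: match projected 3D model feature points to 2D detections.
    e_data = np.sum((project(pts) - kps_2d) ** 2)
    # Orientation term: penalize the torso plane normal pointing away from
    # the camera (+z here), a crude stand-in for a joint-based constraint.
    a, b, c = pts[list(torso_idx)]
    normal = np.cross(b - a, c - a)
    e_orient = max(0.0, normal[2]) ** 2
    return e_data + w_orient * e_orient

# Synthetic check: recover the pose that generated the 2D detections.
# Points: neck, right shoulder, left shoulder, pelvis, head (hypothetical).
template = np.array([[0.0, 0.5, 0.0], [0.2, 0.0, 0.0], [-0.2, 0.0, 0.0],
                     [0.0, -0.5, 0.0], [0.0, 0.7, 0.1]])
true_theta = np.array([0.3, 5.0])
kps_2d = project(toy_model(true_theta, template))
res = minimize(energy, x0=np.array([0.0, 4.0]), args=(template, kps_2d))
print("recovered (yaw, depth):", res.x)  # should approach (0.3, 5.0)
```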