• Laser & Optoelectronics Progress
  • Vol. 59, Issue 8, 0815006 (2022)
Yao Chen1, Yunwei Zhang1,2,3,*, Jinhui Lei1,3, and Li Li1
Author Affiliations
  • 1Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, Yunnan 650500, China
  • 2Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming, Yunnan 650500, China
  • 3Yunnan Key Laboratory of Computer Technology Application, Kunming University of Science and Technology, Kunming, Yunnan 650500, China
    DOI: 10.3788/LOP202259.0815006
    Yao Chen, Yunwei Zhang, Jinhui Lei, Li Li. Automatic Extraction Method for Gait Parameters of Quadruped Walking Based on Computer Vision[J]. Laser & Optoelectronics Progress, 2022, 59(8): 0815006.

    Abstract

    The automatic recognition of the motion features of quadruped walking can be widely used in animal bionics, behavior recognition, disease prediction, and individual identification. In this paper, an automatic extraction method for the gait parameters of quadruped walking is established based on computer vision technology and deep learning methods. First, walking image frames are obtained from the collected quadruped walking videos using video frame decomposition. Next, the moving object is extracted using an improved DeeplabV3+ semantic segmentation model. Then, based on an analysis of quadruped walking gait characteristics, motion corners are detected and matched using the distance from the object's center point to its contour. Finally, a method based on the distances from the four limb motion corners to a fixed reference point is established to extract the quadruped motion feature parameters. The experimental results show that the proposed method performs well in detecting the motion corners of quadrupeds: the maximum detection errors are 32, 27, and 19 pixels for rhino, buffalo, and alpaca, respectively, and the corners of the four limbs are matched accurately. The results also show that the calculation error of the gait cycle and gait frequency is less than 2%, the gait sequence output is correct, and the maximum calculation error of the stride length is 2.85%.
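    The pipeline outlined in the abstract can be summarized in a short sketch. The snippet below is a minimal illustration and not the authors' implementation: `segment_quadruped` is a hypothetical stand-in for the improved DeeplabV3+ model (not reproduced here), and the corner-detection and reference-distance steps are simplified readings of the abstract's description, assuming OpenCV and NumPy.

```python
# Minimal sketch of the described pipeline: frame decomposition, segmentation,
# centroid-to-contour corner detection, and limb-to-reference distance signals.
import cv2
import numpy as np

def segment_quadruped(frame):
    """Hypothetical stand-in for the improved DeeplabV3+ model: should return a
    binary mask (uint8, 0/255) of the moving quadruped in the frame."""
    raise NotImplementedError("plug in a trained segmentation model here")

def limb_corners(mask, n_corners=4):
    """Detect candidate limb-tip corners as the contour points farthest from
    the object's centroid (distance from center point to contour)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    cnt = max(contours, key=cv2.contourArea)
    m = cv2.moments(cnt)
    centroid = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])
    pts = cnt.reshape(-1, 2).astype(float)
    dist = np.linalg.norm(pts - centroid, axis=1)
    idx = np.argsort(dist)[-n_corners:]          # farthest points as corner candidates
    return pts[idx]

def gait_signal(video_path, reference_point):
    """Per frame, record the distances from detected limb corners to a fixed
    reference point; gait parameters are derived from this signal."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    signal = []
    while True:
        ok, frame = cap.read()                   # video frame decomposition
        if not ok:
            break
        mask = segment_quadruped(frame)          # moving-object extraction
        corners = limb_corners(mask)             # motion corner detection
        signal.append(np.linalg.norm(corners - np.asarray(reference_point, float), axis=1))
    cap.release()
    return np.array(signal), fps
```

    Under these assumptions, the gait cycle and frequency would follow from the periodicity of each limb's distance signal (for example, via peak detection), and the stride length from the spacing of successive ground-contact positions.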