• Laser & Optoelectronics Progress
  • Vol. 57, Issue 24, 241003 (2020)
Deyong Gao1,2, Zibing Kang1,*, Song Wang1,2, and Yangping Wang1,3
Author Affiliations
  • 1School of Electronic & Information Engineering, Lanzhou Jiaotong University, Lanzhou, Gansu 730070, China;
  • 2Gansu Provincial Engineering Research Center for Artificial Intelligence and Graphic & Image Processing, Lanzhou, Gansu 730070, China;
  • 3Gansu Provincial Key Laboratory of System Dynamics and Reliability of Rail Transport Equipment, Lanzhou, Gansu 730070, China
    DOI: 10.3788/LOP57.241003
    Deyong Gao, Zibing Kang, Song Wang, Yangping Wang. Human-Body Action Recognition Based on Dense Trajectories and Video Saliency[J]. Laser & Optoelectronics Progress, 2020, 57(24): 241003

    Abstract

    The traditional dense trajectory algorithm has achieved great success in human-body action recognition. However, trajectories from the foreground action and from background motion are treated equally during feature extraction, which leads to redundant video representations and limits recognition accuracy. In this paper, the motion patterns of the background and of the human action are distinguished: a sparse error matrix is obtained by applying low-rank matrix decomposition on the basis of the sparse coefficient matrix of a feature dictionary, and a saliency map is computed from it. The saliency map is then used to restrict the dense trajectory representation of human action to the action-related regions. The effectiveness of the proposed method is verified on the public UCF Sports and YouTube datasets.
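    To illustrate the low-rank-plus-sparse idea behind the saliency map, the following minimal Python sketch applies a plain robust PCA decomposition directly to a stack of grayscale frames: the low-rank part models the slowly varying background, and the sparse error matrix highlights moving, action-related pixels. This is only an assumption-laden illustration, not the paper's exact pipeline (the paper works from the sparse coefficient matrix of a feature dictionary); the function names robust_pca and motion_saliency, the IALM solver, and the threshold value are hypothetical choices for this sketch.

```python
import numpy as np

def robust_pca(D, lam=None, tol=1e-7, max_iter=500):
    """Split D into a low-rank part L and a sparse error S (D ~ L + S)
    using an inexact augmented Lagrange multiplier (IALM) scheme."""
    D = np.asarray(D, dtype=np.float64)
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))           # standard RPCA weight
    norm_D = np.linalg.norm(D, "fro")
    spec = np.linalg.norm(D, 2)                   # largest singular value
    Y = D / max(spec, np.abs(D).max() / lam)      # dual variable init
    mu, mu_max, rho = 1.25 / spec, 1e7, 1.5
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    for _ in range(max_iter):
        # Low-rank update: singular value thresholding
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Sparse update: elementwise soft thresholding
        R = D - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # Dual ascent and penalty growth
        Z = D - L - S
        Y += mu * Z
        mu = min(mu * rho, mu_max)
        if np.linalg.norm(Z, "fro") / norm_D < tol:
            break
    return L, S

def motion_saliency(frames, thresh=0.2):
    """frames: (T, H, W) grayscale clip -> (T, H, W) binary saliency masks."""
    T, H, W = frames.shape
    D = frames.reshape(T, -1).T                   # pixels x frames
    _, S = robust_pca(D)
    sal = np.abs(S).T.reshape(T, H, W)
    sal /= sal.max() + 1e-12                      # normalize to [0, 1]
    return sal > thresh                           # action-related regions
```

    In this sketch, the returned masks could then gate dense trajectory sampling, so that trajectories whose points fall outside the salient regions are discarded and only action-related trajectories contribute to the video representation.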