• Infrared and Laser Engineering
  • Vol. 49, Issue 5, 20190552 (2020)
Xiaomin Pei1, Huijie Fan2, and Yandong Tang2
Author Affiliations
  • 1School of Information and Control Engineering, Liaoning Shihua University, Fushun 113001, China
  • 2State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China
    DOI: 10.3788/IRLA20190552
    Xiaomin Pei, Huijie Fan, Yandong Tang. Two-person interaction recognition based on multi-stream spatio-temporal fusion network[J]. Infrared and Laser Engineering, 2020, 49(5): 20190552

    Abstract

    A two-person interaction recognition method based on a multi-stream spatio-temporal fusion network was proposed. Firstly, a view-invariant description of the two-person skeleton was proposed. Then a two-layer spatio-temporal fusion network model was designed. In the first layer, spatial correlation features were obtained with a one-dimensional convolutional neural network (1DCNN) and a bi-directional long short-term memory (BiLSTM) network. In the second layer, spatio-temporal fusion features were obtained with an LSTM. Finally, a multi-stream spatio-temporal fusion network was used to obtain multi-stream fusion features: each stream learned one kind of feature, all streams shared weights and had the same structure, and the features of all streams were fused at the end for interaction recognition. On the NTU RGB+D dataset, the accuracy of two-person interaction recognition reached 96.42% under the cross-subject protocol and 97.46% under the cross-view protocol. Compared with state-of-the-art methods in this field, the proposed method performed best on two-person interaction recognition.
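    The sketch below is one way the described architecture could be organized: a shared stream branch (1DCNN + BiLSTM for spatial correlation, then LSTM for spatio-temporal fusion) applied to several input streams whose features are fused before classification. Layer sizes, the number of streams, the feature dimension (here 2 persons x 25 joints x 3 coordinates), and fusion by concatenation are illustrative assumptions, not the authors' exact configuration.

    # Minimal PyTorch sketch of the two-layer multi-stream spatio-temporal fusion idea.
    # All concrete hyperparameters below are assumptions for illustration only.
    import torch
    import torch.nn as nn


    class StreamBranch(nn.Module):
        """One stream: 1DCNN + BiLSTM (spatial correlation), then LSTM (spatio-temporal fusion).
        A single instance is reused for every stream, modelling the shared weights
        and identical structure described in the abstract."""

        def __init__(self, in_dim, cnn_dim=64, rnn_dim=128):
            super().__init__()
            # First layer: 1D convolution over the per-frame feature axis,
            # followed by a bidirectional LSTM over time.
            self.conv = nn.Sequential(
                nn.Conv1d(in_dim, cnn_dim, kernel_size=3, padding=1),
                nn.ReLU(),
            )
            self.bilstm = nn.LSTM(cnn_dim, rnn_dim, batch_first=True, bidirectional=True)
            # Second layer: LSTM producing the spatio-temporal fusion feature.
            self.lstm = nn.LSTM(2 * rnn_dim, rnn_dim, batch_first=True)

        def forward(self, x):                                  # x: (batch, time, in_dim)
            h = self.conv(x.transpose(1, 2)).transpose(1, 2)   # (batch, time, cnn_dim)
            h, _ = self.bilstm(h)                              # (batch, time, 2*rnn_dim)
            h, _ = self.lstm(h)                                # (batch, time, rnn_dim)
            return h[:, -1]                                    # last time step as the stream feature


    class MultiStreamFusionNet(nn.Module):
        def __init__(self, in_dim, num_streams=3, num_classes=11, rnn_dim=128):
            super().__init__()
            self.branch = StreamBranch(in_dim, rnn_dim=rnn_dim)  # shared across all streams
            self.classifier = nn.Linear(num_streams * rnn_dim, num_classes)

        def forward(self, streams):                            # list of (batch, time, in_dim) tensors
            feats = [self.branch(s) for s in streams]          # one feature per stream
            fused = torch.cat(feats, dim=-1)                   # fuse all streams at the end
            return self.classifier(fused)


    if __name__ == "__main__":
        # 11 classes corresponds to the mutual-action subset of NTU RGB+D;
        # the three random input streams stand in for skeleton feature sequences.
        net = MultiStreamFusionNet(in_dim=150, num_streams=3, num_classes=11)
        streams = [torch.randn(4, 30, 150) for _ in range(3)]
        print(net(streams).shape)                              # torch.Size([4, 11])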