• Laser & Optoelectronics Progress
  • Vol. 58, Issue 20, 2015006 (2021)
Yang Xianbin1, Dang Jianwu1,2,*, Wang Song1,2, and Wang Yangping2,3
Author Affiliations
  • 1School of Electronic and Information Engineering, Lanzhou Jiaotong University, Lanzhou, Gansu 730070, China
  • 2Gansu Provincial Engineering Research Center for Artificial Intelligence and Graphic & Image Processing, Lanzhou, Gansu 730070, China
  • 3National Experimental Teaching Demonstration Center of Computer Science and Technology, Lanzhou Jiaotong University, Lanzhou, Gansu 730070, China

    Abstract

    To address the over-reliance of current anomaly event detection algorithms for complex scenes on frame-level labels, as well as the long runtime and high memory requirements of the I3D model, an M-I3D model is designed with I3D as the feature extractor, and an anomaly detection method based on deep spatio-temporal features and multi-instance learning is proposed. The method treats normal and abnormal videos as bags and video clips as instances for multi-instance learning. The M-I3D model extracts the features of each video clip, and the extracted feature vectors are fed into three fully connected layers to automatically learn a deep anomaly ranking model that predicts the anomaly score of each video clip. In addition, to better localize anomalies during training, sparsity and smoothness constraint terms are introduced into the loss function. Experimental results show that, compared with other methods, the proposed algorithm achieves higher accuracy and better real-time performance on the UCF-Crime dataset.
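
    The sketch below illustrates the kind of pipeline the abstract describes: clip features from a feature extractor are scored by three fully connected layers, and a multi-instance ranking loss with sparsity and smoothness penalties compares the highest-scoring clips of an abnormal and a normal bag. It is a minimal illustration only; the layer sizes, dropout rates, and loss weights are assumptions, not values taken from the paper, and the M-I3D feature extraction step is replaced by random tensors.

    ```python
    # Minimal sketch (assumed hyperparameters) of a three-FC-layer anomaly
    # scoring head and a MIL ranking loss with sparsity and temporal-smoothness
    # terms, in the spirit of the method described in the abstract.
    import torch
    import torch.nn as nn

    class AnomalyScorer(nn.Module):
        """Maps a clip feature vector to an anomaly score in [0, 1]."""
        def __init__(self, feat_dim=1024):            # feat_dim is an assumption
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(feat_dim, 512), nn.ReLU(), nn.Dropout(0.6),
                nn.Linear(512, 32), nn.ReLU(), nn.Dropout(0.6),
                nn.Linear(32, 1), nn.Sigmoid(),
            )

        def forward(self, x):                          # x: (num_clips, feat_dim)
            return self.net(x).squeeze(-1)             # -> (num_clips,)

    def mil_ranking_loss(scores_abn, scores_nrm, lambda_s=8e-5, lambda_t=8e-5):
        """Hinge ranking loss between the top-scoring clips of an abnormal and a
        normal bag, plus sparsity and smoothness penalties on the abnormal bag
        (the lambda weights are assumptions)."""
        hinge = torch.relu(1.0 - scores_abn.max() + scores_nrm.max())
        sparsity = scores_abn.sum()                                    # few clips should be anomalous
        smoothness = ((scores_abn[1:] - scores_abn[:-1]) ** 2).sum()   # scores vary slowly in time
        return hinge + lambda_s * sparsity + lambda_t * smoothness

    # Usage sketch: score 32 clips from one abnormal and one normal video
    # (random tensors stand in for M-I3D clip features).
    if __name__ == "__main__":
        scorer = AnomalyScorer()
        abn_feats, nrm_feats = torch.randn(32, 1024), torch.randn(32, 1024)
        loss = mil_ranking_loss(scorer(abn_feats), scorer(nrm_feats))
        loss.backward()
    ```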
    Xianbin Yang, Jianwu Dang, Song Wang, Yangping Wang. Anomaly Event Detection Based on Two-Stream Network and Multi-instance Learning[J]. Laser & Optoelectronics Progress, 2021, 58(20): 2015006
    Category: Machine Vision
    Received: Nov. 25, 2020
    Accepted: Jan. 20, 2021
    Published Online: Oct. 14, 2021
    Corresponding Author Email: Dang Jianwu (dangjw@mail.lzjt.cn)