• Laser & Optoelectronics Progress
  • Vol. 57, Issue 18, 181509 (2020)
Ze Zhu1, Qingbing Sang1,2,*, and Hao Zhang1
Author Affiliations
  • 1School of Internet of Things Engineering, Jiangnan University, Wuxi, Jiangsu 214122, China
  • 2Jiangsu Provincial Engineering Laboratory of Pattern Recognition and Computational Intelligence, Wuxi, Jiangsu 214122, China
    DOI: 10.3788/LOP57.181509
    Ze Zhu, Qingbing Sang, Hao Zhang. No Reference Video Quality Assessment Based on Spatio-Temporal Features and Attention Mechanism[J]. Laser & Optoelectronics Progress, 2020, 57(18): 181509

    Abstract

    With the rapid development of video technology, an increasing number of video applications are entering people's daily lives; therefore, research on video quality is of great significance. Herein, a no-reference video quality assessment algorithm is proposed that combines the powerful feature-extraction capabilities of convolutional neural networks and recurrent neural networks with an attention mechanism. The algorithm first extracts the spatial-domain features of the distorted video using the Visual Geometry Group (VGG) network. A recurrent neural network is then used to extract the temporal-domain features of the distorted video. Next, an attention mechanism is introduced to compute the importance of each spatio-temporal feature with respect to the overall characteristics of the video. Finally, regression through a fully connected layer yields the video quality score. Experimental results on three public video databases show that the predicted scores agree well with human subjective quality scores and outperform state-of-the-art video quality assessment algorithms.
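    The following is a minimal sketch of the pipeline described in the abstract, assuming a PyTorch implementation; the specific backbone variant (VGG-16), the choice of a GRU for the recurrent network, the hidden size, and the attention formulation are illustrative assumptions, not the authors' exact design.

```python
import torch
import torch.nn as nn
from torchvision import models

class VQANet(nn.Module):
    """Sketch: per-frame VGG spatial features, a recurrent network over time,
    attention-weighted pooling, and a fully connected regression head."""

    def __init__(self, hidden_dim=128):
        super().__init__()
        # Spatial feature extractor: VGG-16 convolutional backbone (frozen here).
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
        self.backbone = nn.Sequential(vgg.features, nn.AdaptiveAvgPool2d(1))
        for p in self.backbone.parameters():
            p.requires_grad = False
        # Temporal modelling with a GRU over the per-frame feature sequence.
        self.gru = nn.GRU(input_size=512, hidden_size=hidden_dim, batch_first=True)
        # Attention: score each time step, softmax into importance weights.
        self.attn = nn.Linear(hidden_dim, 1)
        # Fully connected regression head mapping pooled features to one score.
        self.head = nn.Sequential(nn.Linear(hidden_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, frames):
        # frames: (batch, time, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.backbone(frames.flatten(0, 1)).flatten(1)  # (b*t, 512)
        feats = feats.view(b, t, -1)                             # (b, t, 512)
        seq, _ = self.gru(feats)                                 # (b, t, hidden)
        weights = torch.softmax(self.attn(seq), dim=1)           # (b, t, 1)
        pooled = (weights * seq).sum(dim=1)                      # (b, hidden)
        return self.head(pooled).squeeze(-1)                     # (b,) quality scores

if __name__ == "__main__":
    model = VQANet()
    clip = torch.randn(2, 8, 3, 224, 224)  # two clips of 8 frames each
    print(model(clip).shape)                # torch.Size([2])
```

    In this reading, the attention weights act as the per-time-step importance of the spatio-temporal features relative to the video as a whole, and the weighted sum is regressed to the quality score.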