• Laser & Optoelectronics Progress
  • Vol. 58, Issue 24, 2410007 (2021)
Mengmeng Ye, Jinbin Hu, Xuejin Wang, and Feng Shao*
Author Affiliations
  • Faculty of Information Science and Engineering, Ningbo University, Ningbo, Zhejiang 315211, China
    DOI: 10.3788/LOP202158.2410007
    Mengmeng Ye, Jinbin Hu, Xuejin Wang, Feng Shao. No-Reference Stereoscopic Image Quality Assessment Based on Binocular Neuron Response[J]. Laser & Optoelectronics Progress, 2021, 58(24): 2410007.

    Abstract

    To address the prediction bias that arises when assessing the quality of multiply-distorted images, a no-reference stereoscopic image quality assessment method is proposed that follows the way neurons in the human primary visual cortex (V1) process visual information, as characterized in visual physiology and psychology research. First, Gabor filtering is applied to the distorted stereoscopic image pairs to construct a simulated V1-layer stimulus model based on the binocular neuron response. Second, natural scene statistics features of the distorted stereoscopic image pairs are extracted in the discrete cosine transform (DCT) domain and, via mean-subtracted contrast normalization (MSCN), in the spatial domain. Finally, support vector regression (SVR) is adopted to build an objective evaluation model that predicts stereoscopic image quality by learning the mapping between the extracted features and subjective scores. The proposed model is validated on public databases, and the results show that it predicts the perceptual quality of both singly-distorted and multiply-distorted stereoscopic images in a unified way, outperforming existing evaluation methods.
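    The spatial-domain MSCN step mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 7×7 Gaussian weighting window, its sigma, and the stabilizing constant are assumptions drawn from common natural-scene-statistics practice, and the statistics of the resulting coefficients (together with DCT-domain features) would then serve as inputs to the SVR.

    ```python
    import numpy as np
    from numpy.lib.stride_tricks import sliding_window_view

    def mscn(image, kernel_size=7, sigma=7 / 6, eps=1.0):
        """Mean-subtracted contrast-normalized (MSCN) coefficients of a
        grayscale image, using a Gaussian-weighted local mean and std."""
        half = kernel_size // 2
        # Build a normalized 2-D Gaussian weighting window.
        ax = np.arange(-half, half + 1)
        xx, yy = np.meshgrid(ax, ax)
        win = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
        win /= win.sum()

        def local_weighted_sum(img):
            # Gaussian-weighted local average via a sliding-window view
            # (fine for a small kernel; reflect-padded at the borders).
            padded = np.pad(img, half, mode="reflect")
            windows = sliding_window_view(padded, (kernel_size, kernel_size))
            return np.einsum("ijkl,kl->ij", windows, win)

        mu = local_weighted_sum(image)                       # local mean
        var = local_weighted_sum(image ** 2) - mu ** 2       # local variance
        sigma_map = np.sqrt(np.abs(var))                     # local std
        return (image - mu) / (sigma_map + eps)              # MSCN map

    def mscn_features(image):
        """Toy spatial-domain feature vector: mean and variance of the
        MSCN coefficients (real NSS methods typically fit a generalized
        Gaussian distribution here instead)."""
        coeffs = mscn(image)
        return np.array([coeffs.mean(), coeffs.var()])
    ```

    For a stereoscopic pair, such features would be computed on each view (or on a binocularly combined response image) and concatenated before regression.
    
    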