• Laser & Optoelectronics Progress
  • Vol. 59, Issue 6, 0617017 (2022)
Lingxiao Wang1, Jun Yang1, Wensai Wang1, and Ting Li1,2,*
Author Affiliations
  • 1Institute of Biomedical Engineering, Chinese Academy of Medical Sciences, Tianjin 300192, China
  • 2Chinese Institute for Brain Research, Beijing, Beijing 102206, China
    DOI: 10.3788/LOP202259.0617017
    Lingxiao Wang, Jun Yang, Wensai Wang, Ting Li. Automatic Detection of Retinal Diseases Based on Lightweight Convolutional Neural Network[J]. Laser & Optoelectronics Progress, 2022, 59(6): 0617017

    Abstract

    Optical coherence tomography (OCT) is a major clinical method for detecting retinopathy. However, manual diagnosis from OCT images is subjective and inefficient. Therefore, this paper proposes a lightweight convolutional neural network for the automatic detection of retinopathy. The proposed network consists of two modules. The first module combines atrous convolutions and depthwise separable convolutions to reduce the number of parameters; the second module extends the network depth by decomposing conventional convolution layers into multilayer asymmetric convolutions. The two modules are combined to form a feature extractor, and the Softmax function is used as the classifier, yielding a lightweight model that is 44 layers deep with a parameter size of 9.2 MB. The accuracy, sensitivity, specificity, and area under the receiver operating characteristic curve of the proposed network on the test set are 0.980, 0.954, 0.987, and 0.997, respectively. The visualization results show that the diagnostic basis of the model is consistent with that of ophthalmologists. These results demonstrate that the proposed network can accurately automate retinal disease detection.
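
    The abstract describes the two building blocks only at a high level. The PyTorch sketch below is a rough, hypothetical illustration of how such blocks could be assembled, not the authors' actual 44-layer architecture: the channel widths, dilation rate, pooling layout, and number of output classes (assumed here to be four OCT categories) are all assumptions made for illustration only.

```python
# Hypothetical sketch of the two modules described in the abstract.
# All hyperparameters (channels, dilation, class count) are assumptions,
# not the configuration reported in the paper.
import torch
import torch.nn as nn


class AtrousDepthwiseBlock(nn.Module):
    """Module 1 (sketch): a dilated (atrous) depthwise convolution followed by
    a pointwise convolution, which uses far fewer parameters than a standard
    convolution with the same receptive field."""

    def __init__(self, in_ch, out_ch, dilation=2):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3,
                                   padding=dilation, dilation=dilation,
                                   groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


class AsymmetricConvBlock(nn.Module):
    """Module 2 (sketch): a conventional 3x3 convolution decomposed into
    stacked asymmetric 1x3 and 3x1 convolutions, extending depth cheaply."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=(1, 3), padding=(0, 1), bias=False),
            nn.Conv2d(out_ch, out_ch, kernel_size=(3, 1), padding=(1, 0), bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.conv(x)


class ToyRetinalOCTNet(nn.Module):
    """Toy feature extractor stacking the two blocks, with a Softmax output
    over an assumed set of 4 OCT classes."""

    def __init__(self, num_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            AtrousDepthwiseBlock(1, 32),
            nn.MaxPool2d(2),
            AsymmetricConvBlock(32, 64),
            nn.MaxPool2d(2),
            AtrousDepthwiseBlock(64, 128),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        # Softmax applied here to mirror the abstract's wording; for training
        # one would normally return logits and use nn.CrossEntropyLoss.
        return torch.softmax(self.classifier(x), dim=1)


if __name__ == "__main__":
    model = ToyRetinalOCTNet()
    dummy = torch.randn(1, 1, 224, 224)   # single-channel OCT B-scan
    print(model(dummy).shape)              # torch.Size([1, 4])
```

    The design intent behind both sketched blocks follows the abstract: the depthwise-plus-pointwise factorization and the 1x3/3x1 decomposition each replace a dense 3x3 convolution with cheaper operations, which is how a network can grow deep while its total parameter footprint stays small.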