• Laser & Optoelectronics Progress
  • Vol. 58, Issue 16, 1600005 (2021)
Bin Cao1, Feng Yang1, and Jingang Ma2,*
Author Affiliations
  • 1Shandong Provincial Hospital of Traditional Chinese Medicine, Jinan, Shandong 250000, China
  • 2School of Intelligence and Information Engineering, Shandong University of Traditional Chinese Medicine, Jinan, Shandong 250355, China
    DOI: 10.3788/LOP202158.1600005
    Bin Cao, Feng Yang, Jingang Ma. Application of Deep Learning Methods in Diagnosis of Lung Nodules[J]. Laser & Optoelectronics Progress, 2021, 58(16): 1600005

    Abstract

    Lung cancer is the malignant tumor with the highest mortality rate in the world, and early diagnosis can remarkably improve the survival rate of lung cancer patients. Deep learning can extract hidden-layer features from medical images and perform image classification and segmentation, so the application of deep learning methods to the early diagnosis of lung nodules has become a research focus. This article introduces several databases commonly used in lung nodule diagnosis and, drawing on recent domestic and international literature, categorizes the latest research progress and summarizes and analyzes the application of deep learning frameworks to lung nodule image segmentation and classification, covering the basic ideas of the various algorithms, their network architectures, representative improvement schemes, and their respective advantages and disadvantages. Finally, problems encountered when applying deep learning to the diagnosis of pulmonary nodules, conclusions, and development prospects are discussed. This study is expected to provide a reference for future research and to accelerate the maturation of research and clinical applications in this field.
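    To make the notion of hidden-layer feature extraction for nodule classification concrete, the following is a minimal sketch, not drawn from any method surveyed in the article: a small 3D convolutional network that maps a preprocessed CT patch to a benign/malignant score. The input patch size, layer widths, and two-class output are illustrative assumptions only.

    ```python
    # Minimal sketch (illustrative only, not the surveyed methods):
    # a small 3D CNN for binary lung-nodule classification on CT patches.
    import torch
    import torch.nn as nn

    class NoduleClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            # Convolutional layers learn hierarchical ("hidden-layer") features.
            self.features = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool3d(2),   # 32 -> 16 per spatial dimension
                nn.Conv3d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool3d(2),   # 16 -> 8
            )
            # Fully connected head maps features to two classes (assumed: benign vs. malignant).
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 8 * 8 * 8, 64),
                nn.ReLU(),
                nn.Linear(64, 2),
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    if __name__ == "__main__":
        model = NoduleClassifier()
        patch = torch.randn(4, 1, 32, 32, 32)  # batch of 4 hypothetical CT patches
        print(model(patch).shape)              # torch.Size([4, 2])
    ```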