Laser & Optoelectronics Progress, Vol. 59, Issue 16, 1617002 (2022)
Yuanzhi Xie1, Shiju Yan1,*, Gaofeng Wei2, and Linying Yang1
Author Affiliations
  • 1School of Medical Instrument and Food Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China
  • 2Institute of Tropical Medicine, Naval Medical University, Shanghai 200025, China
DOI: 10.3788/LOP202259.1617002
Yuanzhi Xie, Shiju Yan, Gaofeng Wei, Linying Yang. Breast Mass Segmentation Based on U-Net++ and Adversarial Learning Network[J]. Laser & Optoelectronics Progress, 2022, 59(16): 1617002

    Abstract

In this paper, an accurate and reliable breast lesion segmentation algorithm is proposed to extract tumor regions from mammographic images for the diagnosis of breast diseases. An adversarial learning framework, composed mainly of a segmentation network and a discriminator network, is used to enhance the high-order consistency of the segmentation results. An improved U-Net++ network serves as the segmentation network and generates a breast mass segmentation map (a mask), while the discriminator network learns to distinguish the generated mask from the ground-truth mask, further improving the performance of the segmentation network. The proposed method is verified on the public CBIS-DDSM dataset. The experimental results show that the specificity, sensitivity, accuracy, and Dice coefficient of the proposed method are 99.7%, 90.4%, 98%, and 91%, respectively, which are higher than those of classical algorithms. Combining the improved U-Net++ model with a generative adversarial network thus improves the segmentation of breast masses in molybdenum-target (mammographic) images.
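    To make the adversarial training scheme sketched in the abstract concrete, the following is a minimal PyTorch sketch of the framework: a segmentation network produces a mask, and a discriminator is trained to tell generated masks from ground-truth masks while the segmenter is trained to fool it. The TinySegNet stand-in, layer sizes, learning rates, and the adv_weight loss weighting are illustrative assumptions, not the paper's actual improved U-Net++ or discriminator architecture.

    ```python
    import torch
    import torch.nn as nn

    class TinySegNet(nn.Module):
        """Stand-in for the improved U-Net++ segmentation network (assumed)."""
        def __init__(self):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 1, 3, padding=1),  # logits for the mass mask
            )
        def forward(self, x):
            return self.body(x)

    class Discriminator(nn.Module):
        """Judges whether an (image, mask) pair carries a real or generated mask."""
        def __init__(self):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
                nn.Conv2d(16, 1, 3, stride=2, padding=1),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # one real/fake logit
            )
        def forward(self, image, mask):
            return self.body(torch.cat([image, mask], dim=1))

    seg, disc = TinySegNet(), Discriminator()
    opt_s = torch.optim.Adam(seg.parameters(), lr=1e-4)
    opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
    bce = nn.BCEWithLogitsLoss()
    adv_weight = 0.1  # assumed weighting of the adversarial term

    def train_step(image, real_mask):
        # 1) Discriminator step: push real masks toward 1, generated toward 0.
        with torch.no_grad():
            fake_mask = torch.sigmoid(seg(image))
        d_real, d_fake = disc(image, real_mask), disc(image, fake_mask)
        loss_d = bce(d_real, torch.ones_like(d_real)) + \
                 bce(d_fake, torch.zeros_like(d_fake))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()

        # 2) Segmenter step: pixel-wise loss plus an adversarial term that
        #    rewards masks the discriminator rates as real, which is what
        #    enforces the high-order consistency described in the abstract.
        logits = seg(image)
        d_fake = disc(image, torch.sigmoid(logits))
        loss_s = bce(logits, real_mask) + \
                 adv_weight * bce(d_fake, torch.ones_like(d_fake))
        opt_s.zero_grad(); loss_s.backward(); opt_s.step()
        return loss_d.item(), loss_s.item()

    # Smoke test on random tensors shaped like single-channel mammogram patches.
    img = torch.randn(2, 1, 64, 64)
    msk = (torch.rand(2, 1, 64, 64) > 0.5).float()
    print(train_step(img, msk))
    ```

    The four evaluation figures reported above follow the standard confusion-matrix definitions; a short sketch of how they are typically computed on binary masks (denominators assumed non-empty) is given below for reference.

    ```python
    import numpy as np

    def metrics(pred, target):
        """Specificity, sensitivity, accuracy, and Dice from binary masks."""
        pred, target = pred.astype(bool), target.astype(bool)
        tp = np.logical_and(pred, target).sum()
        tn = np.logical_and(~pred, ~target).sum()
        fp = np.logical_and(pred, ~target).sum()
        fn = np.logical_and(~pred, target).sum()
        specificity = tn / (tn + fp)
        sensitivity = tp / (tp + fn)
        accuracy = (tp + tn) / (tp + tn + fp + fn)
        dice = 2 * tp / (2 * tp + fp + fn)
        return specificity, sensitivity, accuracy, dice
    ```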