• Acta Optica Sinica
  • Vol. 42, Issue 18, 1828004 (2022)
Hui Gao, Xiaodong Yan, Heng Zhang*, Yiting Niu, and Jiaqi Wang
Author Affiliations
  • Institute of Geospatial Information, Strategic Support Force Information Engineering University, Zhengzhou 450001, Henan, China
    DOI: 10.3788/AOS202242.1828004
    Hui Gao, Xiaodong Yan, Heng Zhang, Yiting Niu, Jiaqi Wang. Multi-Scale Sea-Land Segmentation Method for Remote Sensing Images Based on Res2Net[J]. Acta Optica Sinica, 2022, 42(18): 1828004

    Abstract

    The sea-land segmentation of remote sensing images has significant application value in coastline extraction, island and reef identification, and near-shore target detection. However, complex and diverse background environments and weak sea-land boundaries often lead to inaccurate segmentation of the sea-land boundary. To address this problem, a multi-scale sea-land segmentation network for remote sensing images based on Res2Net, named MSRNet, is proposed in this paper. It uses the deep convolutional neural network Res2Net to extract multi-scale image features and applies a squeeze-and-attention module to enhance the features at each scale, thereby strengthening the weak sea-land boundary information. The enhanced feature maps at different scales are then upsampled to the original image size, and the predictions at different scales are supervised with additional losses under a deep supervision strategy. Two remote sensing image datasets containing complex scenes are selected for the experiments, and the results show that, for various natural and artificial coastlines, the proposed network obtains more accurate sea-land segmentation results and clearer, more complete sea-land boundaries than other networks.
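    The abstract describes the overall pipeline (Res2Net multi-scale feature extraction, squeeze-and-attention enhancement per scale, upsampling of each scale's prediction, and deep supervision). The following is a minimal PyTorch sketch of that pipeline, not the authors' released code: the class names MSRNetSketch and SqueezeAttention, the simplified attention design, the use of the timm library's res2net50_26w_4s backbone, and the loss weighting are illustrative assumptions.

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import timm  # assumed available; provides Res2Net backbones


    class SqueezeAttention(nn.Module):
        """Simplified squeeze-and-attention block: a conv branch re-weighted
        by an attention branch computed on a downsampled (squeezed) map."""
        def __init__(self, in_ch, out_ch):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
                nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))
            self.attn = nn.Sequential(
                nn.AvgPool2d(2),
                nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
                nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
                nn.Conv2d(out_ch, out_ch, 1), nn.Sigmoid())

        def forward(self, x):
            y = self.conv(x)
            a = self.attn(x)
            a = F.interpolate(a, size=y.shape[-2:], mode='bilinear', align_corners=False)
            return y * a + a  # re-weight features and keep the attention response


    class MSRNetSketch(nn.Module):
        """Sketch of a multi-scale segmentation network: Res2Net encoder,
        squeeze-and-attention enhancement per scale, per-scale prediction
        heads upsampled to the input size for deep supervision."""
        def __init__(self, num_classes=2):
            super().__init__()
            # features_only=True returns feature maps at several strides
            self.encoder = timm.create_model(
                'res2net50_26w_4s', pretrained=False, features_only=True)
            chs = self.encoder.feature_info.channels()
            self.enhance = nn.ModuleList(SqueezeAttention(c, 64) for c in chs)
            self.heads = nn.ModuleList(nn.Conv2d(64, num_classes, 1) for _ in chs)
            self.fuse = nn.Conv2d(num_classes * len(chs), num_classes, 1)

        def forward(self, x):
            size = x.shape[-2:]
            feats = self.encoder(x)
            preds = []
            for f, sa, head in zip(feats, self.enhance, self.heads):
                p = head(sa(f))
                # upsample each scale's prediction to the original image size
                preds.append(F.interpolate(p, size=size, mode='bilinear',
                                           align_corners=False))
            fused = self.fuse(torch.cat(preds, dim=1))
            return fused, preds  # fused map plus side outputs for deep supervision


    def deep_supervision_loss(fused, side_preds, target, side_weight=0.4):
        """Cross-entropy on the fused output plus weighted losses on each side
        output, following the deep supervision idea; the weight is illustrative."""
        loss = F.cross_entropy(fused, target)
        for p in side_preds:
            loss = loss + side_weight * F.cross_entropy(p, target)
        return loss
    ```

    In this sketch the attention branch is pooled, convolved, and upsampled back to re-weight the feature branch at every scale, which is one plausible way to realize the boundary-strengthening role the abstract assigns to the squeeze-and-attention module.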