• Laser & Optoelectronics Progress
  • Vol. 56, Issue 15, 151502 (2019)
Feifei Shi**, Songlong Zhang, and Li Peng*
Author Affiliations
  • Engineering Research Center of Internet of Things Technology Applications of the Ministry of Education, College of Internet of Things Engineering, Jiangnan University, Wuxi, Jiangsu 214122, China
    DOI: 10.3788/LOP56.151502
    Feifei Shi, Songlong Zhang, Li Peng. Salient Object Detection Based on Deep Residual Networks and Edge Supervised Learning[J]. Laser & Optoelectronics Progress, 2019, 56(15): 151502

    Abstract

    This paper proposes a saliency detection method based on deep residual networks and multiscale edge residual learning to address the problems of low saliency values and blurred edges in images with complex backgrounds. An edge residual block is proposed, and an edge residual network is constructed on top of the deep residual network using this block for edge-supervised learning of the saliency map. In addition, edge features are learned during training by constructing a three-category model over background, foreground, and edge classes, which makes the object edges more accurate. At the output, atrous convolutions are used to build a multiscale atrous convolution unit that integrates and extracts multiscale features on the basis of global information. Finally, the proposed algorithm is evaluated in an ablation study on two datasets (SED2 and ECSSD) and compared with several existing algorithms using common evaluation metrics. The experimental results demonstrate that the proposed method achieves high precision and recall, preserves the integrity of the salient object, and clearly distinguishes the salient object from the background along the edge contours.
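    The abstract does not specify an implementation framework, so the following is a minimal PyTorch sketch of what an edge residual block could look like: a standard residual unit whose output is also tapped by a side branch that predicts a one-channel edge map for supervision. The class name, channel layout, and `edge_head` branch are assumptions for illustration, not the authors' published architecture.

```python
import torch
import torch.nn as nn

class EdgeResidualBlock(nn.Module):
    """Illustrative edge residual block (assumed structure): a residual
    unit plus a side branch producing edge logits for supervised learning."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
        # Side branch: one-channel edge map used as an auxiliary target.
        self.edge_head = nn.Conv2d(channels, 1, 1)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = self.relu(out + x)     # residual (identity) connection
        edge = self.edge_head(out)   # edge logits for edge supervision
        return out, edge
```

    Stacking such blocks along a deep residual backbone would yield edge predictions at multiple depths, which is one plausible reading of the "edge residual network" described above.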
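    The three-category model can be read as a per-pixel three-class classification over background, foreground, and edge. A minimal sketch of such supervision, assuming class indices 0/1/2 and a standard cross-entropy loss (the actual loss and label encoding are not given in the abstract):

```python
import torch
import torch.nn as nn

# Assumed labeling: 0 = background, 1 = foreground (salient object), 2 = edge.
criterion = nn.CrossEntropyLoss()

logits = torch.randn(2, 3, 64, 64, requires_grad=True)  # (N, classes, H, W)
labels = torch.randint(0, 3, (2, 64, 64))               # per-pixel class index
loss = criterion(logits, labels)
loss.backward()  # gradients flow back to the (here random) logits
```

    Treating the edge as its own class forces the network to model the boundary region explicitly, which is consistent with the claim that the object edges become more accurate.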
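    The multiscale atrous convolution unit at the output is described only at a high level; a common realization of this idea runs parallel dilated convolutions at several rates and fuses them, in the spirit of ASPP. The sketch below assumes 3×3 branches with dilation rates (1, 2, 4, 8) and a 1×1 fusion convolution; all of these hyperparameters are illustrative.

```python
import torch
import torch.nn as nn

class MultiScaleAtrousUnit(nn.Module):
    """Illustrative multiscale atrous convolution unit (assumed design):
    parallel dilated 3x3 branches whose outputs are concatenated and fused."""
    def __init__(self, in_ch, out_ch, rates=(1, 2, 4, 8)):
        super().__init__()
        # padding == dilation keeps the spatial size for a 3x3 kernel.
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r)
            for r in rates
        ])
        # 1x1 convolution integrates the concatenated multiscale features.
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, 1)

    def forward(self, x):
        feats = [branch(x) for branch in self.branches]
        return self.fuse(torch.cat(feats, dim=1))
```

    Larger dilation rates enlarge the receptive field without downsampling, which matches the stated goal of extracting multiscale features while retaining global information.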