• Infrared and Laser Engineering
  • Vol. 49, Issue 5, 20200015 (2020)
Lin Sen1,2,3,*, Liu Shiben1, and Tang Yandong2,3
Author Affiliations
  • 1[in Chinese]
  • 2[in Chinese]
  • 3[in Chinese]
    DOI: 10.3788/irla20200015
    Lin Sen, Liu Shiben, Tang Yandong. Multi-input fusion adversarial network for underwater image enhancement[J]. Infrared and Laser Engineering, 2020, 49(5): 20200015

    Abstract

    To address the low contrast, color deviation, and blurred details of underwater images, a multi-input fusion adversarial network was proposed for underwater image enhancement. The main feature of this method is that the generative network adopts an encoder-decoder structure, filtering noise through the convolution layers, recovering lost details through the deconvolution layers, and refining the image pixel by pixel. Firstly, the original image was preprocessed to obtain two derived images: a color-corrected version and a contrast-enhanced version. Secondly, confidence maps describing the difference between the two enhanced images and the original image were learned by the generative network. Then, to reduce the artifacts and detail blur introduced by the two enhancement algorithms during generator learning, a texture extraction unit was added to extract texture features from the two enhanced images, and the extracted texture features were fused with the corresponding confidence maps. Finally, the enhanced underwater image was obtained by constructing multiple loss functions and iteratively training the adversarial network. Experimental results show that the enhanced underwater images have vivid colors and improved contrast, with average UCIQE and NIQE values of 0.639 9 and 3.727 3, respectively. Compared with other algorithms, the proposed algorithm shows significant advantages, demonstrating its effectiveness.
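    The following is a minimal PyTorch sketch, not the authors' released code, of the multi-input fusion generator described in the abstract: an encoder-decoder predicts per-pixel confidence maps for the two pre-enhanced inputs, and texture-extraction units supply the texture features that are fused with those maps. Layer widths, kernel sizes, and the exact fusion rule are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class TextureUnit(nn.Module):
    """Assumed texture-extraction unit: a small conv block that pulls
    edge/texture features from one pre-enhanced input."""
    def __init__(self, ch=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)


class FusionGenerator(nn.Module):
    """Encoder-decoder generator: convolutions filter noise, deconvolutions
    recover detail, and the decoder predicts per-pixel confidence maps for
    the color-corrected and contrast-enhanced inputs."""
    def __init__(self, ch=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(9, ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ch, ch * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(ch * 2, ch, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(ch, 2, 4, stride=2, padding=1),  # 2 confidence maps
        )
        self.texture_cc = TextureUnit()  # texture features of the color-corrected input
        self.texture_ce = TextureUnit()  # texture features of the contrast-enhanced input

    def forward(self, raw, color_corrected, contrast_enhanced):
        # Learn confidence maps from the raw image and both enhanced versions.
        maps = self.decoder(self.encoder(
            torch.cat([raw, color_corrected, contrast_enhanced], dim=1)))
        conf = torch.softmax(maps, dim=1)
        w_cc, w_ce = conf[:, 0:1], conf[:, 1:2]
        # Fuse each confidence map with the texture features of its input.
        fused = w_cc * self.texture_cc(color_corrected) + \
                w_ce * self.texture_ce(contrast_enhanced)
        return torch.clamp(fused, 0.0, 1.0)


if __name__ == "__main__":
    g = FusionGenerator()
    x = torch.rand(1, 3, 256, 256)
    print(g(x, x, x).shape)  # torch.Size([1, 3, 256, 256])
```

    In this sketch the three inputs are concatenated along the channel axis and the two confidence maps are normalized with a softmax so they act as per-pixel fusion weights; the adversarial discriminator and the multiple loss terms mentioned in the abstract are omitted.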