• Laser & Optoelectronics Progress
  • Vol. 60, Issue 2, 0210008 (2023)
Feiyan Yang1,2 and Meng Wang1,2,*
Author Affiliations
  • 1Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, Yunnan, China
  • 2Key Laboratory of Artificial Intelligence in Yunnan Province, Kunming 650500, Yunnan, China
    DOI: 10.3788/LOP212808
    Feiyan Yang, Meng Wang. Infrared and Visible Image Fusion Based on Structure-Texture Decomposition and VGG Deep Networks[J]. Laser & Optoelectronics Progress, 2023, 60(2): 0210008

    Abstract

    To address the problems of underutilization of low-frequency information and the easy mixing of high-frequency details with noise in current infrared and visible image fusion methods, an infrared and visible image fusion method based on structure-texture (ST) decomposition and a VGG deep network is proposed. First, the input image is decomposed into high- and low-frequency subbands using mean filtering, and ST decomposition is introduced to re-decompose the low-frequency subband. The structure and texture layers are pre-fused by the absolute-maximum rule and neighborhood spatial frequency, respectively. Subsequently, the input image is fed into the VGG network to obtain multi-layer feature maps, and the Sigmoid function is used to realize the normalized pre-fusion of the high-frequency subband. Finally, the pre-fused high-frequency, low-frequency structure, and low-frequency texture subbands are used for image fusion and reconstruction. Experimental results show that the proposed algorithm can fuse the deep detail features of images, retain texture details, and suppress noise effectively, and it has significant advantages in noise assessment, structural similarity index measure, mean square error, peak signal-to-noise ratio, and other objective indexes.
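
    The fusion pipeline outlined in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the ST decomposition of the low-frequency subband is approximated here by a second, stronger mean filter, and the VGG/Sigmoid weighting of the high-frequency subband is replaced by a plain absolute-maximum rule; all function names, kernel sizes, and window sizes are illustrative assumptions.

    import numpy as np
    import cv2

    def decompose(img, ksize=31):
        """Split an image into low- and high-frequency subbands via mean filtering."""
        low = cv2.blur(img.astype(np.float32), (ksize, ksize))
        high = img.astype(np.float32) - low
        return low, high

    def fuse_abs_max(a, b):
        """Pre-fuse two subbands by keeping the coefficient with larger absolute value."""
        return np.where(np.abs(a) >= np.abs(b), a, b)

    def spatial_frequency(img, win=9):
        """Neighborhood spatial frequency: local RMS of horizontal/vertical differences."""
        dx = np.diff(img, axis=1, prepend=img[:, :1]) ** 2
        dy = np.diff(img, axis=0, prepend=img[:1, :]) ** 2
        return np.sqrt(cv2.blur(dx + dy, (win, win)))

    def fuse_by_sf(a, b, win=9):
        """Pre-fuse texture layers by choosing the pixel with higher neighborhood spatial frequency."""
        return np.where(spatial_frequency(a, win) >= spatial_frequency(b, win), a, b)

    def fuse(ir, vis):
        # 1. Mean-filter decomposition of each input into low/high-frequency subbands.
        low_ir, high_ir = decompose(ir)
        low_vis, high_vis = decompose(vis)
        # 2. Low-frequency subbands: re-decompose into structure and texture
        #    (approximated here by a second mean filter), then pre-fuse the
        #    structure by absolute maximum and the texture by neighborhood
        #    spatial frequency.
        struct_ir, tex_ir = decompose(low_ir, ksize=61)
        struct_vis, tex_vis = decompose(low_vis, ksize=61)
        low_fused = fuse_abs_max(struct_ir, struct_vis) + fuse_by_sf(tex_ir, tex_vis)
        # 3. High-frequency subbands: in the paper these are weighted by
        #    Sigmoid-normalized multi-layer VGG feature maps; a simple
        #    absolute-maximum rule stands in for that step here.
        high_fused = fuse_abs_max(high_ir, high_vis)
        # 4. Reconstruction: sum the fused subbands to form the final image.
        return np.clip(low_fused + high_fused, 0, 255).astype(np.uint8)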