• Acta Photonica Sinica
  • Vol. 51, Issue 4, 0410002 (2022)
Zhishe WANG1,*, Wenyu SHAO1, Fengbao YANG2, and Yanlin CHEN1
Author Affiliations
  • 1School of Applied Science, Taiyuan University of Science and Technology, Taiyuan 030024, China
  • 2School of Information and Communication Engineering, North University of China, Taiyuan 030051, China
DOI: 10.3788/gzxb20225104.0410002
Zhishe WANG, Wenyu SHAO, Fengbao YANG, Yanlin CHEN. Infrared and Visible Image Fusion Method via Interactive Attention-based Generative Adversarial Network[J]. Acta Photonica Sinica, 2022, 51(4): 0410002
Fig. 1. The principle of interactive attention-based generative adversarial network
Fig. 2. Interactive attention fusion model
Fig. 3. The subjective comparison results of five fusion models
Fig. 4. The subjective comparison results of different fusion methods on the TNO dataset
Fig. 5. The subjective comparison results of different fusion methods for FLIR_06422
Fig. 6. The subjective comparison results of different fusion methods for FLIR_07210
Models    | AG      | SD       | MI      | PC      | NCIE    | SF       | MS_SSIM | VIF
No_atten  | 5.46433 | 29.89471 | 1.94238 | 0.23718 | 0.80427 | 10.01008 | 0.87268 | 0.39557
Only_CA   | 5.15356 | 36.36113 | 2.65749 | 0.30827 | 0.80642 | 9.62020  | 0.89271 | 0.40920
Only_SA   | 4.64928 | 37.82780 | 2.96802 | 0.31678 | 0.80741 | 8.80706  | 0.87970 | 0.40327
SA_CA     | 5.54757 | 37.04794 | 2.95820 | 0.31543 | 0.80746 | 10.55443 | 0.87566 | 0.42609
Ours      | 5.57204 | 36.76819 | 3.04712 | 0.30984 | 0.80791 | 10.58810 | 0.86314 | 0.42642
Table 1. The objective comparison results of five fusion models on the TNO dataset
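The column abbreviations are standard image-fusion metrics: AG (average gradient), SD (standard deviation), MI (mutual information), PC (phase congruency), NCIE (nonlinear correlation information entropy), SF (spatial frequency), MS_SSIM (multi-scale structural similarity), and VIF (visual information fidelity). As an illustrative sketch using the textbook definitions (not the authors' evaluation code), AG and SF can be computed as follows:

```python
import numpy as np

def average_gradient(img):
    """AG: mean local gradient magnitude; higher values indicate more detail."""
    img = img.astype(np.float64)
    dx = img[:-1, 1:] - img[:-1, :-1]   # horizontal differences
    dy = img[1:, :-1] - img[:-1, :-1]   # vertical differences
    return np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0))

def spatial_frequency(img):
    """SF: combines row and column frequencies to measure overall activity."""
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean((img[:, 1:] - img[:, :-1]) ** 2))  # row frequency
    cf = np.sqrt(np.mean((img[1:, :] - img[:-1, :]) ** 2))  # column frequency
    return np.sqrt(rf ** 2 + cf ** 2)
```

Both functions are zero for a constant image and grow with edge strength, which is why a sharper fused image scores higher on AG and SF in the tables below.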
Methods   | AG      | SD       | MI      | PC      | NCIE    | SF       | MS_SSIM | VIF
WLS       | 5.41771 | 24.86027 | 1.74286 | 0.28099 | 0.80391 | 10.79384 | 0.90988 | 0.37866
DenseFuse | 3.19331 | 22.85769 | 2.03589 | 0.29028 | 0.80451 | 6.09443  | 0.87696 | 0.33090
IFCNN     | 5.46753 | 24.06712 | 1.80191 | 0.27868 | 0.80404 | 10.13141 | 0.90474 | 0.38920
SEDRFuse  | 3.54411 | 40.79302 | 2.11014 | 0.17227 | 0.80462 | 6.79446  | 0.89505 | 0.31682
U2Fusion  | 5.61515 | 33.59608 | 1.76099 | 0.26120 | 0.80392 | 10.43655 | 0.91982 | 0.42078
PMGI      | 4.69536 | 34.76816 | 2.10801 | 0.24311 | 0.80482 | 8.99978  | 0.88796 | 0.39059
FusionGAN | 3.07357 | 26.82044 | 2.16652 | 0.10264 | 0.80503 | 5.98247  | 0.73449 | 0.24869
GANMcC    | 3.13983 | 29.92973 | 2.10864 | 0.23271 | 0.80452 | 6.00963  | 0.85915 | 0.30510
RFN-Nest  | 3.12521 | 34.85373 | 1.92851 | 0.23390 | 0.80428 | 6.01269  | 0.91217 | 0.35510
Ours      | 5.57204 | 36.76819 | 3.04712 | 0.30984 | 0.80791 | 10.58810 | 0.86314 | 0.42642
AMIR      | 34.548% | 21.416%  | 54.385% | 33.056% | 0.436%  | 33.735%  | -1.670% | 22.384%
Table 2. The objective comparison results of different fusion methods on the TNO dataset
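The AMIR row appears to be the average metric improvement rate of the proposed method over the nine comparison methods: for each metric, (Ours − mean of the others) / mean of the others × 100%. A minimal sketch (the function name `amir` is illustrative) that reproduces the AG entry of Table 2 from its transcribed column:

```python
import numpy as np

def amir(ours, others):
    """Improvement rate of `ours` over the mean of `others`, in percent."""
    baseline = np.mean(others)
    return (ours - baseline) / baseline * 100.0

# AG values of the nine comparison methods on the TNO dataset (Table 2)
ag_others = [5.41771, 3.19331, 5.46753, 3.54411, 5.61515,
             4.69536, 3.07357, 3.13983, 3.12521]
ag_ours = 5.57204

print(round(amir(ag_ours, ag_others), 3))  # → 34.548, matching the AMIR row
```

The same calculation applied to the other columns reproduces the remaining AMIR entries, including the negative value for MS_SSIM, where the proposed method trails the average of the comparison methods.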
Methods   | AG      | SD       | MI      | PC      | NCIE    | SF       | MS_SSIM | VIF
WLS       | 6.11913 | 24.22050 | 1.47549 | 0.24428 | 0.80325 | 11.29725 | 0.90643 | 0.30924
DenseFuse | 3.81431 | 22.71599 | 1.57994 | 0.25893 | 0.80341 | 7.13029  | 0.87369 | 0.28636
IFCNN     | 6.65162 | 23.56342 | 1.50051 | 0.23727 | 0.80325 | 12.16838 | 0.90203 | 0.31565
SEDRFuse  | 4.99349 | 38.47649 | 2.05920 | 0.25253 | 0.80431 | 9.80358  | 0.93491 | 0.31676
U2Fusion  | 5.96161 | 34.03164 | 1.59396 | 0.25971 | 0.80342 | 11.55453 | 0.93611 | 0.33902
PMGI      | 5.27455 | 34.23081 | 2.08061 | 0.21498 | 0.80435 | 9.98753  | 0.87601 | 0.31894
FusionGAN | 3.11554 | 25.35511 | 2.43153 | 0.08294 | 0.80550 | 6.40457  | 0.67009 | 0.20764
GANMcC    | 4.33390 | 33.61744 | 1.99460 | 0.22822 | 0.80404 | 8.18066  | 0.90037 | 0.27832
RFN-Nest  | 3.56869 | 35.23955 | 1.68792 | 0.26475 | 0.80365 | 7.05725  | 0.92503 | 0.30443
Ours      | 5.77583 | 39.15366 | 3.25220 | 0.29974 | 0.80797 | 11.44250 | 0.85793 | 0.32906
AMIR      | 18.593% | 29.815%  | 78.433% | 32.005% | 0.505%  | 23.208%  | -2.565% | 10.656%
Table 3. The objective comparison results of different fusion methods on the Nato-camp sequence
Methods   | AG      | SD       | MI      | PC      | NCIE    | SF       | MS_SSIM | VIF
WLS       | 5.88314 | 29.05659 | 2.36124 | 0.34241 | 0.80611 | 12.34723 | 0.88522 | 0.31146
DenseFuse | 3.65256 | 27.11548 | 2.77646 | 0.35099 | 0.80731 | 7.67922  | 0.83136 | 0.26634
IFCNN     | 5.96453 | 28.81846 | 2.48370 | 0.33482 | 0.80636 | 12.52183 | 0.87355 | 0.29326
SEDRFuse  | 5.33739 | 46.91059 | 3.01721 | 0.28003 | 0.80734 | 11.11536 | 0.89557 | 0.27669
U2Fusion  | 6.51443 | 45.09818 | 3.00163 | 0.34845 | 0.80795 | 13.72465 | 0.91277 | 0.31535
PMGI      | 5.02704 | 45.02031 | 2.68132 | 0.30432 | 0.80845 | 10.47262 | 0.89409 | 0.28620
FusionGAN | 3.91909 | 39.88701 | 2.67207 | 0.11038 | 0.80726 | 8.45070  | 0.73231 | 0.19009
GANMcC    | 4.07688 | 42.78794 | 2.80520 | 0.28049 | 0.80681 | 8.45123  | 0.86217 | 0.24364
RFN-Nest  | 3.83042 | 44.21013 | 2.88768 | 0.26281 | 0.80680 | 8.11565  | 0.87661 | 0.26558
Ours      | 6.07137 | 39.12491 | 3.41626 | 0.40135 | 0.80896 | 13.06280 | 0.82599 | 0.33178
AMIR      | 23.610% | 0.923%   | 24.547% | 38.148% | 0.224%  | 26.580%  | -4.247% | 21.948%
Table 4. The objective comparison results of different fusion methods on the Roadscene dataset
Dataset   | WLS   | DenseFuse | IFCNN | SEDRFuse | U2Fusion | PMGI  | FusionGAN | GANMcC | RFN-Nest | Ours
TNO       | 2.359 | 0.085     | 0.046 | 1.148    | 1.722    | 0.588 | 2.184     | 4.445  | 0.177    | 0.151
Nato-camp | 0.455 | 0.029     | 0.015 | 1.158    | 0.659    | 0.181 | 0.678     | 1.329  | 0.068    | 0.059
Roadscene | 1.087 | 0.048     | 0.024 | 1.137    | 0.992    | 0.293 | 1.135     | 2.271  | 0.096    | 0.087
Table 5. The comparison results of computation efficiency for different fusion methods (unit: s)