• Acta Optica Sinica
  • Vol. 39, Issue 11, 1110003 (2019)
XiaoLing Zhou1,2 and Zetao Jiang1,*
Author Affiliations
  • 1Guangxi Key Laboratory of Image and Graphic Intelligent Processing, Guilin University of Electronic Technology, Guilin, Guangxi 541004, China
  • 2College of Electronic Information and Automation, Guilin University of Aerospace Technology, Guilin, Guangxi 541004, China
    DOI: 10.3788/AOS201939.1110003
    XiaoLing Zhou, Zetao Jiang. Infrared and Visible Image Fusion Combining Pulse-Coupled Neural Network and Guided Filtering[J]. Acta Optica Sinica, 2019, 39(11): 1110003

    Abstract

    This study proposes a fusion method that combines a pulse-coupled neural network (PCNN) with guided filtering to address the lack of detail and the virtual shadows that arise when fusing infrared and visible images. First, to eliminate the extra noise caused by a pixel firing multiple times in the firing-time matrix T, the traditional PCNN model is improved by simplifying its structure and adding restraint terms to the pulse-generating unit. Second, with the original images serving as guidance, guided filtering is applied to T so that it carries more edge detail and salient information. Finally, a weighted fusion rule based on the modified T produces the fused image. Building on an analysis of the PCNN firing mechanism, a new constraint-based parameter-setting method is proposed to reduce the complexity of setting the model's parameters. Experimental results show that the proposed method yields efficient, satisfactory, and well-detailed fusion results with scarcely any visible virtual shadows, and that the cross-entropy and spatial-frequency indexes of the results are superior to those of other current fusion methods.
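
    As a rough illustration of the pipeline the abstract describes, the sketch below refines firing-time maps with a guided filter and then fuses the source images by weighting. It assumes the firing-time matrices T_ir and T_vis have already been produced by the improved PCNN; the classic guided filter (He et al.) and the simple normalized-weight rule used here are illustrative stand-ins, not the paper's exact restraint terms or fusion weights.

    # Minimal sketch, assuming T_ir and T_vis come from the improved PCNN.
    # The guided filter and the normalized-weight rule below are illustrative choices.
    import cv2
    import numpy as np

    def guided_filter(guide, src, radius=8, eps=1e-3):
        """Classic guided filter; guide and src are float32 images in [0, 1]."""
        ksize = (2 * radius + 1, 2 * radius + 1)
        box = lambda x: cv2.boxFilter(x, cv2.CV_32F, ksize)
        mean_I, mean_p = box(guide), box(src)
        var_I = box(guide * guide) - mean_I * mean_I
        cov_Ip = box(guide * src) - mean_I * mean_p
        a = cov_Ip / (var_I + eps)          # per-pixel linear coefficients
        b = mean_p - a * mean_I
        return box(a) * guide + box(b)

    def fuse(ir, vis, T_ir, T_vis, radius=8, eps=1e-3):
        """Refine each firing-time map with its source image as guide, then fuse."""
        ir = ir.astype(np.float32) / 255.0
        vis = vis.astype(np.float32) / 255.0
        W_ir = guided_filter(ir, T_ir.astype(np.float32), radius, eps)
        W_vis = guided_filter(vis, T_vis.astype(np.float32), radius, eps)
        W = W_ir / (W_ir + W_vis + 1e-6)    # normalized weight map (assumed rule)
        fused = W * ir + (1.0 - W) * vis
        return np.clip(fused * 255.0, 0, 255).astype(np.uint8)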