Acta Optica Sinica, Vol. 37, Issue 11, 1110004 (2017)
Xianhong Liu and Zhibin Chen*
Author Affiliations
  • Mechanical Institute of Technology, Mechanical Engineering College, Shijiazhuang, Hebei 050003, China
DOI: 10.3788/AOS201737.1110004
Xianhong Liu, Zhibin Chen. Fusion of Infrared and Visible Images Based on Multi-Scale Directional Guided Filter and Convolutional Sparse Representation[J]. Acta Optica Sinica, 2017, 37(11): 1110004

    Abstract

A new multi-scale directional guided filter image fusion method, built on the guided filter and the nonsubsampled directional filter bank, is proposed. The method combines edge-preserving ability with the extraction of directional information, and can therefore capture useful information from the source images more effectively. The low-frequency subbands obtained by the multi-scale directional guided filter contain both low-frequency approximation components and strong edge components, which are separated by a Gaussian filter. The low-frequency approximation components are fused with convolutional sparse representation, and the strong edge components are fused with an adaptive regional energy rule. The detail directional subbands are fused via a strategy combining saliency and guided filtering to preserve spatial consistency. Experimental results demonstrate that the proposed method effectively extracts the target feature information and preserves the background information of the source images. The fused results achieve better subjective visual quality and better objective evaluation metrics.
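The abstract describes a multi-stage pipeline: decompose each source image with a guided-filter-based multi-scale transform, split the low-frequency subband into an approximation part and a strong-edge part with a Gaussian filter, fuse each part with its own rule, and recombine with the fused detail subbands. The sketch below illustrates only the overall structure; the single-level decomposition, the filter radius and sigma, and the simple averaging / local-energy / absolute-max rules are placeholder assumptions standing in for the paper's convolutional sparse representation, adaptive regional energy, and saliency-plus-guided-filter fusion steps.

```python
# Minimal illustrative sketch of the fusion pipeline (not the authors' code).
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def guided_filter(guide, src, radius=8, eps=1e-3):
    """Edge-preserving smoothing of `src` steered by `guide` (He et al. style)."""
    size = 2 * radius + 1
    mean_g = uniform_filter(guide, size)
    mean_s = uniform_filter(src, size)
    cov_gs = uniform_filter(guide * src, size) - mean_g * mean_s
    var_g = uniform_filter(guide * guide, size) - mean_g ** 2
    a = cov_gs / (var_g + eps)
    b = mean_s - a * mean_g
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

def decompose(img):
    """One-level guided-filter decomposition: base (low-frequency) + detail."""
    base = guided_filter(img, img)
    return base, img - base

def fuse(ir, vis, sigma=2.0):
    base_ir, det_ir = decompose(ir)
    base_vis, det_vis = decompose(vis)
    # Split each base layer into a smooth approximation and strong edges
    # with a Gaussian filter, as the abstract describes.
    approx_ir = gaussian_filter(base_ir, sigma)
    approx_vis = gaussian_filter(base_vis, sigma)
    edge_ir, edge_vis = base_ir - approx_ir, base_vis - approx_vis
    # Placeholder rules: averaging stands in for the CSR fusion of the
    # approximations; local energy selects strong edges; absolute-max selects
    # detail coefficients instead of the saliency/guided-filter rule.
    approx_f = 0.5 * (approx_ir + approx_vis)
    energy_ir = uniform_filter(edge_ir ** 2, 7)
    energy_vis = uniform_filter(edge_vis ** 2, 7)
    edge_f = np.where(energy_ir >= energy_vis, edge_ir, edge_vis)
    det_f = np.where(np.abs(det_ir) >= np.abs(det_vis), det_ir, det_vis)
    return approx_f + edge_f + det_f

if __name__ == "__main__":
    ir = np.random.rand(128, 128)   # stand-ins for registered source images
    vis = np.random.rand(128, 128)
    print(fuse(ir, vis).shape)      # (128, 128)
```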