Laser & Optoelectronics Progress, Vol. 59, Issue 16, 1611005 (2022)
Lü Huanhuan1,2, Zhuolu Wang1, and Hui Zhang2,*
Author Affiliations
  • 1School of Software, Liaoning Technical University, Huludao 125105, Liaoning, China
  • 2School of Information Engineering, Huzhou University, Huzhou 313000, Zhejiang, China
    DOI: 10.3788/LOP202259.1611005
    Lü Huanhuan, Zhuolu Wang, Hui Zhang. Hyperspectral Image Classification Based on Edge-Preserving Filter and Deep Residual Network[J]. Laser & Optoelectronics Progress, 2022, 59(16): 1611005

    Abstract

    Owing to the strong correlation between hyperspectral image bands, the high complexity of spectral and spatial structures, and the limited number of training samples, we propose a hyperspectral image classification method based on an edge-preserving filter and a deep residual network. First, joint bilateral filtering is applied to enhance the edge structure of ground objects and extract high-quality spatial features, which are then fused with the spectral features to obtain spectral-spatial features. Next, a two-dimensional convolutional neural network is constructed and extended into a deep residual network by adding skip connections between convolutional layers. This model extracts deep spectral-spatial features, which are fed to a softmax classifier. Experiments comparing the proposed method with related state-of-the-art methods on two datasets show that it alleviates the overfitting that arises in convolutional neural network classification and exploits the important role of the edge structure of ground objects, significantly improving the classification accuracy of hyperspectral images.
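
    The following is a minimal sketch, not the authors' code, of the pipeline the abstract describes: joint bilateral filtering to obtain edge-preserved spatial features, fusion with the spectral bands, and a small 2D residual CNN ending in a softmax classifier. The use of PCA components as the filter guide, the patch-based input, the layer widths, and the filter parameters are illustrative assumptions; PyTorch and opencv-contrib-python (for cv2.ximgproc.jointBilateralFilter) are assumed available.

    ```python
    import numpy as np
    import torch
    import torch.nn as nn
    import cv2  # requires opencv-contrib-python for cv2.ximgproc


    def edge_preserved_features(hsi, n_pc=3, d=9, sigma_color=0.1, sigma_space=9.0):
        """Joint bilateral filtering of leading PCA bands, guided by the first component.

        hsi: float array of shape (H, W, bands). Returns the spectral bands stacked
        with the edge-preserved spatial features (H, W, bands + n_pc).
        """
        h, w, b = hsi.shape
        x = hsi.reshape(-1, b).astype(np.float32)
        x -= x.mean(axis=0)
        # Thin SVD gives the principal directions; keep the leading components
        _, _, vt = np.linalg.svd(x, full_matrices=False)
        pcs = (x @ vt[:n_pc].T).reshape(h, w, n_pc)
        # First principal component, rescaled to [0, 1], serves as the guidance image
        guide = cv2.normalize(np.ascontiguousarray(pcs[..., 0]), None, 0.0, 1.0, cv2.NORM_MINMAX)
        filtered = [
            cv2.ximgproc.jointBilateralFilter(
                guide, np.ascontiguousarray(pcs[..., i]), d, sigma_color, sigma_space)
            for i in range(n_pc)
        ]
        # Fuse edge-preserved spatial features with the original spectral bands
        return np.concatenate([hsi.astype(np.float32), np.stack(filtered, axis=-1)], axis=-1)


    class ResidualBlock(nn.Module):
        """Two 3x3 convolutions with a skip (identity) connection."""

        def __init__(self, channels):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1), nn.BatchNorm2d(channels), nn.ReLU(),
                nn.Conv2d(channels, channels, 3, padding=1), nn.BatchNorm2d(channels))

        def forward(self, x):
            return torch.relu(self.body(x) + x)


    class SpectralSpatialResNet(nn.Module):
        """2D residual CNN over spatial patches whose channels are the fused features."""

        def __init__(self, in_channels, n_classes, width=64, n_blocks=3):
            super().__init__()
            self.stem = nn.Sequential(nn.Conv2d(in_channels, width, 3, padding=1), nn.ReLU())
            self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(n_blocks)])
            self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(width, n_classes))  # logits for softmax

        def forward(self, patches):
            return self.head(self.blocks(self.stem(patches)))
    ```

    In such a setup, patches centered on labeled pixels would be cut from the fused cube and the network trained with nn.CrossEntropyLoss, which applies the softmax internally; the skip connections in each block are what turn the plain 2D CNN into the residual model the abstract refers to.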