[2] Sui C, Tian Y, Xu Y, et al. Unsupervised band selection by integrating the overall accuracy and redundancy[J]. IEEE Geoscience and Remote Sensing Letters, 2015, 12(1): 185-189.
[3] Chang C I. Hyperspectral Data Processing: Algorithm Design and Analysis[M]. Hoboken, NJ: Wiley-Interscience, 2013.
[4] Zhang Aiwu, Du Nan, Kang Xiaoyan, et al. Hyperspectral adaptive band selection method through nonlinear transform and information adjacency correlation[J]. Infrared and Laser Engineering, 2017, 46(5): 0538001. (in Chinese)
[5] Gu Y, Zhang Y. Unsupervised subspace linear spectral mixture analysis for hyperspectral images[C]//Proceedings of the International Conference on Image Processing (ICIP 2003). IEEE, 2003, 1: 801-804.
[6] Wasserstein R L, Lazar N A. The ASA's statement on p-values: context, process, and purpose[J]. The American Statistician, 2016, 70(2): 129-133.
[7] Nuzzo R. Statistical errors[J]. Nature, 2014, 506(7487): 150-152.
[8] Press W H, Teukolsky S A, Vetterling W T, et al. Numerical Recipes in C: The Art of Scientific Computing[M]. 2nd ed. Cambridge: Cambridge University Press, 1992.
[9] Jiao Licheng, Zhao Jin, Yang Shuyuan, et al. Research advances on sparse cognitive learning, computing and recognition[J]. Chinese Journal of Computers, 2016, 39(4): 835-852. (in Chinese)
[10] Liu Chunhong, Zhao Chunhui, Zhang Lingyan. A new method of hyperspectral remote sensing image dimensional reduction[J]. Journal of Image and Graphics, 2005, 10(2): 218-222. (in Chinese)
[11] He X, Cai D, Niyogi P. Laplacian score for feature selection[C]//Proceedings of the International Conference on Neural Information Processing Systems. 2005: 507-514.
[12] Roffo G, Melzi S, Cristani M. Infinite feature selection[C]//Proceedings of the IEEE International Conference on Computer Vision. IEEE, 2015: 4202-4210.
[13] Roffo G, Melzi S. Ranking to learn: feature ranking and selection via eigenvector centrality[M]//New Frontiers in Mining Complex Patterns. Berlin, Heidelberg: Springer, 2017: 19-35.