• Photonic Sensors
  • Vol. 8, Issue 2, 134 (2018)
Tiezhu QIAO1, Lulu CHEN1,*, Yusong PANG2, and Gaowei YAN3
Author Affiliations
  • 1Key Lab of Advanced Transducers and Intelligent Control System, Ministry of Education and Shanxi Province, Taiyuan University of Technology, Taiyuan, 030024, China
  • 2Section of Transport Engineering and Logistics, Faculty of 3mE, Delft University of Technology, Mekelweg 2, 2628CD Delft, Netherlands
  • 3College of Information Engineering, Taiyuan University of Technology, Taiyuan, 030024, China
    DOI: 10.1007/s13320-018-0401-4
    Tiezhu QIAO, Lulu CHEN, Yusong PANG, Gaowei YAN. Integrative Multi-Spectral Sensor Device for Far-Infrared and Visible Light Fusion[J]. Photonic Sensors, 2018, 8(2): 134.
    References

    [1] Y. Le Maoult, T. Sentenac, J. J. Orteu, and J. P. Arcens, “Fire detection: a new approach based on a low cost CCD camera in the near infrared,” Process Safety & Environmental Protection, 2007, 85(3): 193-206.

    [2] B. C. Ko, K. H. Cheong, and J. Y. Nam, “Fire detection based on vision sensor and support vector machines,” Fire Safety Journal, 2009, 44(3): 322-329.

    [3] H. T. Chen, Y. C. Wu, and C. C. Hsu, “Daytime preceding vehicle brake light detection using monocular vision,” IEEE Sensors Journal, 2015, 16(1): 120-131.

    [4] Y. Li, Y. L. Qiao, and Y. Ruichek, “Multiframe-based high dynamic range monocular vision system for advanced driver assistance systems,” IEEE Sensors Journal, 2015, 15(10): 5433-5441.

    [5] V. Milanés, D. F. Llorca, J. Villagrá, J. Pérez, C. Fernandez, I. Parra, et al., “Intelligent automatic overtaking system using vision for vehicle detection,” Expert Systems with Applications, 2012, 39(3): 3362-3373.

    [6] B. Z. Jia, R. Liu, and M. Zhu, “Real-time obstacle detection with motion features using monocular vision,” Visual Computer, 2015, 31(3): 281-293.

    [7] S. C. Yi, Y. C. Chen, and C. H. Chang, “A lane detection approach based on intelligent vision,” Computers & Electrical Engineering, 2015, 42(C): 23-29.

    [8] Y. S. Lee, Y. M. Chan, and L. C. Fu, “Near-infrared-based nighttime pedestrian detection using grouped part models,” IEEE Transactions on Intelligent Transportation Systems, 2015, 16(4): 1929-1940.

    [9] R. O'Malley, E. Jones, and M. Glavin, “Detection of pedestrians in far-infrared automotive night vision using region-growing and clothing distortion compensation,” Infrared Physics & Technology, 2010, 53(6): 439-449.

    [10] C. J. Liu, Y. Zhang, K. K. Tan, and H. Y. Yang, “Sensor fusion method for horizon detection from an aircraft in low visibility conditions,” IEEE Transactions on Instrumentation and Measurement, 2014, 63(3): 620-627.

    [11] Y. Chen, L. Wang, Z. B. Sun, Y. D. Jiang, and G. J. Zhai, “Fusion of color microscopic images based on bidimensional empirical mode decomposition,” Optics Express, 2010, 18(21): 21757-21769.

    [12] J. F. Zhao, Q. Zhou, Y. T. Chen, H. J. Feng, Z. H. Xu, and Q. Li, “Fusion of visible and infrared images using saliency analysis and detail preserving based image decomposition,” Infrared Physics & Technology, 2013, 56(2): 93-99.

    [13] R. Shen, I. Cheng, and A. Basu, “Cross-scale coefficient selection for volumetric medical image fusion,” IEEE Transactions on Biomedical Engineering, 2013, 60(4): 1069-1079.

    [14] X. Z. Bai, F. G. Zhou, and B. D. Xue, “Fusion of infrared and visual images through region extraction by using multi scale center-surround top-hat transform,” Optics Express, 2011, 19(9): 8444-8457.

    [15] S. G. Kong, J. Heo, F. Boughorbel, Y. Zheng, B. Abidi, A. Koschan, et al., “Multiscale fusion of visible and thermal IR images for illumination-invariant face recognition,” International Journal of Computer Vision, 2007, 71(2): 215-233.

    [16] D. M. Bulanon, T. F. Burks, and V. Alchanatis, “Image fusion of visible and thermal images for fruit detection,” Biosystems Engineering, 2009, 103(1): 12-22.

    [17] D. P. Bavirisetti and R. Dhuli, “Fusion of infrared and visible sensor images based on anisotropic diffusion and karhunen-loeve transform,” IEEE Sensors Journal, 2016, 16(1): 203-209.

    [18] C. Beyan, A. Yigit, and A. Temizel, “Fusion of thermal- and visible-band video for abandoned object detection,” Journal of Electronic Imaging, 2011, 20(3): 033001-1-033001-13.

    [19] A. Toet and M. A. Hogervorst, “Portable real-time color night vision,” Proceedings of SPIE, 2008, 6974: 697402-1-697402-12.

    [20] A. Toet and M. A. Hogervorst, “Progress in color night vision,” Optical Engineering, 2012, 51(1): 010901-1-010901-19.

    [21] A. Toet, M. A. Hogervorst, R. V. Son, and J. Dijk, “Augmenting full color fused multiband night vision imagery with synthetic imagery for enhanced situational awareness,” International Journal of Image and Data Fusion, 2011, 2(4): 287-308.

    [22] N. R. Nelson and P. S. Barry, “Measurement of Hyperion MTF from on-orbit scenes,” in Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS '01), Sydney, Australia, 2001, pp. 2967-2969.

    [23] H. Du and K. J. Voss, “Effects of point-spread function on calibration and radiometric accuracy of CCD camera,” Applied Optics, 2004, 43(3): 665-670.

    [24] F. Bu, “Study on modeling and simulation of optical remote sensing system and image processing technology,” Ph.D. dissertation, University of Chinese Academy of Sciences, Beijing, China, 2014.

    [25] B. Ding, “Hyperspectral imaging system model implementation and analysis,” Ph.D. dissertation, Chester F. Carlson Center for Imaging Science, Rochester Institute of Technology, Rochester, New York, USA, 2014.
