• Laser & Optoelectronics Progress
  • Vol. 56, Issue 23, 231009 (2019)
Zhihua Qu**, Yiming Shao, Tianmin Deng*, Jie Zhu, and Xiaohua Song
Author Affiliations
  • School of Traffic and Transportation, Chongqing Jiaotong University, Chongqing 400074, China
    DOI: 10.3788/LOP56.231009
    Zhihua Qu, Yiming Shao, Tianmin Deng, Jie Zhu, Xiaohua Song. Traffic Sign Detection and Recognition Under Complicated Lighting Conditions[J]. Laser & Optoelectronics Progress, 2019, 56(23): 231009

    Abstract

    Herein, we investigate solutions to the low detection precision and missed detections that mainstream detection algorithms exhibit for traffic signs under low illumination or strong illumination variation. We propose an improved Adaboost ensemble algorithm based on a multi-component transformation of image keypoint features, which reduces the sensitivity of sample images to illumination changes. The proposed algorithm extracts image keypoints and builds weak classifiers from them, strengthening its robustness to noise and partial occlusion. A multi-scale feature fusion algorithm is then used to classify and recognize the traffic signs. The German traffic sign datasets (GTSDB and GTSRB) and a self-built dataset are used to verify the performance of the proposed algorithm. The results show that, on all three datasets, the proposed algorithm achieves higher detection and recognition rates than the existing algorithms it is compared against. For traffic sign images under low illumination, the detection accuracy of the proposed algorithm reaches 94.96%, indicating good robustness in complicated lighting environments.
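
    For illustration only, the sketch below outlines one way such a keypoint-plus-Adaboost detection stage could be assembled; it is not the authors' implementation. ORB keypoints, luminance histogram equalization, and scikit-learn's AdaBoostClassifier (whose default weak learner is a decision stump) stand in for the paper's multi-component keypoint-feature transformation and improved Adaboost ensemble, and the helper names keypoint_features and train_sign_detector are hypothetical.

    # Illustrative sketch only: ORB + histogram equalization + scikit-learn
    # Adaboost as stand-ins for the paper's keypoint transformation and
    # improved Adaboost ensemble.
    import cv2
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    def keypoint_features(image_bgr, n_keypoints=64):
        """Illumination-reduced keypoint descriptor vector (illustrative)."""
        # Equalize the luminance channel to lessen lighting sensitivity.
        ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)
        ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
        gray = cv2.cvtColor(cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR),
                            cv2.COLOR_BGR2GRAY)

        # ORB keypoints/descriptors as a stand-in for the paper's keypoint features.
        orb = cv2.ORB_create(nfeatures=n_keypoints)
        _, descriptors = orb.detectAndCompute(gray, None)

        # Pad or truncate to a fixed-length vector (n_keypoints x 32 bytes).
        feat = np.zeros((n_keypoints, 32), dtype=np.float32)
        if descriptors is not None:
            k = min(n_keypoints, descriptors.shape[0])
            feat[:k] = descriptors[:k]
        return feat.ravel()

    def train_sign_detector(positive_images, negative_images):
        """Adaboost ensemble over keypoint features; scikit-learn's default
        weak learner is a depth-1 decision tree (a decision stump)."""
        X = np.array([keypoint_features(img)
                      for img in positive_images + negative_images])
        y = np.array([1] * len(positive_images) + [0] * len(negative_images))
        clf = AdaBoostClassifier(n_estimators=200)
        clf.fit(X, y)
        return clf

    In use, such a detector would be applied to candidate windows of a full scene image, with the multi-scale feature fusion stage described in the abstract handling the final classification of detected signs.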