Optoelectronics Letters, Vol. 17, Issue 9, 564 (2021)
Sheng LIU*, Jiayu SHEN, and Shengyue HUANG
Author Affiliations
College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou 310023, China
DOI: 10.1007/s11801-021-1005-6
LIU Sheng, SHEN Jiayu, HUANG Shengyue. Object detection in seriously degraded images with unbalanced training samples[J]. Optoelectronics Letters, 2021, 17(9): 564

    Abstract

Uncertain environments, especially uneven lighting and shadows, can seriously degrade an image, which has a strong negative impact on object detection. Moreover, unbalanced training samples can cause overfitting. Because data collected at night is much scarcer than data collected in the daytime, nighttime detection performance tends to be relatively poor. In this paper, we propose a novel data augmentation method named Mask Augmentation, which reduces the brightness and contrast of objects and weakens their edges to simulate degraded scenes. In addition, we propose a new architecture that adds a classification loss branch and a feature extraction module named the Multi-Feature Attention Module, which combines an attention mechanism with feature fusion on top of Darknet-53. This architecture makes the features extracted from daytime and nighttime images distinguishable. We also increase the loss weight of nighttime images during training. We achieved 78.68% mAP on nighttime detection and 73.14% mAP on daytime detection. Compared with other models, our method greatly improves nighttime detection accuracy while still performing satisfactorily on daytime detection. We deployed the model on an intelligent garbage collection robot for real-time detection, enabling automatic picking at night and assisting cleaning staff during the day.
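The abstract only outlines Mask Augmentation and the nighttime loss re-weighting, so the sketch below illustrates one plausible reading of those two ideas: darkening an object region, lowering its contrast, blurring its edges to simulate degradation, and up-weighting nighttime samples in the training loss. The function names, parameter values, and the blur-based edge weakening are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the Mask Augmentation idea: degrade the object inside a
# bounding box by lowering brightness and contrast and softening its edges.
# All names and default values here are assumptions for illustration only.
import cv2
import numpy as np

def mask_augment(image, box, brightness=0.6, contrast=0.7, blur_ksize=7):
    """Degrade the region given by box = (x1, y1, x2, y2)."""
    out = image.astype(np.float32)
    x1, y1, x2, y2 = box
    roi = out[y1:y2, x1:x2]

    # Reduce contrast around the region mean, then scale brightness down.
    mean = roi.mean(axis=(0, 1), keepdims=True)
    roi = (roi - mean) * contrast + mean   # lower contrast
    roi = roi * brightness                 # lower brightness

    # Weaken object edges with a Gaussian blur (an assumed mechanism).
    roi = cv2.GaussianBlur(roi, (blur_ksize, blur_ksize), 0)
    out[y1:y2, x1:x2] = roi
    return np.clip(out, 0, 255).astype(np.uint8)

def weighted_detection_loss(loss_per_image, is_night, night_weight=2.0):
    """Up-weight nighttime samples in the loss. The 2.0 factor is a
    placeholder; the paper does not state its value in the abstract."""
    weights = np.where(is_night, night_weight, 1.0)
    return float((loss_per_image * weights).mean())

# Usage: img_aug = mask_augment(img, (50, 80, 200, 240))
```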