• Laser & Optoelectronics Progress
  • Vol. 58, Issue 8, 0810019 (2021)
Chengshuo Cao and Jie Yuan*
Author Affiliations
  • School of Electrical Engineering, Xinjiang University, Urumchi, Xinjiang 830047, China
    DOI: 10.3788/LOP202158.0810019
    Chengshuo Cao, Jie Yuan. Mask-Wearing Detection Method Based on YOLO-Mask[J]. Laser & Optoelectronics Progress, 2021, 58(8): 0810019
    References

    [1] Xia S, Li J, Ni Z L. The impact of the COVID-19 epidemic on China's social and economic development[J]. Financial Supervision, 5-9(2020).

    [2] Yu Q M, Zheng D H. Study on the normalization of epidemic prevention and control of the coronavirus disease 2019 (COVID-19)[J]. China Public Security (Academy Edition), 65-68(2020).

    [3] Zhou Y P, Jiang Y, Rao H et al. Investigation on status quo using masks among the public during the outbreak of COVID-19[J]. Chinese Nursing Research, 34, 2041-2044(2020).

    [4] Niu Z D, Qin T, Li H D et al. Improved algorithm of RetinaFace for natural scene mask wear detection[J]. Computer Engineering and Applications, 56, 1-7(2020).

    [5] Girshick R, Donahue J, Darrell T et al. Rich feature hierarchies for accurate object detection and semantic segmentation[C]. //Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 23-28, 2014, Columbus, OH, USA, 580-587(2014).

    [6] Girshick R. Fast R-CNN[C]. //2015 IEEE International Conference on Computer Vision (ICCV), December 7-13, 2015, Santiago, Chile, 1440-1448(2015).

    [7] Ren S Q, He K M, Girshick R et al. Faster R-CNN: towards real-time object detection with region proposal networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39, 1137-1149(2017).

    [8] Liu W, Anguelov D, Erhan D et al. SSD: single shot MultiBox detector[M]. //Computer Vision-ECCV 2016. Cham: Springer, 21-37(2016).

    [9] Redmon J, Divvala S, Girshick R et al. You only look once: unified, real-time object detection[C]. //2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 27-30, 2016, Las Vegas, NV, USA, 779-788(2016).

    [10] Redmon J, Farhadi A. YOLO9000: better, faster, stronger[C]. //2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), July 21-26, 2017, Honolulu, HI, USA, 6517-6525(2017).

    [11] Redmon J, Farhadi A. YOLOv3: an incremental improvement[C]. //2018 IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 1-6(2018).

    [12] Lin T Y, Dollár P, Girshick R et al. Feature pyramid networks for object detection[C]. //2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), July 21-26, 2017, Honolulu, HI, USA, 936-944(2017).

    [13] Guo J X, Liu L B, Xu F et al. Airport scene aircraft detection method based on YOLOv3[J]. Laser & Optoelectronics Progress, 56, 191003(2019).

    [14] Lyu S, Cai X, Feng R. YOLOv3 network based on improved loss function[J]. Computer Systems & Applications, 28, 1-7(2019).

    [15] Li H B, Xu C Y, Hu C C. Improved real-time vehicle detection method based on YOLOv3[J]. Laser & Optoelectronics Progress, 57, 101507(2020).

    [16] Zhou J, Jing J F, Zhang H H et al. Real-time fabric defect detection algorithm based on S-YOLOv3 model[J]. Laser & Optoelectronics Progress, 57, 161001(2020).

    [17] Hu J, Shen L, Sun G et al. Squeeze-and-excitation networks[C]. //2018 IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 7132-7141(2018).

    [18] Liu S, Qi L, Qin H F et al. Path aggregation network for instance segmentation[C]. //2018 IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 8759-8768(2018).

    [19] Zheng Z H, Wang P, Liu W et al. Distance-IoU Loss: faster and better learning for bounding box regression[C]. //Proceedings of the 2020 AAAI Conference on Artificial Intelligence, February 7-12, 2020, New York, NY, USA(2020).
