• Laser & Optoelectronics Progress
  • Vol. 60, Issue 6, 0615002 (2023)
Yifei Chen, Yaoyi Cai*, and Shiwen Li
Author Affiliations
  • College of Engineering and Design, Hunan Normal University, Changsha 410083, Hunan, China
    DOI: 10.3788/LOP213293
    Yifei Chen, Yaoyi Cai, Shiwen Li. Working Condition Recognition Based on Lightweight Convolution Vision Transformer Network for Antimony Flotation Process[J]. Laser & Optoelectronics Progress, 2023, 60(6): 0615002

    Abstract

    Identifying antimony flotation conditions by manually observing the characteristics of flotation froth is highly subjective and error-prone, which severely restricts flotation performance, whereas computer-vision-based recognition is low-cost and effective. To address these problems, a working-condition recognition method for the antimony flotation process based on a lightweight convolutional vision Transformer (L-CVT) is proposed. A stack of Transformer layers replaces the matrix multiplication in standard convolution to learn global information, so that the local modeling of convolution is replaced with global modeling, and submodules from the lightweight neural network MobileNetV2 are introduced to reduce computational cost. The proposed method overcomes the convolutional neural network's (CNN's) neglect of long-range dependencies within flotation images and compensates for the vision Transformer's (ViT's) lack of inductive bias. Experimental results show that the proposed method achieves a condition-recognition accuracy of 93.56%, significantly higher than VGG16, ResNet18, AlexNet, and other mainstream networks, providing a useful reference for condition recognition on antimony flotation data.
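    The core idea of the abstract, treating every spatial position of a feature map as a token and letting self-attention model global (rather than local, kernel-sized) interactions, can be sketched as follows. This is a minimal illustrative sketch, not the authors' L-CVT implementation: the single attention head and the random projection matrices (`Wq`, `Wk`, `Wv`) stand in for trained weights, and all shapes are hypothetical.

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def global_attention_block(feat, rng=None):
        """Model global interactions over a feature map with single-head
        self-attention: every spatial position attends to every other,
        unlike a convolution, which only mixes a local neighborhood.
        Random projections stand in for learned weights (illustration only)."""
        h, w, c = feat.shape
        tokens = feat.reshape(h * w, c)              # each position is a token
        rng = np.random.default_rng(0) if rng is None else rng
        Wq, Wk, Wv = (rng.standard_normal((c, c)) / np.sqrt(c) for _ in range(3))
        q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
        attn = softmax(q @ k.T / np.sqrt(c))         # (hw, hw) global weights
        out = attn @ v                               # mix all positions at once
        return out.reshape(h, w, c)                  # fold tokens back to a map

    feat = np.random.default_rng(1).standard_normal((8, 8, 16))
    out = global_attention_block(feat)
    print(out.shape)                                 # same shape as the input
    ```

    In a MobileNetV2-style lightweight design, a block like this would sit between cheap depthwise/pointwise convolutions, so global modeling is applied to an already-downsampled feature map to keep the attention matrix small.
    
    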