• Laser & Optoelectronics Progress
  • Vol. 56, Issue 24, 241501 (2019)
Hengjie Yang, Zheng Yan, Zongling Wu, Dingbang Fang, and Fang Duan*
Author Affiliations
  • College of Information Science and Engineering, Huaqiao University, Xiamen, Fujian 361021, China
    DOI: 10.3788/LOP56.241501
    Hengjie Yang, Zheng Yan, Zongling Wu, Dingbang Fang, Fang Duan. Extraction Method of Interest Text in Image Based on Recurrent Neural Network[J]. Laser & Optoelectronics Progress, 2019, 56(24): 241501

    Abstract

    It is difficult to recognize a specific text of interest in an image using optical character recognition (OCR); in natural scenes in particular, the recognition results usually contain a large amount of noisy text. To address this problem, this study proposes a recurrent-neural-network-based model, termed bidirectional long short-term memory-conditional random field (BLSTM-CRF), for extracting texts of interest. First, a BLSTM network captures the context information of the token sequence obtained by OCR, yielding feature sequences. Second, the relationships between the model features and the tags are established by introducing a CRF, and the text of interest is then obtained from the tags. Experimental results indicate that the proposed method achieves an accuracy of 88.52% on the YNIDREAL dataset, an improvement of 16.39 percentage points over the CRF model, which demonstrates the feasibility and robustness of the proposed method.
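
    The pipeline described in the abstract (a BLSTM that produces per-token feature scores over the OCR sequence, a CRF layer that assigns tags, and extraction of the tokens tagged as text of interest) can be illustrated with a short sketch. The code below is a minimal illustration in PyTorch, not the authors' implementation: the tag set, network dimensions, and toy vocabulary are assumptions, and the CRF training objective (the partition-function loss) is omitted; only emission scoring and Viterbi-style decoding over learned transition scores are shown.

```python
# Minimal BLSTM-CRF sketch (illustrative only, not the paper's code).
# A BiLSTM emits per-token tag scores; a CRF-style Viterbi decode picks the
# best tag path using learned transition scores between tags.
import torch
import torch.nn as nn

TAGS = ["O", "B-INT", "I-INT"]  # hypothetical tag set: "INT" marks text of interest


class BiLSTMCRFSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128, num_tags=len(TAGS)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM captures left and right context of the OCR token sequence.
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                            bidirectional=True, batch_first=True)
        # Linear layer maps LSTM features to per-tag emission scores.
        self.emit = nn.Linear(hidden_dim, num_tags)
        # CRF transition scores: transitions[i, j] = score of moving from tag i to tag j.
        self.transitions = nn.Parameter(torch.randn(num_tags, num_tags))

    def emissions(self, token_ids):
        feats, _ = self.lstm(self.embed(token_ids))   # (batch, seq, hidden_dim)
        return self.emit(feats)                       # (batch, seq, num_tags)

    def viterbi_decode(self, emissions):
        # Decode the single best tag sequence for one sentence of shape (seq, num_tags).
        seq_len, num_tags = emissions.shape
        score = emissions[0]                          # scores for the first token
        backpointers = []
        for t in range(1, seq_len):
            # total[i, j] = score[i] + transitions[i, j] + emissions[t, j]
            total = score.unsqueeze(1) + self.transitions + emissions[t].unsqueeze(0)
            score, best_prev = total.max(dim=0)       # maximize over previous tag i
            backpointers.append(best_prev)
        best_tag = int(score.argmax())
        path = [best_tag]
        for best_prev in reversed(backpointers):      # backtrack through pointers
            best_tag = int(best_prev[best_tag])
            path.append(best_tag)
        return list(reversed(path))


# Toy usage: tag a 5-token OCR line and report the predicted tag per token.
model = BiLSTMCRFSketch(vocab_size=1000)
token_ids = torch.randint(0, 1000, (1, 5))
emissions = model.emissions(token_ids)[0]
tag_ids = model.viterbi_decode(emissions)
print([TAGS[t] for t in tag_ids])
```

    In such a setup, the OCR tokens whose predicted tags fall in the "interest" classes would be kept and concatenated to form the extracted text, while tokens tagged "O" would be discarded as noise.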