• Journal of Electronic Science and Technology
  • Vol. 22, Issue 3, 100260 (2024)
Jiang Wu1,2, Yi Shi2, Shun Yan3, and Hong-Mei Yan1,2,*
Author Affiliations
  • 1Yangtze Delta Region Institute (Huzhou), University of Electronic Science and Technology of China, Huzhou, 313001, China
  • 2MOE Key Laboratory for Neuroinformation, School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu, 611731, China
  • 3College of Engineering, University of California at Santa Barbara, Santa Barbara, 93106, USA
    DOI: 10.1016/j.jnlest.2024.100260
    Jiang Wu, Yi Shi, Shun Yan, Hong-Mei Yan. Global-local combined features to detect pain intensity from facial expression images with attention mechanism[J]. Journal of Electronic Science and Technology, 2024, 22(3): 100260

    Abstract

    The estimation of pain intensity is critical for the medical diagnosis and treatment of patients. With the development of image monitoring technology and artificial intelligence, automatic pain assessment based on facial expression and behavioral analysis shows potential value in clinical applications. This paper reports a convolutional neural network framework with a global and local attention mechanism (GLA-CNN) for the effective detection of pain intensity at four levels from facial expression images. GLA-CNN comprises two modules, the global attention network (GANet) and the local attention network (LANet). LANet extracts representative features from local facial patches, while GANet extracts whole-face features to compensate for the correlative information between patches that LANet ignores. Finally, the globally correlated and locally subtle features are fused for the estimation of pain intensity. Experiments on the UNBC-McMaster Shoulder Pain database demonstrate that GLA-CNN outperforms other state-of-the-art methods. Additionally, a visualization analysis of the feature maps of GLA-CNN intuitively shows that it extracts not only local pain features but also globally correlated facial ones. Our study demonstrates that pain assessment based on facial expression is a non-invasive and feasible method that can be employed as an auxiliary pain assessment tool in clinical practice.
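    The abstract describes a two-branch architecture in which whole-face (global) and patch-level (local) features are extracted separately and then fused for classification. As a rough illustration only, the PyTorch-style sketch below shows one way such a global-local fusion could be wired; the backbone layers, feature dimensions, and concatenation-based fusion are assumptions for illustration, not the authors' exact implementation.

```python
# Minimal PyTorch sketch of a two-branch global/local network with feature
# fusion, in the spirit of the GLA-CNN described in the abstract.
# Backbone choice, feature dimensions, and concatenation-based fusion are
# illustrative assumptions, not the authors' exact configuration.
import torch
import torch.nn as nn


class GlobalLocalFusionNet(nn.Module):
    def __init__(self, num_classes: int = 4, feat_dim: int = 128):
        super().__init__()
        # GANet stand-in: convolutions over the whole face image.
        self.global_branch = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # LANet stand-in: same structure, but meant to be fed
        # attention-weighted local patches rather than the raw image.
        self.local_branch = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # Fuse global and local features, then predict the pain level.
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, face: torch.Tensor, patches: torch.Tensor) -> torch.Tensor:
        g = self.global_branch(face)     # whole-face (global) features
        l = self.local_branch(patches)   # patch-level (local) features
        return self.classifier(torch.cat([g, l], dim=1))


# Example: a batch of 2 RGB face images and their patch composites.
logits = GlobalLocalFusionNet()(torch.randn(2, 3, 128, 128),
                                torch.randn(2, 3, 128, 128))
print(logits.shape)  # torch.Size([2, 4]) -- one score per pain level
```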
    $ \theta =\tan^{-1}\dfrac{\mathrm{d}y}{\mathrm{d}x} $ (1)
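    Eq. (1) recovers an angle from vertical and horizontal offsets. In face-image preprocessing, a formula of this form is commonly used to estimate the in-plane rotation of the face (e.g., from the line joining the two eye centers) so the crop can be aligned; that usage is an assumption here, not stated in the excerpt. A minimal NumPy illustration:

```python
# Illustration of Eq. (1): recovering an in-plane rotation angle from the
# horizontal and vertical offsets between two reference points (for example,
# the left and right eye centers when aligning a face crop).
# The use of eye centers is an assumption for illustration only.
import numpy as np

left_eye = np.array([52.0, 64.0])    # (x, y) -- hypothetical landmark
right_eye = np.array([96.0, 58.0])   # (x, y) -- hypothetical landmark

dx, dy = right_eye - left_eye
theta = np.arctan2(dy, dx)           # tan^-1(dy/dx), robust when dx == 0
print(np.degrees(theta))             # rotation to undo, in degrees
```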


    $ \mathbf{w}_{i}=\sigma \left({\omega}_{i}\right)\times {\boldsymbol{\beta}}_{i} $ (2)


    $ f_{L}\left(\omega \right)={\bigcup}_{i=1}^{25}\mathbf{w}_{i} $ (3)
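    Eqs. (2) and (3) describe the local branch: each of the 25 patch features β_i is scaled by a sigmoid-squashed weight σ(ω_i), and the weighted patches are combined (∪) into the local representation f_L. The NumPy sketch below assumes the union denotes concatenation of the weighted patch features, which is an interpretation rather than a detail confirmed by the excerpt.

```python
# Sketch of Eqs. (2)-(3): each of the 25 patch features beta_i is scaled by a
# sigmoid-squashed weight sigma(omega_i), and the weighted patches are
# combined into the local feature f_L. Treating the union as concatenation
# is an assumption for illustration.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
patch_feats = rng.normal(size=(25, 64))    # beta_i: 25 patches x 64-d features
omega = rng.normal(size=25)                # learnable patch importance scores

weights = sigmoid(omega)                   # sigma(omega_i) in Eq. (2)
weighted = weights[:, None] * patch_feats  # Eq. (2): w_i = sigma(omega_i) * beta_i
f_L = np.concatenate(weighted, axis=0)     # Eq. (3): union of the 25 weighted patches
print(f_L.shape)                           # (1600,) = 25 * 64
```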


    $ y_{k}=\dfrac{1}{h\times w}\displaystyle\sum_{i=1}^{h}\sum_{j=1}^{w}m_{k}\left(i,\ j\right) $ (4)
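    Eq. (4) is channel-wise global average pooling: every h×w feature map m_k is reduced to a single descriptor y_k. A brief NumPy check of the formula:

```python
# Eq. (4) is global average pooling: each h x w feature map m_k collapses to
# a single value y_k, giving one descriptor per channel.
import numpy as np

feature_maps = np.random.rand(16, 7, 7)    # 16 channels of size h=7, w=7
y = feature_maps.mean(axis=(1, 2))         # y_k = (1/(h*w)) * sum_ij m_k(i, j)
print(y.shape)                             # (16,) -- one value per channel
```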


    $ \mathbf{M}''=\mathbf{M}\oplus \left(\mathbf{V}\otimes \mathbf{M}'\right) $ (5)
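    Eq. (5) refines the feature maps with attention in a residual fashion: the attention map V rescales M′ elementwise (⊗) and the result is added (⊕) back to the original maps M. The sketch below assumes ⊕ and ⊗ are elementwise addition and multiplication with broadcasting over channels:

```python
# Eq. (5): M'' = M (+) (V (x) M'), read here as a residual attention update --
# the attention map V rescales M' elementwise and the result is added back to M.
# Interpreting (+)/(x) as elementwise add/multiply is an assumption.
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(16, 7, 7))        # original feature maps
M_prime = rng.normal(size=(16, 7, 7))  # transformed feature maps
V = rng.uniform(size=(1, 7, 7))        # spatial attention map, broadcast over channels

M_refined = M + V * M_prime            # Eq. (5)
print(M_refined.shape)                 # (16, 7, 7)
```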


    $ L_{k}=-\dfrac{1}{N}\displaystyle\sum_{i=0}^{N-1}\log \dfrac{e^{\mathbf{W}_{y_{i}}^{\left(k\right)\mathrm{T}}v_{i}^{\left(k\right)}+b_{y_{i}}^{\left(k\right)}}}{\displaystyle\sum_{j=0}^{C-1}e^{\mathbf{W}_{j}^{\left(k\right)\mathrm{T}}v_{i}^{\left(k\right)}+b_{j}^{\left(k\right)}}} $ (6)
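    Eq. (6) is the softmax cross-entropy loss averaged over N samples for branch k, with class weights W^(k) and biases b^(k). A compact NumPy version with toy shapes:

```python
# Eq. (6): softmax cross-entropy averaged over a batch of N samples for one
# branch k. Shapes and values here are toy examples.
import numpy as np

rng = np.random.default_rng(2)
N, C, D = 4, 4, 128                      # batch size, classes (pain levels), feature dim
V = rng.normal(size=(N, D))              # branch features v_i^(k)
W = rng.normal(size=(D, C))              # classifier weights W^(k)
b = rng.normal(size=C)                   # classifier biases b^(k)
y = np.array([0, 2, 1, 3])               # ground-truth pain levels y_i

logits = V @ W + b                                       # W_j^T v_i + b_j for every class j
logits -= logits.max(axis=1, keepdims=True)              # for numerical stability
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
L_k = -log_probs[np.arange(N), y].mean()                 # Eq. (6)
print(L_k)
```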

