Multi-scale feature interaction pseudo-label unsupervised domain adaptation for person re-identification

Opto-Electronic Engineering, Vol. 52, Issue 1, 240238 (2025)

Zhongmin Liu1,*, Fujun Yang1, and Wenjin Hu2
Author Affiliations
  • 1Department of Electrical Engineering and Information Engineering, Lanzhou University of Technology, Lanzhou, Gansu 730050, China
  • 2College of Mathematics and Computer Science, Northwest Minzu University, Lanzhou, Gansu 730030, China
    DOI: 10.12086/oee.2025.240238
    Zhongmin Liu, Fujun Yang, Wenjin Hu. Multi-scale feature interaction pseudo-label unsupervised domain adaptation for person re-identification[J]. Opto-Electronic Engineering, 2025, 52(1): 240238

    Abstract

    To address the issues of insufficient receptive fields and weak connections between global and local features in unsupervised domain-adaptive person re-identification, a multi-scale feature interaction method was proposed. First, a feature squeeze attention mechanism compressed the image features before feeding them into the network, enriching the representation of local information. Second, a residual feature interaction module encoded global information into the features through interaction, enlarging the model's receptive field and strengthening its ability to extract pedestrian features. Finally, a bottleneck module based on partial convolution performed convolution on only part of the input channels, reducing redundant computation and improving the efficiency of spatial feature extraction. Experimental results on three domain-adaptation datasets show that, compared with the baseline, the method reached mAP of 82.9%, 68.7%, and 26.6%, Rank-1 of 93.7%, 82.7%, and 54.7%, and Rank-5 of 97.4%, 89.9%, and 67.5%, respectively, demonstrating that the proposed method yields better pedestrian feature representations and higher recognition accuracy.
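
    As a rough illustration of the partial-convolution bottleneck described above, the sketch below convolves only a subset of the input channels and passes the rest through untouched. This is a minimal sketch under assumed design choices: the module name, channel split ratio, and layer arrangement are illustrative and do not reflect the authors' actual implementation.

    import torch
    import torch.nn as nn

    class PartialConvBottleneck(nn.Module):
        """Illustrative bottleneck that applies spatial convolution to only
        part of the input channels (the partial-convolution idea); split
        ratio and layers are assumptions, not the paper's design."""

        def __init__(self, channels: int, part_ratio: float = 0.25):
            super().__init__()
            # number of channels that actually go through the 3x3 conv
            self.part = max(1, int(channels * part_ratio))
            self.pconv = nn.Conv2d(self.part, self.part,
                                   kernel_size=3, padding=1, bias=False)
            # pointwise mixing over all channels, as in a typical bottleneck
            self.mix = nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=1, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            part, rest = torch.split(
                x, [self.part, x.shape[1] - self.part], dim=1)
            part = self.pconv(part)               # spatial conv on a channel subset
            out = torch.cat([part, rest], dim=1)  # remaining channels pass through
            return self.mix(out) + x              # residual connection

    # usage: a 64-channel feature map; only 16 channels see the 3x3 conv
    feat = torch.randn(2, 64, 32, 32)
    block = PartialConvBottleneck(64)
    print(block(feat).shape)  # torch.Size([2, 64, 32, 32])

    Convolving only a fraction of the channels keeps the 3x3 spatial cost proportional to the split ratio, which is where the claimed reduction in redundant computation comes from.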