• Journal of Terahertz Science and Electronic Information Technology
  • Vol. 21, Issue 2, 216 (2023)
LIU Kaiming1,2,* and ZHANG Jin3
Author Affiliations
  • 1[in Chinese]
  • 2[in Chinese]
  • 3[in Chinese]
    DOI: 10.11805/tkyda2020472
    LIU Kaiming, ZHANG Jin. Image matching method based on variance constraint coupled geometric invariance[J]. Journal of Terahertz Science and Electronic Information Technology, 2023, 21(2): 216

    Abstract

    Many current image matching algorithms focus on measuring the distance between key points while ignoring the structural information of the images, and are therefore prone to mismatches. This paper presents an image matching algorithm based on variance constraints coupled with geometric invariance. The Forstner operator is used to compute an interest value for each pixel and detect image features. The gradient information of the image is computed to obtain direction values. The circular neighborhood of each image feature is partitioned into fan-shaped sub-domains. Based on the direction values, the feature vector of each feature is obtained by computing gray-level invariant moments over the fan-shaped sub-domains. A region variance function is introduced to capture the structural information of the image; it is added to the feature matching process to constrain the Euclidean distance measurements and complete feature matching. Finally, the matched features are refined using the geometric invariance between matching points to obtain accurate matching results. Experimental results show that, compared with existing matching techniques, the proposed algorithm achieves higher matching accuracy: up to 96.56%, 95.38% and 93.52% for non-transformed, zoomed and rotated images, respectively.
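    The core idea of constraining Euclidean-distance matching with a region variance check can be sketched as below. This is a minimal illustration, not the authors' implementation: the function names, the relative-variance tolerance `var_tol`, and the Lowe-style ratio test are all assumptions, since the abstract only states that Euclidean distance measurement is constrained by a region variance function.

    ```python
    import numpy as np

    def region_variance(patch):
        # Gray-level variance of the local region around a feature point;
        # a simple stand-in for the paper's region variance function.
        return float(np.var(patch))

    def match_features(desc_a, desc_b, patches_a, patches_b,
                       var_tol=0.15, ratio=0.8):
        """Nearest-neighbour matching on Euclidean distance between
        descriptors, rejecting a candidate pair when the region variances
        of the two patches differ by more than var_tol (relative)."""
        matches = []
        for i, d in enumerate(desc_a):
            dists = np.linalg.norm(desc_b - d, axis=1)
            j = int(np.argmin(dists))
            # Ratio test against the second-nearest neighbour
            # (an assumption; not stated in the abstract).
            sorted_d = np.sort(dists)
            if len(dists) > 1 and sorted_d[0] > ratio * sorted_d[1]:
                continue
            # Variance constraint: structural information of the two
            # regions must agree before the match is accepted.
            va = region_variance(patches_a[i])
            vb = region_variance(patches_b[j])
            if abs(va - vb) / max(va, vb, 1e-9) <= var_tol:
                matches.append((i, j))
        return matches
    ```

    A match that wins on descriptor distance alone is still discarded if its surrounding regions have very different gray-level variance, which is how the structural constraint suppresses the mismatches described above. The paper's final geometric-invariance refinement step (filtering matches by the invariant relations between matched point pairs) would run after this stage.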