• Journal of Terahertz Science and Electronic Information Technology
  • Vol. 20, Issue 10, 1038 (2022)
ZHU Tao1,2,*, MA Huimin3, CHAI Houqing2, and ZHANG Shenghu1
Author Affiliations
  • 1[in Chinese]
  • 2[in Chinese]
  • 3[in Chinese]
    DOI: 10.11805/tkyda2020336
    ZHU Tao, MA Huimin, CHAI Houqing, ZHANG Shenghu. Detection and processing of abnormal visual measurements in Visual-Inertial Odometry[J]. Journal of Terahertz Science and Electronic Information Technology, 2022, 20(10): 1038.

    Abstract

    For Visual-Inertial Odometry (VIO), complex scenes such as visual occlusion and moving objects may produce abnormal visual measurements, which cause a dramatic drop in positioning accuracy. To address this, a new method is proposed for detecting and processing abnormal visual measurements in VIO. First, abnormal visual measurements are detected and classified by selecting a detection index, setting a prior threshold, and designing a detection classifier. Then, a multi-sensor fusion strategy and an adaptive error-weighting algorithm are proposed to promptly eliminate the influence of abnormal visual measurements that are inconsistent with the actual motion. Finally, the detection and processing algorithms are integrated into Open Keyframe-based Visual-Inertial SLAM (OKVIS), yielding the Error Detection and Solution of Visual-Inertial Odometry (EDS-VIO) framework. Evaluation on a simulated complex-scene dataset shows that EDS-VIO outperforms OKVIS, reducing the average localization error from 1.045 m to 0.437 m. EDS-VIO thus improves the positioning accuracy and robustness of VIO in complex scenes.
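    The abstract outlines a two-stage pipeline: a detection index compared against a prior threshold classifies each visual measurement, and an adaptive weight then attenuates measurements flagged as abnormal before multi-sensor fusion. The C++ sketch below illustrates one possible reading of that pipeline; the choice of reprojection error as the detection index, the threshold value, and the quadratic attenuation are assumptions made for illustration, not the paper's actual formulation or OKVIS code.

        // Hypothetical sketch of abnormal-visual-measurement detection and
        // adaptive down-weighting, loosely following the steps in the abstract
        // (detection index, prior threshold, classifier, adaptive weighting).
        // All names, values, and formulas are illustrative assumptions.
        #include <iostream>
        #include <vector>

        struct VisualMeasurement {
            double reprojection_error;  // detection index: reprojection error in pixels
        };

        // Classifier: a measurement is flagged abnormal when its detection index
        // exceeds the prior threshold.
        bool IsAbnormal(const VisualMeasurement& m, double prior_threshold_px) {
            return m.reprojection_error > prior_threshold_px;
        }

        // Adaptive weight: normal measurements keep full weight; abnormal ones are
        // attenuated so they cannot dominate the fused visual-inertial cost.
        double AdaptiveWeight(const VisualMeasurement& m, double prior_threshold_px) {
            if (!IsAbnormal(m, prior_threshold_px)) return 1.0;
            const double ratio = m.reprojection_error / prior_threshold_px;
            return 1.0 / (ratio * ratio);  // quadratic attenuation (illustrative choice)
        }

        int main() {
            const double threshold_px = 2.0;  // assumed prior threshold
            std::vector<VisualMeasurement> measurements{{0.8}, {1.9}, {6.5}, {12.0}};

            // Weighted visual cost; inertial terms would be added unchanged in the
            // multi-sensor fusion step.
            double visual_cost = 0.0;
            for (const auto& m : measurements) {
                const double w = AdaptiveWeight(m, threshold_px);
                visual_cost += w * m.reprojection_error * m.reprojection_error;
                std::cout << "error=" << m.reprojection_error
                          << "  abnormal=" << IsAbnormal(m, threshold_px)
                          << "  weight=" << w << '\n';
            }
            std::cout << "weighted visual cost = " << visual_cost << '\n';
            return 0;
        }

    In this sketch, measurements with reprojection errors of 6.5 px and 12.0 px are classified as abnormal and contribute with sharply reduced weights, mimicking the effect the abstract attributes to the adaptive error-weighting step.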