Acta Optica Sinica, Vol. 37, Issue 11, 1115001 (2017)

Stereo Visual Odometry Based on Motion Object Detection in the Dynamic Scene

Zhilin Lin, Guoliang Zhang*, Erliang Yao, and Hui Xu
Author Affiliations
  • Department of Control Science and Engineering, Rocket Force Engineering University, Xi'an, Shaanxi 710025, China
DOI: 10.3788/AOS201737.1115001
Zhilin Lin, Guoliang Zhang, Erliang Yao, Hui Xu. Stereo Visual Odometry Based on Motion Object Detection in the Dynamic Scene[J]. Acta Optica Sinica, 2017, 37(11): 1115001

    Abstract

To improve the robustness and accuracy of visual odometry in dynamic scenes, a stereo visual odometry based on moving object detection is proposed. First, a scene flow calculation model that accounts for the camera pose is established to represent the motion vectors of objects. Second, a method for constructing virtual map points is proposed: on the one hand, moving object detection can be performed from the virtual map points and the scene flow; on the other hand, the virtual map points ensure that enough matched point pairs remain for pose estimation when moving objects occupy too large a proportion of the image. Finally, the feature points in the current frame are matched against both the local map points and the virtual map points, and a nonlinear optimization model that incorporates the virtual points is constructed from the matching results to estimate the camera pose. This not only prevents static map points from being matched with feature points on moving objects, but also avoids failure of the visual odometry when too few valid point pairs remain. Results of dataset experiments and online experiments in real scenes show that the proposed method improves the robustness and accuracy of visual odometry in dynamic scenes.
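
    The abstract gives no formulas, but the core test it describes, classifying a point as moving when its scene flow remains nonzero after compensating the camera's own motion, can be illustrated roughly. The sketch below is only an assumption of how such a test might look: the transform convention, the threshold value, and all function names are illustrative and are not taken from the paper.

```python
import numpy as np

def transform(T, p):
    """Apply a 4x4 homogeneous transform T to a 3D point p."""
    return (T @ np.append(p, 1.0))[:3]

def scene_flow(T_prev_to_cur, p_prev, p_cur):
    """Scene flow of a matched point pair after camera-motion compensation.

    T_prev_to_cur : 4x4 relative camera pose (previous -> current frame).
    p_prev, p_cur : the point's 3D coordinates triangulated from the stereo
                    pair in the previous and current camera frames.
    A static point has near-zero scene flow; a point on a moving object
    retains a residual motion vector.
    """
    return p_cur - transform(T_prev_to_cur, p_prev)

def classify_motion(T_prev_to_cur, pairs, thresh=0.05):
    """Split matched point pairs into static and moving sets.

    `thresh` (metres) is an illustrative value, not one from the paper.
    """
    static, moving = [], []
    for p_prev, p_cur in pairs:
        if np.linalg.norm(scene_flow(T_prev_to_cur, p_prev, p_cur)) < thresh:
            static.append((p_prev, p_cur))
        else:
            moving.append((p_prev, p_cur))
    return static, moving

# Example: a static point 1 m ahead while the camera advances 0.1 m.
T = np.eye(4); T[2, 3] = -0.1          # camera translated 0.1 m forward
p_prev = np.array([0.0, 0.0, 1.0])
p_cur  = np.array([0.0, 0.0, 0.9])
print(scene_flow(T, p_prev, p_cur))    # -> [0. 0. 0.], i.e. static
```

    In this reading, the static set feeds the pose optimization while the moving set is excluded; the paper's virtual map points would additionally supply stand-in static correspondences when moving objects dominate the image.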