• Infrared and Laser Engineering
  • Vol. 45, Issue 7, 726005 (2016)
Huang Nannan*, Liu Guixi, Zhang Yinzhe, and Yao Liyang
Author Affiliations
  • [in Chinese]
    DOI: 10.3788/irla201645.0726005
    Huang Nannan, Liu Guixi, Zhang Yinzhe, Yao Liyang. Unmanned aerial vehicle vision navigation algorithm[J]. Infrared and Laser Engineering, 2016, 45(7): 726005

    Abstract

    In order to ensure the accuracy and safety of unmanned aerial vehicle (UAV) landing, a pose-parameter calculation method based on visual navigation was proposed for UAV autonomous landing. Firstly, the airborne camera was calibrated to obtain the camera parameters. Then, considering the significant influence of landmark shape and size, the geometric distribution of corner points, and the number of points on pose estimation accuracy, a "T"-shaped landing landmark with given size parameters was designed; landmark contour extraction was combined with a corner detection algorithm to obtain eight corners with a good geometric distribution and a number of points suitable for pose estimation, guaranteeing the accuracy of the pose calculation. To reduce the processing time of stably tracking the landmark with the Lucas-Kanade (LK) optical flow method, the eight extracted corners were used as the input of the LK optical flow tracker, ensuring the real-time performance of the algorithm. Finally, the real-time flight pose parameters of the UAV were estimated through the projection relationship between 3D space and the 2D image plane. Simulation results show that the algorithm has high precision, with an average processing period of 76.756 ms (about 13 frames per second), which basically satisfies the real-time requirements of vision-aided navigation for autonomous landing at the low speeds of the landing stage.
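    The pipeline described in the abstract (calibrated camera, corner detection on a planar landmark, LK optical flow tracking, and pose recovery from the 3D-2D projection) can be illustrated with a minimal OpenCV sketch. This is not the authors' implementation: the intrinsic matrix, the eight landmark corner coordinates, and the detector settings below are hypothetical placeholders, and a real system would need to order the detected corners to match the 3D model points.

    ```python
    import cv2
    import numpy as np

    # Hypothetical intrinsics from a prior calibration (assumed values,
    # not those used in the paper); distortion assumed negligible.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)

    # Illustrative 3D coordinates (metres) of eight "T"-landmark corners in the
    # landmark frame -- placeholders, not the paper's actual size parameters.
    object_pts = np.array([
        [0.0, 0.0, 0.0], [0.6, 0.0, 0.0], [0.6, 0.2, 0.0], [0.4, 0.2, 0.0],
        [0.4, 0.8, 0.0], [0.2, 0.8, 0.0], [0.2, 0.2, 0.0], [0.0, 0.2, 0.0],
    ], dtype=np.float32)

    def detect_corners(gray):
        """Detect eight strong corners in the first frame (sketch only;
        assumes the landmark dominates the image)."""
        pts = cv2.goodFeaturesToTrack(gray, maxCorners=8,
                                      qualityLevel=0.05, minDistance=15)
        return pts.astype(np.float32)

    def track_and_estimate(prev_gray, gray, prev_pts):
        """Track the corners with pyramidal LK optical flow, then recover the
        camera pose from the 3D-2D correspondences with solvePnP."""
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                       prev_pts, None)
        if status.sum() < 8:              # corners lost -> caller re-detects
            return None, None, next_pts
        image_pts = next_pts.reshape(-1, 2)
        ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
        if not ok:
            return None, None, next_pts
        R, _ = cv2.Rodrigues(rvec)        # rotation matrix: camera attitude
        return R, tvec, next_pts          # tvec: position relative to landmark
    ```

    Re-using the eight tracked corners as the PnP input each frame avoids repeating full contour extraction and corner detection, which is the source of the real-time gain the abstract reports.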