
Learning-Type Anchors-Driven Pose Estimation for the Autolanding Fixed-Wing UAVs
  • Dengqing Tang ,
  • Lincheng Shen ,
  • Xiaojiao Xiang ,
  • Han Zhou ,
  • Tianjiang Hu
Dengqing Tang
National University of Defense Technology

Corresponding Author: [email protected]


Abstract

We propose a learning-type anchors-driven real-time pose estimation method for autolanding fixed-wing unmanned aerial vehicles (UAVs). The proposed method enables online tracking of both position and attitude by a ground stereo vision system in Global Navigation Satellite System (GNSS)-denied environments. A pipeline of convolutional neural network (CNN)-based UAV anchor detection followed by anchor-driven UAV pose estimation is employed. To achieve robust and accurate anchor detection, we design and implement a Block-CNN architecture that reduces the impact of outliers. On the basis of the detected anchors, monocular and stereo vision-based filters are established to update the UAV position and attitude. To expand the training dataset without extra outdoor experiments, we develop a parallel system comprising outdoor and simulated subsystems with identical configurations. Simulated and outdoor experiments demonstrate a remarkable improvement in pose estimation accuracy over the conventional Perspective-n-Point (PnP) solution. The experiments also validate the feasibility of the proposed architecture and algorithm with respect to the accuracy and real-time requirements of fixed-wing UAV autolanding.
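The stereo branch of such a ground vision system recovers UAV position by triangulating detected anchors seen by two calibrated cameras. The sketch below is an illustrative linear (DLT) triangulation of one anchor, assuming known 3x4 projection matrices; the function name, the toy camera rig, and the point coordinates are all assumptions for illustration, not the paper's actual filter design.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point from a stereo pair.

    P1, P2 : 3x4 camera projection matrices (assumed calibrated).
    x1, x2 : (u, v) pixel coordinates of the same anchor in each view.
    """
    # Each view contributes two linear constraints on the homogeneous
    # point X: u * (P[2] @ X) = P[0] @ X and v * (P[2] @ X) = P[1] @ X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution of A @ X = 0 is the right singular
    # vector for the smallest singular value; dehomogenize afterwards.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Toy stereo rig: left camera at the origin, right camera offset along
# the baseline (intrinsics folded into P for brevity).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 5.0])       # assumed ground-truth anchor
x1 = P1 @ np.append(X_true, 1.0)         # project into each view
x2 = P2 @ np.append(X_true, 1.0)
x1, x2 = x1[:2] / x1[2], x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)      # recovers X_true
```

In a full pipeline this per-anchor triangulation would feed the stereo filter that updates the UAV position estimate, with the CNN supplying the pixel coordinates of each anchor.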