
Learning-Type Anchors-Driven Pose Estimation for the Autolanding Fixed-Wing UAVs

posted on 2021-11-05, 18:01, authored by Dengqing Tang, Lincheng Shen, Xiaojiao Xiang, Han Zhou, Tianjiang Hu

We propose a learning-type, anchors-driven, real-time pose estimation method for autolanding fixed-wing unmanned aerial vehicles (UAVs). The method enables a ground stereo vision system to track both position and attitude online in Global Navigation Satellite System-denied environments. The pipeline couples convolutional neural network (CNN)-based detection of UAV anchors with anchors-driven pose estimation. To make the anchor detection robust and accurate, we design and implement a Block-CNN architecture that reduces the impact of outliers. Based on the detected anchors, monocular and stereo vision-based filters update the UAV position and attitude. To expand the training dataset without additional outdoor experiments, we build a parallel system comprising an outdoor system and a simulated system with identical configurations. Simulated and outdoor experiments demonstrate a marked improvement in pose estimation accuracy over the conventional Perspective-n-Point solution, and further validate that the proposed architecture and algorithm meet the accuracy and real-time requirements of fixed-wing UAV autolanding.
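The abstract does not include code, but the stereo leg of such a pipeline can be illustrated with classical linear triangulation: each CNN-detected anchor seen in both ground cameras is lifted to a 3-D point, and a crude position estimate is taken as the centroid of those points. This is a minimal sketch under stated assumptions (normalized identity-intrinsics cameras, a hypothetical 0.5 m baseline, centroid aggregation), not the authors' filter-based method:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one anchor observed by two cameras.

    P1, P2 : 3x4 camera projection matrices (assumed calibrated).
    x1, x2 : (u, v) image coordinates of the same anchor in each view.
    Returns the anchor's 3-D position in the world frame.
    """
    # Each observation contributes two linear constraints A @ X_h = 0
    # on the homogeneous point X_h.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the null vector of A: the last right-singular vector.
    _, _, Vt = np.linalg.svd(A)
    X_h = Vt[-1]
    return X_h[:3] / X_h[3]

def estimate_uav_position(P1, P2, anchors_left, anchors_right):
    """Centroid of the triangulated anchors as a rough UAV position estimate
    (an illustrative heuristic, not the paper's filter update)."""
    pts = [triangulate_point(P1, P2, a, b)
           for a, b in zip(anchors_left, anchors_right)]
    return np.mean(pts, axis=0)

# Example: two normalized cameras separated by a 0.5 m baseline along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
# An anchor at (1, 2, 10) projects to (0.1, 0.2) and (0.05, 0.2):
print(triangulate_point(P1, P2, (0.1, 0.2), (0.05, 0.2)))  # ~ [1. 2. 10.]
```

In practice the paper's approach feeds the anchor observations into monocular and stereo filters rather than a one-shot centroid, and the Perspective-n-Point baseline it compares against solves for pose from a single view instead.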


Submitting Author's Institution

National University of Defense Technology
