Fast flights with aggressive maneuvers in cluttered GNSS-denied environments require fast, reliable, and accurate UAV state estimation. In this paper, we present an approach for onboard state estimation of a high-speed UAV using a monocular RGB camera and an IMU. Our approach fuses data from Visual-Inertial Odometry (VIO), an onboard landmark-based camera measurement system, and an IMU to produce an accurate state estimate. Using onboard measurement data, we estimate and compensate for VIO drift through a novel mathematical drift model. State-of-the-art approaches often rely on more complex hardware (e.g., stereo cameras or rangefinders) and use uncorrected drifting VIO velocities, orientation, and angular rates, leading to errors during fast maneuvers. In contrast, our method corrects all VIO states (position, orientation, linear and angular velocity), resulting in accurate state estimation even during rapid and dynamic motion. Our approach was thoroughly validated through 1600 simulations and numerous real-world experiments. Furthermore, we applied the proposed method in the A2RL Drone Racing Challenge 2025, where our team advanced to the final four out of 210 teams and earned a medal.