In this paper, we propose an interoceptive-only odometry system for ground robots that combines neural network processing with soft constraints derived from the assumption of a globally continuous ground manifold. Exteroceptive sensors such as cameras, GPS, and LiDAR can fail or degrade in scenarios with poor illumination, indoor environments, dusty areas, and straight tunnels. Improving pose estimation accuracy using only interoceptive sensors is therefore important for enhancing the reliability of the navigation system in such degraded scenarios. However, interoceptive sensors such as IMUs and wheel encoders suffer from large drift due to noisy measurements. To overcome these challenges, the proposed system trains deep neural networks to correct the measurements from the IMU and wheel encoders while accounting for their uncertainty. Moreover, because ground robots can only travel on the ground, we model the ground surface as a globally continuous manifold using a dual cubic B-spline representation and exploit it as a soft constraint to further improve estimation accuracy. A novel space-based sliding-window filtering framework is proposed to fully exploit the $C^2$ continuity of the ground-manifold soft constraint and to fuse all information from raw measurements and neural networks under a yaw-independent attitude convention. Extensive experiments demonstrate that our approach outperforms state-of-the-art learning-based interoceptive-only odometry methods.
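To make the ground-manifold soft constraint concrete, the following is a minimal illustrative sketch (not the paper's actual formulation): it evaluates a $C^2$-continuous ground height from a uniform bicubic B-spline control grid and forms a weighted residual penalizing the robot's vertical deviation from that surface. All names (`cubic_bspline_basis`, `ground_height`, `ground_residual`), the uniform-grid parameterization, and the scalar noise weight `sigma` are assumptions for illustration; the paper uses a dual cubic B-spline manifold inside a sliding-window filter.

```python
import numpy as np

def cubic_bspline_basis(u):
    """Uniform cubic B-spline basis weights for local parameter u in [0, 1).
    The four weights form a partition of unity, giving C2 continuity
    across knot boundaries."""
    u2, u3 = u * u, u * u * u
    return np.array([
        (1 - u) ** 3,
        3 * u3 - 6 * u2 + 4,
        -3 * u3 + 3 * u2 + 3 * u + 1,
        u3,
    ]) / 6.0

def ground_height(ctrl, x, y, spacing=1.0):
    """Evaluate the ground height z(x, y) from a grid of control points
    `ctrl` (shape [H, W], rows index y) via a bicubic B-spline surface.
    Assumes (x, y) lies in the grid's valid interior."""
    gx, gy = x / spacing, y / spacing
    ix, iy = int(np.floor(gx)), int(np.floor(gy))
    bu = cubic_bspline_basis(gx - ix)      # weights along x
    bv = cubic_bspline_basis(gy - iy)      # weights along y
    patch = ctrl[iy:iy + 4, ix:ix + 4]     # local 4x4 control-point patch
    return float(bv @ patch @ bu)

def ground_residual(ctrl, position, spacing=1.0, sigma=0.05):
    """Soft-constraint residual: weighted vertical distance of the robot
    position to the spline surface (a penalty term, not a hard constraint)."""
    x, y, z = position
    return (z - ground_height(ctrl, x, y, spacing)) / sigma
```

Because the basis weights sum to one, a flat control grid reproduces a flat ground exactly, and the residual vanishes whenever the estimated position lies on the surface; in a filtering framework this residual would enter the cost alongside the IMU and wheel-encoder terms rather than being enforced rigidly.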