Event cameras are bio-inspired vision sensors that asynchronously measure per-pixel brightness changes. Their high temporal resolution and asynchronicity offer great potential for estimating robot motion states. Recent works have adopted continuous-time estimation methods to exploit this inherent nature of event cameras. However, existing methods either have poor runtime performance or neglect the high temporal resolution of event cameras. To address these limitations, an Asynchronous Event-driven Visual Odometry (AsynEVO) based on sparse Gaussian Process (GP) regression is proposed to efficiently infer the motion trajectory from pure event streams. Concretely, an asynchronous frontend pipeline is designed to perform event-driven feature tracking and manage feature trajectories, and a parallel dynamic sliding-window backend is presented within the framework of sparse GP regression on $SE(3)$. Notably, a dynamic marginalization strategy is employed to ensure the consistency and sparsity of the GP regression. Experiments conducted on public datasets and in real-world scenarios demonstrate that AsynEVO achieves competitive precision and superior robustness compared to the state-of-the-art. The experiment in a repeated-texture scenario indicates that the high temporal resolution of AsynEVO plays a vital role in estimating high-speed motion. Furthermore, we show that the computational efficiency of AsynEVO significantly outperforms the incremental method.