Event cameras are bio-inspired vision sensors that asynchronously measure per-pixel brightness changes. Their high temporal resolution and asynchronicity offer great potential for estimating the motion state of a robot. Recent works have adopted continuous-time ego-motion estimation methods to exploit the inherent nature of event cameras; however, most of these methods have poor real-time performance. To address this, a lightweight Gaussian Process (GP)-based estimation framework is proposed to efficiently estimate the motion trajectory from asynchronous event-driven data associations. Concretely, an asynchronous front-end pipeline is designed to adapt event-driven feature trackers and generate feature trajectories from event streams, and a parallel dynamic sliding-window back-end is presented within the framework of sparse GP regression on SE(3). Notably, a specially designed state marginalization strategy ensures the consistency and sparsity of this GP regression. Experiments on synthetic and real-world datasets demonstrate that the proposed method achieves competitive precision and superior robustness compared to the state of the art. Furthermore, evaluations on three 60 s trajectories show that the proposed method outperforms an iSAM2-based method in computational efficiency by factors of 2.64, 4.22, and 11.70, respectively.
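A key property behind the sparse GP formulation named above is that, under a Markovian motion prior, the trajectory can be queried at any timestamp from just the two bracketing estimation states. The following is a minimal sketch of that interpolation step with a white-noise-on-acceleration (constant-velocity) prior in a one-dimensional vector space; the paper itself performs this regression on SE(3), so the matrices `Phi` and `Q` here, as well as the scalar diffusion parameter `qc`, are simplified illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def Phi(dt):
    """Transition matrix of the constant-velocity prior for state [position, velocity]."""
    return np.array([[1.0, dt],
                     [0.0, 1.0]])

def Q(dt, qc=1.0):
    """Accumulated process-noise covariance of the white-noise-on-acceleration prior."""
    return qc * np.array([[dt**3 / 3.0, dt**2 / 2.0],
                          [dt**2 / 2.0, dt]])

def interpolate(x1, t1, x2, t2, tau, qc=1.0):
    """Query the GP posterior mean at time tau in [t1, t2].

    Only the two bracketing states x1, x2 are needed, which is what keeps
    the regression sparse even though the trajectory is continuous in time.
    """
    psi = Q(tau - t1, qc) @ Phi(t2 - tau).T @ np.linalg.inv(Q(t2 - t1, qc))
    lam = Phi(tau - t1) - psi @ Phi(t2 - t1)
    return lam @ x1 + psi @ x2
```

For a trajectory that truly moves at constant velocity, this query reproduces the state exactly at any intermediate time, which is why asynchronous event-driven measurements can each be associated with their own timestamp without adding an estimation state per event.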