When legged robots perform agile movements, traditional RGB cameras often produce blurred images, posing a challenge for accurate state estimation. Event cameras, inspired by biological vision mechanisms, have emerged as a promising solution for capturing high-speed movements and coping with challenging lighting conditions, owing to their significant advantages, such as low latency, high temporal resolution, and high dynamic range. However, the integration of event cameras into agile legged robots remains largely unexplored. Notably, no event camera-based dataset has yet been specifically developed for dynamic legged robots. To bridge this gap, we introduce EAGLE (Event dataset of an AGile LEgged robot), a new dataset comprising data from an event camera, an RGB-D camera, an IMU, a LiDAR, and joint angle encoders, all mounted on a quadruped robotic platform. The dataset features more than 100 real-world sequences spanning various indoor and outdoor environments, different lighting conditions, a range of robot gaits (e.g., trotting, bounding, pronking), and acrobatic movements such as backflipping. To our knowledge, this is the first event camera dataset to include multi-sensory data collected by an agile quadruped robot.