The burgeoning demand for collaborative robotic systems to execute complex tasks collectively has intensified the research community's focus on advancing simultaneous localization and mapping (SLAM) in a cooperative context. Despite this interest, the scalability and diversity of existing datasets for collaborative trajectories remain limited, especially in scenarios with constrained perspectives, where the generalization capability of collaborative SLAM (C-SLAM) is critical to the feasibility of multi-agent missions. Addressing this gap, we introduce S3E, an expansive multimodal dataset. Captured by a fleet of unmanned ground vehicles traversing four distinct collaborative trajectory paradigms, S3E comprises 13 outdoor and 5 indoor sequences. Each sequence features meticulously synchronized and spatially calibrated data streams, including 360-degree LiDAR point clouds, high-resolution stereo imagery, high-frequency inertial measurement unit (IMU) readings, and ultra-wideband (UWB) relative observations. Our dataset not only surpasses previous efforts in scale, scene diversity, and data intricacy but also provides a thorough analysis and benchmarks for both collaborative and individual SLAM methodologies. For access to the dataset and the latest information, please visit our repository at https://pengyu-team.github.io/S3E.