This paper develops a real-time decentralized metric-semantic Simultaneous Localization and Mapping (SLAM) approach that leverages a sparse and lightweight object-based representation to enable a heterogeneous robot team to autonomously explore 3D environments featuring indoor, urban, and forested areas without relying on GPS. We use a hierarchical metric-semantic representation of the environment, including high-level sparse semantic maps of object models and low-level voxel maps. We leverage the informativeness and viewpoint invariance of the high-level semantic map to obtain an effective semantics-driven place-recognition algorithm for inter-robot loop closure detection across aerial and ground robots with different sensing modalities. A communication module is designed to track each robot's own observations and those of other robots whenever communication links are available. Such observations are then used to construct a merged map. Our framework enables real-time decentralized operations onboard robots, allowing them to opportunistically leverage communication. We integrate and deploy our proposed framework on three types of aerial and ground robots. Extensive experimental results show an average inter-robot localization error of approximately 20 cm in position and 0.2 degrees in orientation, an object mapping F1 score consistently over 0.9, and a communication packet size of merely 2-3 megabytes per kilometer trajectory with as many as 1,000 landmarks. The project website can be found at https://xurobotics.github.io/slideslam/.
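The abstract describes a semantics-driven place-recognition step that matches sparse object landmarks between robots for inter-robot loop closure. The sketch below illustrates the general idea under stated assumptions; it is not the paper's algorithm. The `ObjectLandmark` class, the inverse-distance class histogram descriptor, and the threshold-based matcher are all hypothetical stand-ins, chosen only because distance-based descriptors over object centroids are viewpoint invariant, which is the property the abstract highlights.

```python
import numpy as np

class ObjectLandmark:
    """Hypothetical sparse object model: a semantic class label plus a 3D centroid."""
    def __init__(self, label, centroid):
        self.label = label
        self.centroid = np.asarray(centroid, dtype=float)

def semantic_descriptor(anchor, landmarks, classes, k=4):
    """Viewpoint-invariant descriptor for one landmark: a histogram over semantic
    classes of its k nearest neighbors, weighted by inverse distance. Because it
    uses only inter-landmark distances, it is unchanged by rigid transforms of
    the whole map (different robot poses, different sensors)."""
    others = [l for l in landmarks if l is not anchor]
    others.sort(key=lambda l: np.linalg.norm(l.centroid - anchor.centroid))
    hist = np.zeros(len(classes))
    for l in others[:k]:
        d = np.linalg.norm(l.centroid - anchor.centroid)
        hist[classes.index(l.label)] += 1.0 / (1.0 + d)
    n = np.linalg.norm(hist)
    return hist / n if n > 0 else hist

def match_landmarks(map_a, map_b, classes, thresh=0.99):
    """Propose inter-robot landmark associations: same semantic class and high
    cosine similarity between neighborhood descriptors."""
    matches = []
    for i, a in enumerate(map_a):
        da = semantic_descriptor(a, map_a, classes)
        for j, b in enumerate(map_b):
            if b.label != a.label:
                continue
            if float(da @ semantic_descriptor(b, map_b, classes)) >= thresh:
                matches.append((i, j))
    return matches
```

In a full pipeline, such putative matches would seed a robust relative-pose estimate (e.g., via RANSAC over matched centroid pairs) that becomes an inter-robot loop-closure factor; those stages are omitted here.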