Knowing the locations of nearby moving objects is important for a mobile robot to operate safely in a dynamic environment. Dynamic object tracking performance can be improved if robots share observations of tracked objects with nearby team members in real-time. To share observations, a robot must maintain an up-to-date estimate of the transformation from its coordinate frame to the frame of each neighbor, which can be challenging because of odometry drift. We present Multiple Object Tracking with Localization Error Elimination (MOTLEE), a complete system for a multi-robot team to accurately estimate frame transformations and collaboratively track dynamic objects. To accomplish this, robots use open-set image-segmentation methods to build object maps of their environment and then use our Temporally Consistent Alignment of Frames Filter (TCAFF) to align maps and estimate coordinate frame transformations without any initial knowledge of neighboring robot poses. We show that our method for aligning frames enables a team of four robots to collaboratively track six pedestrians with accuracy similar to that of a system with ground truth localization in a challenging hardware demonstration. The code and hardware dataset are available at https://github.com/mit-acl/motlee.
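The core geometric step, recovering the rigid transformation between two robots' coordinate frames from aligned object maps, can be illustrated with a minimal sketch. The example below assumes object correspondences between the two maps are already known and uses the classical closed-form SVD solution (Arun/Kabsch) in 2D; it is an illustration of frame alignment only, not the paper's TCAFF filter, which additionally handles unknown correspondences and enforces temporal consistency across alignment hypotheses.

```python
import numpy as np

def align_frames(points_a, points_b):
    """Estimate the rigid transform (R, t) mapping frame B into frame A,
    given corresponding object centroids (N x 2 arrays) observed by two
    robots, so that points_a ~= (R @ points_b.T).T + t.

    Closed-form least-squares solution via SVD (Arun et al.). This is a
    generic sketch assuming known correspondences, not MOTLEE's actual
    alignment pipeline."""
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (points_b - cb).T @ (points_a - ca)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = ca - R @ cb
    return R, t
```

With noise-free correspondences this recovers the true relative pose exactly; in practice, robustness to outliers and drifting odometry is what motivates a temporally consistent filter such as TCAFF rather than a one-shot solve.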