Finding reliable matches is essential in multi-object tracking to ensure the accuracy and reliability of perception systems in safety-critical applications such as autonomous vehicles. Effective matching mitigates perception errors, improving object identification and tracking and thereby overall performance and safety. However, traditional metrics such as Intersection over Union (IoU) and Center Point Distance (CPD), which are effective in the 2D image plane, often fail to find critical matches in complex 3D scenes. To address this limitation, we introduce Contour Errors (CEs), an ego- or object-centric metric for identifying matches of interest in tracking scenarios from a functional perspective. By comparing bounding boxes in the ego vehicle's frame, Contour Errors provide a more functionally relevant assessment of object matches. Extensive experiments on the nuScenes dataset demonstrate that Contour Errors yield more reliable matches than the state-of-the-art 2D IoU and CPD metrics in tracking-by-detection methods. In 3D car tracking, our results show that Contour Errors reduce functional failures (FPs/FNs) in the evaluation stage by 80% at close range and 60% at far range compared to IoU.
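The abstract does not specify how Contour Errors are computed, so the sketch below is only an illustrative reading: it assumes a CE is measured between the ego-nearest points of the predicted and ground-truth box contours in the ego frame's bird's-eye view, shown alongside standard BEV IoU and center-point distance for contrast. The function names (`bev_corners`, `contour_error`), the example boxes, and the use of `shapely` are our own assumptions, not the paper's implementation.

```python
# Minimal illustrative sketch -- NOT the paper's implementation. The exact Contour
# Error (CE) definition is not given in the abstract; we *assume* a CE compares the
# ego-nearest points of the predicted and ground-truth box contours in the ego frame.
import numpy as np
from shapely.geometry import Point, Polygon


def bev_corners(x, y, length, width, yaw):
    """Bird's-eye-view corners of a box given in the ego vehicle's frame."""
    c, s = np.cos(yaw), np.sin(yaw)
    local = np.array([[ length / 2,  width / 2],
                      [ length / 2, -width / 2],
                      [-length / 2, -width / 2],
                      [-length / 2,  width / 2]])
    rot = np.array([[c, -s], [s, c]])
    return local @ rot.T + np.array([x, y])


def iou_bev(box_a, box_b):
    """Standard BEV IoU between two boxes (x, y, length, width, yaw)."""
    pa, pb = Polygon(bev_corners(*box_a)), Polygon(bev_corners(*box_b))
    inter = pa.intersection(pb).area
    return inter / (pa.area + pb.area - inter + 1e-9)


def center_point_distance(box_a, box_b):
    """Euclidean distance between box centers (the CPD baseline)."""
    return float(np.hypot(box_a[0] - box_b[0], box_a[1] - box_b[1]))


def contour_error(box_pred, box_gt):
    """Assumed ego-centric CE proxy: distance between the points on each box
    contour that lie closest to the ego origin (0, 0)."""
    ego = Point(0.0, 0.0)
    ring_p = Polygon(bev_corners(*box_pred)).exterior
    ring_g = Polygon(bev_corners(*box_gt)).exterior
    near_p = ring_p.interpolate(ring_p.project(ego))
    near_g = ring_g.interpolate(ring_g.project(ego))
    return near_p.distance(near_g)


# Far-range car: a 2.2 m lateral offset drives BEV IoU to zero even though the
# match remains interpretable in metric units.
gt = (40.0, 3.0, 4.5, 1.9, 0.0)     # x, y, length, width, yaw in the ego frame
pred = (40.0, 5.2, 4.5, 1.9, 0.0)
print(f"IoU: {iou_bev(pred, gt):.3f}  "
      f"CPD: {center_point_distance(pred, gt):.2f} m  "
      f"CE (assumed): {contour_error(pred, gt):.2f} m")
```

The example only shows how the three quantities behave at far range: for a fixed metric offset, IoU collapses to zero while distance-based measures in the ego frame stay interpretable, which is the kind of gap the paper's Contour Errors are intended to close.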