Current Semi-Supervised Object Detection (SSOD) methods enhance detector performance by leveraging large amounts of unlabeled data, assuming that labeled and unlabeled data share the same label space. However, in open-set scenarios, the unlabeled dataset contains both in-distribution (ID) classes and out-of-distribution (OOD) classes. Applying semi-supervised detectors in such settings can lead to misclassifying OOD classes as ID classes. To alleviate this issue, we propose a simple yet effective method, termed the Collaborative Feature-Logits Detector (CFL-Detector). Specifically, we introduce a feature-level clustering method using a contrastive loss to sharpen class boundaries in the feature space and highlight inter-class differences. Additionally, by optimizing a logits-level uncertainty classification loss, the model enhances its ability to distinguish between ID and OOD classes. Extensive experiments demonstrate that our method achieves state-of-the-art performance compared to existing methods.
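The feature-level clustering idea can be illustrated with a supervised contrastive loss that pulls same-class feature vectors together and pushes different classes apart. The sketch below is a minimal NumPy version for illustration only; the function name, the temperature value, and the plain-softmax formulation are assumptions, not the paper's exact loss.

```python
import numpy as np

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Illustrative contrastive clustering loss (hypothetical form):
    same-label features act as positives, all other samples as negatives."""
    # L2-normalize so pairwise dot products are cosine similarities.
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = feats @ feats.T / temperature  # pairwise similarity logits
    n = len(labels)
    total, anchors = 0.0, 0
    for i in range(n):
        # Positives: other samples sharing the anchor's label.
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue
        mask = np.arange(n) != i
        # Log of the softmax normalizer over all non-anchor pairs.
        log_z = np.log(np.exp(sim[i][mask]).sum())
        # Negative log-likelihood averaged over the anchor's positives.
        total += -np.mean([sim[i, j] - log_z for j in pos])
        anchors += 1
    return total / max(anchors, 1)

# Features clustered consistently with their labels yield a lower loss
# than the same features paired with mismatched labels.
feats = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
loss_aligned = supervised_contrastive_loss(feats, [0, 0, 1, 1])
loss_shuffled = supervised_contrastive_loss(feats, [0, 1, 0, 1])
```

Minimizing such a loss tightens per-class clusters, which is the property the abstract relies on to separate ID features from OOD ones.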