Generalized Continual Category Discovery (GCCD) tackles learning from sequentially arriving, partially labeled datasets while uncovering new categories. Traditional methods rely on feature distillation to prevent forgetting old knowledge. However, this strategy restricts the model's ability to adapt and effectively distinguish new categories. To address this, we introduce a novel technique that integrates a learnable projector with feature distillation, enhancing model adaptability without sacrificing past knowledge. The resulting distribution shift of the previously learned categories is mitigated by an auxiliary category adaptation network. We demonstrate that while each component offers modest benefits individually, their combination - dubbed CAMP (Category Adaptation Meets Projected distillation) - significantly improves the balance between learning new information and retaining old knowledge. CAMP exhibits superior performance across several GCCD and Class Incremental Learning scenarios. The code is available at https://github.com/grypesc/CAMP.
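The core idea of distilling through a learnable projector can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the feature dimensions, the linear form of the projector, and the plain MSE objective are all illustrative assumptions. The point is that the distillation loss constrains the *projected* new features to match the old ones, leaving the backbone itself free to drift and separate new categories.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D = 32, 16  # illustrative batch size and feature dimension (not from the paper)

# Toy stand-ins for features: in GCCD the "old" features would come from a
# frozen snapshot of the backbone, the "new" ones from the backbone in training.
old_feats = rng.normal(size=(N, D))
new_feats = rng.normal(size=(N, D))

# Learnable linear projector P (identity init). Distillation pulls
# P(new_feats) toward old_feats instead of constraining new_feats directly.
P = np.eye(D)

def projected_distill_loss(P):
    """Mean squared error between projected new features and old features."""
    diff = new_feats @ P - old_feats
    return np.mean(diff ** 2)

loss_before = projected_distill_loss(P)

# One gradient-descent step on the projector (closed-form gradient of the MSE).
lr = 0.1
grad = (2.0 / (N * D)) * new_feats.T @ (new_feats @ P - old_feats)
P = P - lr * grad

loss_after = projected_distill_loss(P)
```

In a full training loop the projector would be optimized jointly with the backbone, so the projector absorbs the representation shift while the distillation term still anchors the model to its previous feature space.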