Class-incremental learning (CIL) has gained traction for its ability to process a continuous influx of information by learning newly added classes while preventing catastrophic forgetting of old ones. A performance breakthrough in CIL requires effectively refining past knowledge from the base model and balancing it against new learning, yet this issue has received little attention in current research. In this work, we explore the potential of CIL from these perspectives and propose a novel balanced residual distillation framework (BRD-CIL) that pushes the performance bar of CIL to a new level. Specifically, BRD-CIL introduces a residual distillation learning strategy that dynamically expands the network structure to capture the residuals between the base and target models, effectively refining past knowledge. It further introduces a balanced pseudo-label learning strategy that generates a guidance mask to reduce the preference for old classes, ensuring balanced learning of new and old classes. We apply BRD-CIL to the challenging task of 3D point cloud semantic segmentation, where the data are unordered and unstructured. Extensive experiments show that BRD-CIL sets a new benchmark, with an outstanding balance capability in class-biased scenarios.
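The two ideas above can be illustrated with a minimal sketch. This is not the paper's implementation: the function names (`combine_logits`, `guidance_mask`) and the inverse-frequency weighting are illustrative assumptions, showing only how a residual branch could correct frozen base-model predictions and how a mask could down-weight over-represented old classes in pseudo-labels.

```python
import numpy as np

def combine_logits(base_logits, residual_logits):
    """Hypothetical residual combination: the target model's prediction is
    the frozen base model's logits plus a learned residual correction."""
    return base_logits + residual_logits

def guidance_mask(pseudo_labels, num_classes):
    """Hypothetical balanced guidance mask: assign each point a weight
    inversely proportional to its pseudo-label's class frequency, so that
    frequent old classes do not dominate the loss."""
    counts = np.bincount(pseudo_labels, minlength=num_classes).astype(float)
    freq = counts / counts.sum()
    # Inverse-frequency weight per class (zero for absent classes).
    weights = np.where(counts > 0, 1.0 / np.maximum(freq, 1e-8), 0.0)
    # Normalize so the average weight over present classes is 1.
    weights = weights / weights[counts > 0].mean()
    return weights[pseudo_labels]
```

For example, with pseudo-labels `[0, 0, 0, 1]`, points of the rarer class 1 receive a larger weight than points of the dominant class 0, counteracting the class bias that pseudo-labels inherit from the old model.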