Continual learning and machine unlearning are crucial challenges in machine learning, typically addressed separately. Continual learning focuses on adapting to new knowledge while preserving past information, whereas unlearning involves selectively forgetting specific subsets of data. In this paper, we introduce a novel framework that jointly tackles both tasks by leveraging controlled knowledge distillation. Our approach enables efficient learning with minimal forgetting and effective targeted unlearning. By incorporating a fixed memory buffer, the system supports learning new concepts while retaining prior knowledge. The distillation process is carefully managed to ensure a balance between acquiring new information and forgetting specific data as needed. Experimental results on benchmark datasets show that our method matches or exceeds the performance of existing approaches in both continual learning and machine unlearning. This unified framework is the first to address both challenges simultaneously, paving the way for adaptable models capable of dynamic learning and forgetting while maintaining strong overall performance.
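The abstract does not specify the training objective, so the following is a minimal, hypothetical sketch of what a controlled-distillation update balancing new-task learning, buffer-based retention, and targeted forgetting could look like. It assumes a PyTorch setup with a frozen teacher (the pre-update model), a fixed replay buffer, and a designated forget set; the function names, the uniform-target forgetting term, and the weights `lam_retain` and `lam_forget` are illustrative choices, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_kl(student_logits, teacher_logits, T=2.0):
    # Standard soft-label knowledge-distillation loss: KL divergence
    # between temperature-softened teacher and student distributions.
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

def combined_step(student, teacher, new_batch, buffer_batch, forget_batch,
                  lam_retain=1.0, lam_forget=1.0, T=2.0):
    """One hypothetical training step combining learning, retention, and
    targeted unlearning. lam_retain and lam_forget are illustrative
    hyperparameters, not values reported in the paper."""
    x_new, y_new = new_batch
    x_buf, _ = buffer_batch
    x_fgt, _ = forget_batch

    # (1) Learn the new task with plain cross-entropy.
    loss_new = F.cross_entropy(student(x_new), y_new)

    # (2) Retain prior knowledge: distill the frozen teacher's predictions
    #     on samples drawn from the fixed memory buffer.
    with torch.no_grad():
        t_buf = teacher(x_buf)
    loss_retain = distillation_kl(student(x_buf), t_buf, T)

    # (3) Targeted unlearning: match the student's predictions on the
    #     forget set to a uniform target, erasing the teacher's signal.
    #     (One common choice; the paper's exact objective may differ.)
    log_p_fgt = F.log_softmax(student(x_fgt), dim=1)
    uniform = torch.full_like(log_p_fgt, 1.0 / log_p_fgt.size(1))
    loss_forget = F.kl_div(log_p_fgt, uniform, reduction="batchmean")

    return loss_new + lam_retain * loss_retain + lam_forget * loss_forget
```

Run inside an ordinary optimizer loop, a step like this would pull the student toward the new task, anchor it to the teacher on buffered samples, and flatten its predictions on the forget set; how the three terms are weighted or gated over training corresponds to the "controlled" distillation the abstract describes.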