Continual learning aims to allow models to learn new tasks without forgetting what has been learned before. This work introduces Elastic Variational Continual Learning with Weight Consolidation (EVCL), a novel hybrid model that integrates the variational posterior approximation mechanism of Variational Continual Learning (VCL) with the regularization-based parameter-protection strategy of Elastic Weight Consolidation (EWC). By combining the strengths of both methods, EVCL effectively mitigates catastrophic forgetting and enables better capture of dependencies between model parameters and task-specific data. Evaluated on five discriminative tasks, EVCL consistently outperforms existing baselines in both domain-incremental and task-incremental learning scenarios for deep discriminative models.
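The described combination can be sketched as a single objective, under the assumption (not stated explicitly in the abstract) that EVCL augments the VCL variational objective for task $t$ with the EWC quadratic penalty; here $\lambda$ and the diagonal Fisher terms $F_i$ are assumed notation:

```latex
% Hypothetical sketch of the EVCL objective: VCL ELBO plus EWC penalty.
% q_t is the variational posterior after task t, D_t the task-t data,
% theta*_{t-1} the parameters learned on the previous task.
\mathcal{L}_{\mathrm{EVCL}}(q_t) =
  \underbrace{\mathbb{E}_{q_t(\theta)}\!\left[\log p(\mathcal{D}_t \mid \theta)\right]
    - \mathrm{KL}\!\left(q_t(\theta) \,\|\, q_{t-1}(\theta)\right)}_{\text{VCL ELBO}}
  \;-\;
  \underbrace{\frac{\lambda}{2} \sum_i F_i \left(\theta_i - \theta^{*}_{t-1,i}\right)^2}_{\text{EWC penalty}}
```

The KL term anchors the new posterior to the previous one (VCL), while the Fisher-weighted quadratic term additionally protects parameters that were important for earlier tasks (EWC).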