Concept drift, temporal dependence, and catastrophic forgetting represent major challenges when learning from data streams. While Streaming Machine Learning and Continual Learning (CL) address these issues separately, recent efforts in Streaming Continual Learning (SCL) aim to unify them. In this work, we introduce MAGIC Net, a novel SCL approach that integrates CL-inspired architectural strategies with recurrent neural networks to tame temporal dependence. MAGIC Net continuously learns, looks back at past knowledge by applying learnable masks over frozen weights, and expands its architecture when necessary. It performs all operations online, ensuring inference availability at all times. Experiments on synthetic and real-world streams show that it improves adaptation to new concepts, limits memory usage, and mitigates forgetting.
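The core mechanism described above, learnable masks applied over frozen weights, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the class name, sigmoid gating, and dimensions are illustrative assumptions. The idea is that the frozen weights `W` preserve past knowledge, while only the mask parameters adapt to a new concept.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch of "learnable masks over frozen weights":
# W is frozen after learning a concept; only mask_logits would be
# trained when adapting to a new concept, so past knowledge in W
# is never overwritten.
class MaskedLinear:
    def __init__(self, in_dim, out_dim):
        self.W = rng.standard_normal((out_dim, in_dim))  # frozen weights
        self.mask_logits = np.zeros((out_dim, in_dim))   # learnable mask params

    def forward(self, x):
        mask = 1.0 / (1.0 + np.exp(-self.mask_logits))   # sigmoid gate in (0, 1)
        return (self.W * mask) @ x                       # gated, frozen weights

layer = MaskedLinear(4, 3)
y = layer.forward(np.ones(4))
```

With all mask logits at zero, every gate equals 0.5, so the layer initially passes a uniformly down-weighted copy of the frozen transform; training the logits then selects which frozen weights are relevant for the current concept.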