This book chapter delves into the dynamics of continual learning, which is the process of incrementally learning from a non-stationary stream of data. Although continual learning is a natural skill for the human brain, it is very challenging for artificial neural networks. An important reason is that, when learning something new, these networks tend to quickly and drastically forget what they had learned before, a phenomenon known as catastrophic forgetting. Especially in the last decade, continual learning has become an extensively studied topic in deep learning. This book chapter reviews the insights that this field has generated.