Inspired by the Lottery Ticket Hypothesis (LTH), which posits that larger dense networks contain efficient subnetworks, we study a Winning Subnetwork (WSN) that attains high task performance under appropriate sparsity across various continual learning tasks. WSN reuses pre-existing weights of a dense network to learn efficiently in Task Incremental Learning (TIL) and Task-agnostic Incremental Learning (TaIL) scenarios. For Few-Shot Class Incremental Learning (FSCIL), we design a WSN variant, the Soft subnetwork (SoftNet), to prevent overfitting when data samples are scarce. We further consider sparse reuse of WSN weights for Video Incremental Learning (VIL), introducing a Fourier Subneural Operator (FSO) within WSN that encodes videos compactly and identifies reusable subnetworks across varying bandwidths. We integrate FSO into different architectural frameworks for continual learning, covering VIL, TIL, and FSCIL. Our comprehensive experiments demonstrate FSO's effectiveness, significantly improving task performance at various convolutional representational levels: FSO enhances higher-layer performance in TIL and FSCIL and lower-layer performance in VIL.
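To make the subnetwork-selection idea concrete, the following is a minimal sketch of extracting a sparse "winning ticket" mask from a dense weight tensor. It assumes magnitude-based importance scoring and top-k thresholding; the function name and the use of NumPy are illustrative conventions, not the paper's actual API or training procedure.

```python
import numpy as np

def winning_subnetwork_mask(scores: np.ndarray, sparsity: float) -> np.ndarray:
    """Binary mask keeping the top-(1 - sparsity) fraction of weights by score.

    Hypothetical helper for illustration; in practice the scores would be
    learned jointly with the weights rather than fixed to magnitudes.
    """
    k = int(round((1.0 - sparsity) * scores.size))
    if k <= 0:
        return np.zeros(scores.shape, dtype=bool)
    threshold = np.partition(scores.ravel(), -k)[-k]
    return scores >= threshold

# Example: mask a dense layer's weights at 50% sparsity.
rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4))
scores = np.abs(weights)            # importance proxy: weight magnitude
mask = winning_subnetwork_mask(scores, sparsity=0.5)
subnet_weights = weights * mask     # sparse subnetwork used at inference
```

Only the masked weights participate in the forward pass for a given task, which is what allows distinct tasks to reuse disjoint or overlapping subsets of one frozen dense network.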