Grokking is the phenomenon in which neural networks (NNs) first fit the training data and only later, during continued training, generalize to the test data. In this paper, we provide an empirical frequency perspective on the emergence of this phenomenon in NNs. The core insight is that the networks initially learn the less salient frequency components present in the test data. We observe this behavior across both synthetic and real datasets, characterizing grokking through the lens of frequency dynamics during training. This empirical frequency-based analysis offers a new viewpoint for understanding the grokking phenomenon and its underlying mechanisms.
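To make the frequency perspective concrete, the following is a minimal illustrative sketch (not the paper's actual code): it uses the discrete Fourier transform of the residual between a model's output and the target to identify which frequency component has not yet been learned. The target function, frequencies, and "early-training" model output are all hypothetical choices for illustration.

```python
import numpy as np

def spectrum(values):
    """Amplitude spectrum of a real signal sampled on a uniform grid."""
    return np.abs(np.fft.rfft(values)) / len(values)

n = 256
x = np.arange(n) / n
# Hypothetical target with a salient low frequency (k=1) and a
# less salient high frequency (k=10) of smaller amplitude.
target = np.sin(2 * np.pi * x) + 0.2 * np.sin(2 * np.pi * 10 * x)

# Hypothetical "early training" output: only the salient component captured.
early = np.sin(2 * np.pi * x)

# The residual spectrum peaks at the frequency not yet learned.
residual_amp = spectrum(target - early)
print(int(np.argmax(residual_amp)))  # -> 10
```

Tracking such residual spectra over training steps is one way to visualize which frequencies a network acquires first, matching the kind of frequency dynamics the abstract describes.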