In this paper, we revisited the role of data augmentation in contrastive learning for sequential recommendation, revealing its inherent bias against low-frequency items and sparse user behaviors. To address this limitation, we proposed FACL, a frequency-aware adaptive contrastive learning framework that introduces micro-level adaptive perturbation to protect the integrity of rare items, together with macro-level reweighting to amplify the influence of sparse, rare-interaction sequences during training. Comprehensive experiments on five public benchmark datasets demonstrated that FACL consistently outperforms state-of-the-art data-augmentation and model-augmentation methods, achieving up to a 3.8% improvement in recommendation accuracy. Moreover, fine-grained analyses confirmed that FACL significantly alleviates the performance drop on low-frequency items and users, highlighting its robust intent-preserving ability and its strong applicability to real-world, long-tail recommendation scenarios.