Contemporary sequential recommendation methods are growing more complex, shifting from a classification paradigm to a diffusion-guided generative one. However, the quality of the guidance signal derived from user information is often compromised by missing data in observed sequences, leading to suboptimal generation. Existing methods address this by removing locally similar items, but they overlook ``critical turning points'' in user interest, which are crucial for accurately predicting subsequent user intent. To address this, we propose a novel Counterfactual Attention Regulation Diffusion model (CARD), which amplifies the signal from key interest-turning-point items while identifying and suppressing noise within the user sequence. CARD consists of (1) a Dual-side Thompson Sampling method that identifies sequences undergoing significant interest shifts, and (2) a counterfactual attention mechanism that quantifies the importance of each item in those sequences. In this manner, CARD supplies the diffusion model with a high-quality guidance signal composed of dynamically re-weighted interaction vectors, enabling effective generation. Experiments on real-world datasets show that CARD is effective without incurring high computational cost. Our code is available at https://github.com/yanqilong3321/CARD.
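The abstract does not give implementation details, but the counterfactual re-weighting idea can be illustrated with a minimal, self-contained sketch: an item's importance is measured by how much the model's output changes when that item is counterfactually removed from the sequence, and the resulting weights produce the re-weighted guidance vector. The toy `score` encoder below is a hypothetical stand-in, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(seq_emb: np.ndarray) -> float:
    # Hypothetical stand-in for a sequence encoder: mean-pool the item
    # embeddings and project onto a fixed "intent" direction.
    intent = np.ones(seq_emb.shape[-1]) / seq_emb.shape[-1]
    return float(seq_emb.mean(axis=0) @ intent)

def counterfactual_weights(seq_emb: np.ndarray) -> np.ndarray:
    # Weight of item i = |score(full sequence) - score(sequence without i)|,
    # i.e., how much the prediction shifts when item i is masked out.
    base = score(seq_emb)
    deltas = np.array([
        abs(base - score(np.delete(seq_emb, i, axis=0)))
        for i in range(len(seq_emb))
    ])
    total = deltas.sum()
    if total == 0.0:  # degenerate case: fall back to uniform weights
        return np.full(len(seq_emb), 1.0 / len(seq_emb))
    return deltas / total

# A toy interaction sequence: 6 items with 8-dimensional embeddings.
seq = rng.normal(size=(6, 8))
w = counterfactual_weights(seq)
# Re-weighted guidance vector passed to the diffusion model.
guidance = (w[:, None] * seq).sum(axis=0)
```

Items whose removal barely changes the score (local redundancy) receive small weights, while items at interest turning points, whose removal shifts the prediction, are amplified.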