Sequential recommendation systems leveraging transformer architectures have demonstrated exceptional capabilities in capturing user behavior patterns. At the core of these systems lies the critical challenge of constructing effective item representations. Traditional approaches employ feature fusion through simple concatenation or basic neural architectures to create uniform representation sequences. However, these conventional methods fail to address the intrinsic diversity of item attributes, thereby constraining the transformer's capacity to discern fine-grained patterns and hindering model extensibility. Although recent research has begun incorporating user-related heterogeneous features into item sequences, the equally crucial item-side heterogeneous features continue to be neglected. To bridge this methodological gap, we present HeterRec, an innovative framework featuring two novel components: the Heterogeneous Token Flattening Layer (HTFL) and the Hierarchical Causal Transformer (HCT). HTFL pioneers a sophisticated tokenization mechanism that decomposes items into multi-dimensional token sets and structures them into heterogeneous sequences, enabling scalable performance gains through model expansion. The HCT architecture further enhances pattern discovery through token-level and item-level attention mechanisms. Furthermore, we develop a Listwise Multi-step Prediction (LMP) objective function to optimize the learning process. Rigorous validation, including deployment on a real-world industrial platform, confirms HeterRec's state-of-the-art performance in both effectiveness and efficiency.
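To make the two mechanisms named above concrete, the following is a minimal PyTorch sketch, not the paper's actual HTFL/HCT implementation: the class name, attribute fields, mean pooling, and all shapes are illustrative assumptions. Each item is decomposed into K attribute tokens, the per-item token sets are flattened into one heterogeneous sequence, and attention runs first within items (token level) and then causally across items (item level).

```python
import torch
import torch.nn as nn


class HeterRecSketch(nn.Module):
    """Hypothetical sketch of heterogeneous token flattening plus
    hierarchical (token-level, then item-level causal) attention."""

    def __init__(self, vocab_sizes, d_model=64, n_heads=4):
        super().__init__()
        # One embedding table per heterogeneous attribute (assumed setup).
        self.embeds = nn.ModuleList(nn.Embedding(v, d_model) for v in vocab_sizes)
        self.token_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.item_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, tokens):
        # tokens: (batch, seq_len, K) attribute ids for each of seq_len items.
        B, S, K = tokens.shape
        # Flatten each item into a set of K attribute-token embeddings: (B, S, K, d).
        tok = torch.stack(
            [emb(tokens[..., k]) for k, emb in enumerate(self.embeds)], dim=2
        )
        # Token-level attention within each item: fold items into the batch.
        t = tok.reshape(B * S, K, -1)
        t, _ = self.token_attn(t, t, t)
        # Pool each item's tokens into a single item representation.
        items = t.mean(dim=1).reshape(B, S, -1)
        # Item-level causal attention across the user's behavior sequence.
        mask = torch.triu(torch.ones(S, S, dtype=torch.bool), diagonal=1)
        out, _ = self.item_attn(items, items, items, attn_mask=mask)
        return out  # (B, S, d) representations for next-item prediction


# Usage: 3 attribute fields (e.g., item id, category, brand), 5-item history.
model = HeterRecSketch(vocab_sizes=[1000, 50, 200])
toks = torch.randint(0, 50, (2, 5, 3))
print(model(toks).shape)  # torch.Size([2, 5, 64])
```

The hierarchical split is the point of the sketch: attending over tokens within an item first keeps the quadratic cost of attention local to each small token set, while the item-level pass preserves the standard causal structure used for next-item prediction.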