Graph Neural Networks (GNNs) have demonstrated effectiveness in collaborative filtering (CF) tasks due to their ability to extract powerful structural features. However, combining the graph features extracted from user-item interactions with the auxiliary features extracted from user genres and item properties remains a challenge. Currently available fusion methods face two major issues: 1) simple methods such as concatenation and summation are generic but inaccurate in capturing feature relationships; 2) task-specific methods such as attention mechanisms and meta-paths may not be suitable for general feature fusion. To address these challenges, we present GraphTransfer, a simple but universal feature fusion framework for GNN-based collaborative filtering. Our method first extracts graph features from the user-item interaction graph and auxiliary features from users and items using GCNs. The proposed cross fusion module then effectively bridges the semantic gap between the interaction scores computed from different feature types. Theoretical analysis and experiments on public datasets show that GraphTransfer outperforms other feature fusion methods in CF tasks. Additionally, we demonstrate the universality of our framework via empirical studies in three other scenarios, showing that GraphTransfer leads to significant improvements in the performance of CF algorithms.
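The abstract does not specify the internals of the cross fusion module, but the core idea it contrasts with concatenation and summation can be sketched as scoring every pairing of graph and auxiliary embeddings, so that cross-type interactions (e.g., a user's graph embedding against an item's auxiliary embedding) contribute to the prediction. The function below is a minimal illustrative sketch under that assumption, not the paper's actual implementation; all names are hypothetical.

```python
import numpy as np

def cross_fusion_score(u_graph, i_graph, u_aux, i_aux):
    """Hypothetical cross fusion of interaction scores.

    Unlike concatenation or summation, which merge features before a
    single dot product, this scores all four pairings of user/item
    graph and auxiliary embeddings, capturing cross-type relationships.
    """
    scores = [
        u_graph @ i_graph,  # graph-graph interaction
        u_aux @ i_aux,      # auxiliary-auxiliary interaction
        u_graph @ i_aux,    # cross: graph user vs. auxiliary item
        u_aux @ i_graph,    # cross: auxiliary user vs. graph item
    ]
    return float(np.sum(scores))

# Example with toy 2-d embeddings:
u_g, i_g = np.array([1.0, 0.0]), np.array([1.0, 0.0])
u_a, i_a = np.array([0.0, 1.0]), np.array([0.0, 1.0])
score = cross_fusion_score(u_g, i_g, u_a, i_a)  # 1 + 1 + 0 + 0 = 2.0
```

In practice the four partial scores would typically be weighted by learned parameters rather than summed uniformly; the uniform sum here only keeps the sketch self-contained.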