Dynamic interactions between entities are prevalent in domains such as social platforms, financial systems, healthcare, and e-commerce. These interactions can be effectively represented as time-evolving graphs, where predicting future connections is a key task in applications such as recommendation systems. Temporal Graph Neural Networks (TGNNs) have achieved strong results on such predictive tasks but typically require extensive training data, which is often limited in real-world scenarios. One approach to mitigating data scarcity is leveraging pre-trained models from related datasets. However, direct knowledge transfer between TGNNs is challenging because they rely on node-specific memory structures, which are inherently difficult to adapt across datasets. To address this, we introduce a novel transfer approach that disentangles node representations from their associated features through a structured bipartite encoding mechanism. This decoupling enables more effective transfer of memory components and other learned inductive patterns from one dataset to another. Empirical evaluations on real-world benchmarks demonstrate that our method significantly enhances TGNN performance in low-data regimes, outperforming non-transfer baselines by up to 56\% and surpassing existing transfer strategies by 36\%.
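To make the decoupling concrete, the sketch below illustrates the general idea in PyTorch: dataset-specific, per-node memory is held separately from a shared feature encoder and link scorer, so the shared parts can be copied to a model for a new graph while the memory is re-learned. This is a minimal, hypothetical illustration; the class `BipartiteMemoryEncoder` and all names and dimensions are our own assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class BipartiteMemoryEncoder(nn.Module):
    """Hypothetical sketch of memory/feature decoupling for transfer.

    Node-side (dataset-specific): per-node memory slots.
    Feature-side (transferable): an encoder over interaction features
    that never sees node identities, plus a shared link scorer.
    """

    def __init__(self, num_nodes, mem_dim, feat_dim, hid_dim):
        super().__init__()
        # Node-side of the bipartite encoding: tied to this dataset's nodes.
        self.memory = nn.Parameter(torch.zeros(num_nodes, mem_dim))
        # Feature-side: identity-free, so its weights can be transferred.
        self.feat_encoder = nn.Sequential(
            nn.Linear(feat_dim, hid_dim),
            nn.ReLU(),
            nn.Linear(hid_dim, mem_dim),
        )
        # Scores a candidate link from the two decoupled representations.
        self.link_scorer = nn.Bilinear(mem_dim, mem_dim, 1)

    def forward(self, src, dst, edge_feat):
        # Encode interaction features independently of node identity.
        h = self.feat_encoder(edge_feat)
        # Combine each endpoint's memory with the identity-free code.
        z_src = self.memory[src] + h
        z_dst = self.memory[dst] + h
        return self.link_scorer(z_src, z_dst).squeeze(-1)

# Transfer under these assumptions: copy the feature-side weights to a
# model for a new graph; the node-side memory is learned from scratch.
src_model = BipartiteMemoryEncoder(num_nodes=1000, mem_dim=64, feat_dim=32, hid_dim=128)
tgt_model = BipartiteMemoryEncoder(num_nodes=500, mem_dim=64, feat_dim=32, hid_dim=128)
tgt_model.feat_encoder.load_state_dict(src_model.feat_encoder.state_dict())
tgt_model.link_scorer.load_state_dict(src_model.link_scorer.state_dict())
```

The design point the sketch captures is that only components with no per-node state cross the dataset boundary; anything indexed by node identity stays behind and is re-initialized on the target graph.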