Temporal Knowledge Graph (TKG) reasoning seeks to predict future missing facts from historical evidence. While diffusion models (DMs) have recently gained attention for their ability to capture complex predictive distributions, two gaps remain: (i) the generative path is conditioned only on positive evidence, overlooking informative negative context, and (ii) training objectives are dominated by cross-entropy ranking, which improves candidate ordering but provides little supervision over the calibration of the denoised embedding. To bridge these gaps, we introduce the Negative-Aware Diffusion model for TKG Extrapolation (NADEx). Specifically, NADEx encodes subject-centric histories of entities, relations, and temporal intervals into sequential embeddings, perturbs the query-object embedding in the forward diffusion process, and reconstructs it in the reverse process with a Transformer denoiser conditioned on the temporal-relational context. We further introduce a cosine-alignment regularizer derived from batch-wise negative prototypes, which tightens the decision boundary against implausible candidates. Comprehensive experiments on four public TKG benchmarks demonstrate that NADEx delivers state-of-the-art performance.
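The abstract does not specify the exact form of the negative-prototype regularizer, but one plausible instantiation can be sketched as follows: for each query in a batch, average the other queries' gold-object embeddings into a negative prototype, then reward cosine alignment of the denoised embedding with its own gold object while penalizing alignment with the prototype. The function name, signature, and loss form below are illustrative assumptions, not the paper's actual definition.

```python
import torch
import torch.nn.functional as F

def negative_prototype_regularizer(x0_hat: torch.Tensor,
                                   gold: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch of a batch-wise negative-prototype cosine loss.

    x0_hat: (B, d) denoised object embeddings from the reverse process.
    gold:   (B, d) ground-truth object embeddings for each query (B >= 2).
    """
    B = gold.size(0)
    # For row i, the negative prototype is the mean of the other rows'
    # gold objects (mask zeroes out each row's own positive).
    mask = (~torch.eye(B, dtype=torch.bool, device=gold.device)).float()
    protos = (mask @ gold) / (B - 1)
    pos_align = F.cosine_similarity(x0_hat, gold, dim=-1)    # push toward 1
    neg_align = F.cosine_similarity(x0_hat, protos, dim=-1)  # push toward <= 0
    # Penalize misalignment with the positive and any positive similarity
    # to the negative prototype; both terms are non-negative.
    return (1.0 - pos_align).mean() + neg_align.clamp(min=0).mean()
```

In practice this term would be weighted and added to the diffusion denoising objective; the clamp on the negative term is one design choice among several (a margin or softplus would also work).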