Temporal knowledge graph (TKG) forecasting requires predicting future facts by jointly modeling structural dependencies within each snapshot and temporal evolution across snapshots. However, most existing methods are stateless: they recompute entity representations at each timestamp from a limited query window, leading to episodic amnesia and rapid decay of long-term dependencies. To address this limitation, we propose Entity State Tuning (EST), an encoder-agnostic framework that endows TKG forecasters with persistent and continuously evolving entity states. EST maintains a global state buffer and progressively aligns structural evidence with sequential signals via a closed-loop design. Specifically, a topology-aware state perceiver first injects entity-state priors into structural encoding. Then, a unified temporal context module aggregates the state-enhanced events with a pluggable sequence backbone. Subsequently, a dual-track evolution mechanism writes the updated context back to the global entity state memory, balancing plasticity against stability. Experiments on multiple benchmarks show that EST consistently improves diverse backbones and achieves state-of-the-art performance, highlighting the importance of state persistence for long-horizon TKG forecasting. The code is published at https://github.com/yuanwuyuan9/Evolving-Beyond-Snapshots
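The closed loop described above (state-prior injection, temporal aggregation, and dual-track write-back to a global buffer) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all class and method names, the mean-pooling stand-in for the pluggable sequence backbone, and the EMA/residual split used for the dual-track update are assumptions introduced here.

```python
import numpy as np


class EntityStateTuning:
    """Illustrative sketch of the EST closed loop (all details assumed)."""

    def __init__(self, num_entities, dim, alpha=0.9, beta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Global state buffer: one persistent vector per entity.
        self.state = 0.01 * rng.normal(size=(num_entities, dim))
        self.alpha = alpha  # slow-track retention (stability)
        self.beta = beta    # fast-track gain (plasticity)

    def perceive(self, snapshot_feats):
        # Topology-aware state perceiver (assumed form): inject the
        # persistent entity states as priors into structural encoding.
        return snapshot_feats + self.state

    def aggregate(self, enhanced, history):
        # Unified temporal context (assumed form): pool the state-enhanced
        # snapshot with past contexts; a stand-in for any sequence backbone.
        return np.mean(history + [enhanced], axis=0)

    def evolve(self, context):
        # Dual-track evolution (assumed form): a slow EMA track preserves
        # stability, a fast residual track adds plasticity; the result is
        # written back to the global state buffer.
        slow = self.alpha * self.state + (1.0 - self.alpha) * context
        fast = context - self.state
        self.state = slow + self.beta * fast
        return self.state


# One forecasting step over a snapshot (toy shapes).
est = EntityStateTuning(num_entities=4, dim=8)
feats = np.ones((4, 8))            # structural features of one snapshot
ctx = est.aggregate(est.perceive(feats), history=[])
new_state = est.evolve(ctx)        # states persist into the next timestamp
```

Because the buffer outlives each query window, repeated calls across timestamps accumulate long-horizon evidence instead of recomputing entity representations from scratch, which is the stateless failure mode the abstract highlights.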