In recent years, the expressive power of various neural architectures -- including graph neural networks (GNNs), transformers, and recurrent neural networks -- has been characterised using tools from logic and formal language theory. As the capabilities of basic architectures become well understood, attention is increasingly turning to models that combine multiple architectural paradigms. Among these, temporal extensions of GNNs, which integrate both spatial (graph-structure) and temporal (evolution-over-time) dimensions, are particularly important and challenging to analyse. In this paper, we initiate the study of the logical characterisation of temporal GNNs by connecting them to two-dimensional product logics. We show that the expressive power of temporal GNNs depends on how the graph and temporal components are combined. In particular, temporal GNNs that apply static GNNs recursively over time can capture all properties definable in the product of (past) propositional temporal logic PTL and the modal logic K. In contrast, architectures such as graph-and-time TGNNs and global TGNNs can express only restricted fragments of this logic, in which the interaction between temporal and spatial operators is syntactically constrained. These are the first results on the logical expressiveness of temporal GNNs.
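For intuition, the following is a minimal sketch of the recursive scheme mentioned above: a single static GNN is applied at every time step, with node states threaded through time by a recurrent update. All class and parameter names here (`StaticGNNLayer`, `RecursiveTemporalGNN`, the GRU-based temporal cell) are illustrative assumptions, not definitions taken from the paper.

```python
import torch
import torch.nn as nn

class StaticGNNLayer(nn.Module):
    """One round of message passing: each node combines its own state
    with an aggregate of its neighbours' states."""
    def __init__(self, dim: int):
        super().__init__()
        self.self_lin = nn.Linear(dim, dim)
        self.neigh_lin = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (n, dim) node states; adj: (n, n) adjacency matrix
        agg = adj @ x  # sum of neighbour states
        return torch.relu(self.self_lin(x) + self.neigh_lin(agg))

class RecursiveTemporalGNN(nn.Module):
    """Applies the same static GNN at each time step and carries node
    states forward in time -- the 'static GNN applied recursively over
    time' architecture discussed in the abstract."""
    def __init__(self, dim: int):
        super().__init__()
        self.gnn = StaticGNNLayer(dim)
        self.recur = nn.GRUCell(dim, dim)  # per-node temporal update

    def forward(self, features: list[torch.Tensor],
                adjs: list[torch.Tensor]) -> torch.Tensor:
        # features[t]: (n, dim) node features at time t
        # adjs[t]:     (n, n) graph snapshot at time t
        h = torch.zeros_like(features[0])
        for x_t, adj_t in zip(features, adjs):
            spatial = self.gnn(x_t + h, adj_t)  # spatial step at time t
            h = self.recur(spatial, h)          # recursive temporal step
        return h  # final node states after the last time step
```

The key structural point, under these assumptions, is that the spatial (modal) step and the temporal step alternate freely at every time point, which is what lets this scheme realise unrestricted interaction between the two dimensions, in contrast to architectures whose operator interaction is syntactically constrained.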