In pioneering work from 2019, Barceló and coauthors identified logics that precisely match the expressive power of constant iteration-depth graph neural networks (GNNs) relative to properties definable in first-order logic. In this article, we give exact logical characterizations of recurrent GNNs in two scenarios: (1) in the setting with floating-point numbers and (2) in the setting with reals. For floats, the formalism matching recurrent GNNs is a rule-based modal logic with counting, while for reals we use a suitable infinitary modal logic, also with counting. These results give exact matches between logics and GNNs in the recurrent setting without relativising to a background logic in either case, but using some natural assumptions about floating-point arithmetic. Applying our characterizations, we also prove that, relative to graph properties definable in monadic second-order logic (MSO), our infinitary and rule-based logics are equally expressive. This implies that recurrent GNNs with reals and floats have the same expressive power over MSO-definable properties and shows that, for such properties, recurrent GNNs with reals are also characterized by a (finitary!) rule-based modal logic. In the general case, in contrast, the expressive power with floats is weaker than with reals. In addition to logic-oriented results, we also characterize recurrent GNNs, with both reals and floats, via distributed automata, drawing links to distributed computing models.
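For readers less familiar with the model, the following is a minimal sketch of one iteration of a GNN in standard aggregate-combine form; the particular aggregation and combination functions are illustrative assumptions rather than the exact model used in the paper:
\[
  x_v^{(t+1)} \;=\; \mathrm{COM}\Bigl( x_v^{(t)},\; \mathrm{AGG}\bigl( \{\!\!\{\, x_u^{(t)} : u \in N(v) \,\}\!\!\} \bigr) \Bigr),
\]
where $N(v)$ is the neighbourhood of node $v$, $\mathrm{AGG}$ aggregates the multiset of neighbour features (e.g., by summation), and $\mathrm{COM}$ combines the aggregate with the node's current state. In the constant iteration-depth setting this update is applied a fixed number of times, whereas in the recurrent setting it is iterated without a fixed bound, with the arithmetic carried out either over the reals or over floating-point numbers.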