Generalization and approximation capabilities of message passing graph neural networks (MPNNs) are often studied by defining a compact metric on a space of input graphs under which MPNNs are Hölder continuous. Such analyses fall into two varieties: 1) when the metric space includes graphs of unbounded size, the theory is appropriate only for dense graphs, and 2) when studying sparse graphs, the metric space only includes graphs of uniformly bounded size. In this work, we present a unified approach, defining a compact metric on the space of graphs of all sizes, both sparse and dense, under which MPNNs are Hölder continuous. This leads to more powerful universal approximation theorems and generalization bounds than those of previous works. The theory is based on, and extends, a recent approach to graph limit theory called graphop analysis.
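To make the central continuity claim concrete, the display below sketches what Hölder continuity of an MPNN with respect to such a metric amounts to; the symbols $\mathcal{G}$, $d$, $\Phi$, $L$, and $\alpha$ are illustrative notation introduced here, not the paper's own definitions.

% Illustrative sketch (assumed notation): an MPNN \Phi mapping graphs to a normed
% output space is Hölder continuous on a graph space \mathcal{G} with metric d
% if there exist a constant L > 0 and an exponent \alpha \in (0, 1] such that
\[
  \|\Phi(G) - \Phi(G')\| \;\le\; L \, d(G, G')^{\alpha}
  \qquad \text{for all } G, G' \in \mathcal{G}.
\]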