Graph data often exhibits complex geometric heterogeneity: structures with different local curvature, such as tree-like hierarchies and dense communities, coexist within a single network. Existing geometric GNNs, which embed graphs into a single fixed-curvature manifold or a discrete product of such manifolds, struggle to capture this diversity. We introduce the Adaptive Riemannian Graph Neural Network (ARGNN), a novel framework that learns a continuous, anisotropic Riemannian metric tensor field over the graph. This allows each node to determine its optimal local geometry, enabling the model to adapt fluidly to the graph's structural landscape. Our core innovation is an efficient parameterization of the node-wise metric tensor that specializes to a learnable diagonal form, capturing directional geometric information while remaining computationally tractable. To ensure geometric regularity and stable training, we integrate a Ricci-flow-inspired regularization that smooths the learned manifold. Theoretically, we establish a rigorous convergence guarantee for ARGNN's geometric evolution and show that it continuously generalizes, and thereby unifies, prior fixed- and mixed-curvature GNNs. Empirically, our method demonstrates superior performance on both homophilic and heterophilic benchmark datasets, adaptively capturing diverse structures. Moreover, the learned geometries offer interpretable insights into the underlying graph structure and empirically corroborate our theoretical analysis.
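The two ingredients named above, a learnable diagonal metric per node and an edge-wise smoothness penalty in the spirit of Ricci flow, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names (`node_metrics`, `riemannian_inner`, `ricci_smoothness`) and the exact form of the regularizer are assumptions for exposition.

```python
import numpy as np

# Hypothetical sketch of a node-wise diagonal metric field.
# Each node i carries log-diagonal parameters phi_i, giving a
# positive-definite metric G_i = diag(exp(phi_i)); phi_i = 0
# recovers the Euclidean metric at that node.

def node_metrics(phi):
    """phi: (N, d) log-diagonal parameters -> (N, d) positive diagonals."""
    return np.exp(phi)

def riemannian_inner(u, v, g):
    """Anisotropic inner product <u, v>_G = sum_k g_k * u_k * v_k,
    weighting each coordinate direction by the node's local metric."""
    return np.sum(g * u * v)

def ricci_smoothness(phi, edges):
    """Ricci-flow-inspired regularizer (illustrative form): penalize
    metric differences across edges so the learned geometry varies
    smoothly over the graph."""
    return sum(np.sum((phi[i] - phi[j]) ** 2) for i, j in edges)

# Toy example: 4 nodes in a 3-dimensional embedding space.
rng = np.random.default_rng(0)
N, d = 4, 3
phi = rng.normal(size=(N, d))
g = node_metrics(phi)

u = np.array([1.0, 0.0, 2.0])
v = np.array([0.5, 1.0, 1.0])
sim = riemannian_inner(u, v, g[0])          # direction-dependent similarity
reg = ricci_smoothness(phi, [(0, 1), (1, 2)])  # smaller = smoother geometry
```

In training, `reg` would be added to the task loss so that neighboring nodes learn similar metrics unless the data demands otherwise; the diagonal restriction keeps the cost linear in the embedding dimension rather than quadratic for a full metric tensor.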