While graph convolutional networks show great practical promise, the theoretical understanding of their generalization properties as a function of the number of samples is still in its infancy compared to the more broadly studied case of supervised fully connected neural networks. In this article, we predict the performance of a single-layer graph convolutional network (GCN) trained on data produced by attributed stochastic block models (SBMs) in the high-dimensional limit. Previously, only ridge regression on the contextual SBM (CSBM) had been considered (Shi et al., 2022); we generalize the analysis to arbitrary convex losses and regularizations for the CSBM, and extend it to another data model, the neural-prior SBM. We also study the high signal-to-noise ratio limit, detail the convergence rates of the GCN, and show that, while consistent, it does not reach the Bayes-optimal rate in any of the considered cases.
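To make the setting concrete, the following is a minimal numpy sketch of the pipeline the abstract describes, written as our own illustration rather than code from the paper: it samples a CSBM instance under one standard parameterization (community labels v ∈ {±1}, edge probabilities c_in/n and c_out/n set by an average degree d and a graph signal strength lam, and node features carrying a rank-one spike of strength mu along the labels), then fits the linear readout of a single-layer GCN under logistic loss with ridge regularization by gradient descent. All function names and parameter values (sample_csbm, train_gcn, d, lam, mu, reg) are assumptions made for illustration, and the final line reports overlap on the training instance only, not the asymptotic test performance the paper analyzes.

```python
import numpy as np

def sample_csbm(n, p, d, lam, mu, rng):
    """Sample a contextual SBM instance: a two-community graph plus
    Gaussian node features carrying a rank-one signal along the labels."""
    v = rng.choice([-1.0, 1.0], size=n)                    # planted community labels
    c_in, c_out = d + lam * np.sqrt(d), d - lam * np.sqrt(d)
    probs = np.where(np.outer(v, v) > 0, c_in / n, c_out / n)
    A = np.triu(rng.random((n, n)) < probs, k=1).astype(float)
    A = A + A.T                                            # symmetric adjacency, no self-loops
    u = rng.standard_normal(p) / np.sqrt(p)                # feature spike direction
    X = np.sqrt(mu / n) * np.outer(v, u) + rng.standard_normal((n, p)) / np.sqrt(p)
    return A, X, v

def train_gcn(A, X, y, reg=0.1, lr=0.2, steps=2000):
    """Fit the linear readout of a single-layer GCN (row-normalized
    aggregation) under logistic loss + ridge penalty by gradient descent."""
    n, p = X.shape
    A_hat = A + np.eye(n)                                  # add self-loops
    F = (A_hat / A_hat.sum(axis=1, keepdims=True)) @ X     # aggregated features D^{-1} A_hat X
    w = np.zeros(p)
    for _ in range(steps):
        z = F @ w
        # gradient of (1/n) sum_i log(1 + exp(-y_i z_i)) + (reg/2) ||w||^2
        grad = -F.T @ (y / (1.0 + np.exp(y * z))) / n + reg * w
        w -= lr * grad
    return F, w

rng = np.random.default_rng(0)
A, X, v = sample_csbm(n=2000, p=1000, d=10.0, lam=1.0, mu=2.0, rng=rng)
F, w = train_gcn(A, X, v)
print("train overlap with planted labels:", np.mean(np.sign(F @ w) == v))
```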