Large and complex networks are becoming increasingly prevalent in scientific applications across various domains. Although a number of models and methods exist for such networks, cross-validation on networks remains challenging due to the unique structure of network data. In this paper, we propose a general cross-validation procedure called NETCROP (NETwork CRoss-Validation using Overlapping Partitions). The key idea is to divide the original network into multiple subnetworks with a shared overlap part, producing training sets consisting of the subnetworks and a test set consisting of the node pairs between the subnetworks. This train-test split provides the basis for a network cross-validation procedure that can be applied to a wide range of model selection and parameter tuning problems for networks. The method is computationally efficient for large networks because it uses smaller subnetworks in the training step. We provide methodological details and theoretical guarantees for several model selection and parameter tuning tasks using NETCROP. Numerical results demonstrate that NETCROP performs accurate cross-validation on a diverse set of network model selection and parameter tuning problems. The results also indicate that NETCROP is computationally much faster, and often more accurate, than existing methods for network cross-validation.
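The train-test split described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the paper's actual implementation): nodes are divided into a shared overlap set plus disjoint blocks, each training subnetwork is the overlap together with one block, and the test set consists of node pairs spanning two different blocks, so no test pair appears inside any training subnetwork. The function name `netcrop_split` and the parameters `n_parts` and `overlap_frac` are illustrative assumptions.

```python
import numpy as np

def netcrop_split(n_nodes, n_parts=2, overlap_frac=0.2, seed=0):
    """Sketch of a NETCROP-style train/test split (hypothetical implementation).

    Nodes are partitioned into a shared overlap set plus n_parts disjoint
    blocks. Each training subnetwork consists of the overlap nodes together
    with one block; the test set is the set of node pairs that span two
    different blocks, which are never observed within a training subnetwork.
    """
    rng = np.random.default_rng(seed)
    nodes = rng.permutation(n_nodes)
    n_overlap = int(overlap_frac * n_nodes)
    overlap, rest = nodes[:n_overlap], nodes[n_overlap:]
    blocks = np.array_split(rest, n_parts)

    # Training subnetworks: the shared overlap plus each disjoint block.
    train_sets = [np.concatenate([overlap, block]) for block in blocks]

    # Test set: node pairs between distinct blocks.
    test_pairs = [(int(u), int(v))
                  for i in range(n_parts) for j in range(i + 1, n_parts)
                  for u in blocks[i] for v in blocks[j]]
    return train_sets, test_pairs

train_sets, test_pairs = netcrop_split(20, n_parts=2, overlap_frac=0.2)
```

In this toy setting with 20 nodes, 4 overlap nodes, and 2 blocks of 8 nodes each, every training subnetwork has 12 nodes and the test set contains the 64 cross-block pairs; fitting a model on each subnetwork and evaluating on the held-out cross pairs is what makes the procedure cheap on large networks.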