We construct and analyze a neural network two-sample test for determining whether two datasets are drawn from the same distribution (the null hypothesis) or from different distributions (the alternative hypothesis). We perform a time analysis of a neural tangent kernel (NTK) two-sample test. In particular, we derive the theoretical minimum training time needed to ensure that the NTK two-sample test detects a given deviation level between the datasets, and, similarly, the theoretical maximum training time before the test detects that deviation level. By approximating the neural network dynamics with the NTK dynamics, we extend this time analysis to the realistic neural network two-sample test generated by time-varying training dynamics and finite training samples; we carry out a similar extension for the test generated by time-varying training dynamics but trained on the population. To provide statistical guarantees, we show that the statistical power of the neural network two-sample test converges to 1 as the number of training samples and test evaluation samples tends to infinity. Additionally, we prove that the training times needed to detect the same deviation level under the null and alternative hypotheses are well separated. Finally, we present experiments showcasing a two-layer neural network two-sample test on a hard two-sample testing problem, and we plot a heatmap of the test's statistical power as a function of training time and network complexity.
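To make the setup concrete, the following is a minimal illustrative sketch of a neural network two-sample test, not the paper's exact procedure. It trains a two-layer ReLU network (NTK-style `1/sqrt(width)` scaling) to separate the two samples, uses the difference of mean network outputs on held-out points as the test statistic, and calibrates the rejection threshold with a permutation test. The function names, architecture, loss, and permutation calibration are all assumptions made for illustration; the abstract does not specify these details.

```python
import numpy as np

def train_two_layer(X, Y, width=64, steps=200, lr=0.1, seed=0):
    """Train f(x) = a^T relu(W x) / sqrt(width) by gradient descent on squared
    loss, with targets +1 on rows of X and -1 on rows of Y.
    (Hypothetical setup: architecture and loss are illustrative choices.)"""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(size=(width, d))
    a = rng.normal(size=width)
    data = np.vstack([X, Y])
    labels = np.concatenate([np.ones(len(X)), -np.ones(len(Y))])
    for _ in range(steps):
        H = np.maximum(data @ W.T, 0.0)              # hidden ReLU features
        out = H @ a / np.sqrt(width)                  # NTK-style output scaling
        resid = out - labels                          # squared-loss residual
        grad_a = H.T @ resid / (np.sqrt(width) * len(data))
        grad_W = ((resid[:, None] * (H > 0.0) * a).T @ data) / (np.sqrt(width) * len(data))
        a -= lr * grad_a
        W -= lr * grad_W
    return lambda Z: np.maximum(Z @ W.T, 0.0) @ a / np.sqrt(width)

def nn_two_sample_test(X, Y, alpha=0.05, n_perm=200, seed=0):
    """Split each sample into train/evaluation halves; the statistic is the
    difference of mean network outputs on the held-out halves, with the
    level-alpha threshold estimated by permuting the pooled evaluation points."""
    rng = np.random.default_rng(seed)
    n = len(X) // 2
    f = train_two_layer(X[:n], Y[:n], seed=seed)
    Xe, Ye = X[n:], Y[n:]
    stat = f(Xe).mean() - f(Ye).mean()
    pooled = np.vstack([Xe, Ye])
    perm_stats = []
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        A, B = pooled[idx[:len(Xe)]], pooled[idx[len(Xe):]]
        perm_stats.append(f(A).mean() - f(B).mean())
    threshold = np.quantile(perm_stats, 1.0 - alpha)
    return stat, threshold, bool(stat > threshold)
```

In this sketch, "training time" corresponds to the number of gradient steps (`steps`), which is the quantity the time analysis above bounds: too few steps and the network cannot yet detect the deviation; the permutation calibration controls the false rejection rate under the null.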