Scaling laws describing the dependence of neural network performance on the amount of training data, the compute spent, and the network size have emerged across a wide variety of machine learning tasks and datasets. In this work, we systematically investigate these scaling laws in the context of amplitude surrogates for particle physics. We show that the scaling coefficients are connected to the number of external particles of the process. Our results demonstrate that scaling laws are a useful tool for achieving desired precision targets.
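As a minimal illustration of the kind of power-law fit such scaling laws involve, the sketch below estimates a scaling exponent from synthetic loss-versus-dataset-size data via linear regression in log-log space. The functional form L(D) = a · D^(-α) and all numeric values are illustrative assumptions, not results from this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset sizes and losses following L(D) = a * D^(-alpha).
# These values are hypothetical, chosen only to demonstrate the fit.
D = np.logspace(3, 6, 12)          # dataset sizes from 1e3 to 1e6
a_true, alpha_true = 2.0, 0.35
L = a_true * D ** (-alpha_true) * np.exp(rng.normal(0.0, 0.01, D.size))

# A power law is linear in log-log space: log L = log a - alpha * log D,
# so a degree-1 polynomial fit recovers the exponent and prefactor.
slope, intercept = np.polyfit(np.log(D), np.log(L), 1)
alpha_hat, a_hat = -slope, np.exp(intercept)

print(f"alpha ~ {alpha_hat:.3f}, a ~ {a_hat:.3f}")
```

In practice the measured losses would come from training runs at different dataset sizes; the log-log regression is the standard way to read off a scaling exponent from such data.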