Modern machine learning models are typically trained via multi-pass stochastic gradient descent (SGD) with small batch sizes, and understanding their dynamics in high dimensions is of great interest. However, an analytical framework describing the high-dimensional asymptotic behavior of small-batch, multi-pass SGD for nonlinear models has been missing. In this study, we address this gap by analyzing the high-dimensional dynamics of a stochastic differential equation called a \emph{stochastic gradient flow} (SGF), which approximates multi-pass SGD in the small-batch regime. In the limit where the number of data samples $n$ and the dimension $d$ grow proportionally, we derive a closed system of low-dimensional, continuous-time equations and prove that it characterizes the asymptotic distribution of the SGF parameters. Our theory is based on dynamical mean-field theory (DMFT) and applies to a wide range of models, encompassing generalized linear models and two-layer neural networks. We further show that the resulting DMFT equations recover several existing high-dimensional descriptions of SGD dynamics as special cases, thereby providing a unifying perspective on prior frameworks such as online SGD and high-dimensional linear regression. Our proof builds on the existing DMFT technique for gradient flow and extends it to handle the stochasticity of the SGF using tools from stochastic calculus.
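To make the SGD-to-SGF correspondence concrete, the following minimal sketch compares small-batch multi-pass SGD on a least-squares problem against an Euler--Maruyama discretization of an SGF of the commonly used form $\mathrm{d}\theta_t = -\nabla R_n(\theta_t)\,\mathrm{d}t + \sqrt{\eta}\,C(\theta_t)^{1/2}\,\mathrm{d}B_t$, where $R_n$ is the empirical risk and $C$ is the size-$b$ minibatch gradient noise covariance. The model, the noise scaling, and all parameter values are illustrative assumptions; the abstract does not specify the exact SDE.

```python
# Minimal sketch (assumed SGF form, not the paper's exact construction):
# compare small-batch multi-pass SGD with an Euler-Maruyama simulation of
#   d theta_t = -grad R_n(theta_t) dt + sqrt(eta) C(theta_t)^{1/2} dB_t.
import numpy as np

rng = np.random.default_rng(0)
n, d = 400, 200             # proportional regime: n/d fixed at 2 (illustrative)
eta, b, steps = 0.5, 1, 2000  # step size, batch size, ~5 passes (illustrative)

# Synthetic least-squares data (illustrative model choice).
X = rng.standard_normal((n, d)) / np.sqrt(d)
theta_star = rng.standard_normal(d)
y = X @ theta_star + 0.1 * rng.standard_normal(n)

def risk(theta):
    return 0.5 * np.mean((X @ theta - y) ** 2)

def run_sgd(theta0, steps):
    """Multi-pass SGD: resample a size-b minibatch from the fixed dataset each step."""
    theta = theta0.copy()
    for _ in range(steps):
        idx = rng.choice(n, size=b, replace=False)
        g = X[idx].T @ (X[idx] @ theta - y[idx]) / b
        theta -= eta * g
    return theta

def run_sgf(theta0, steps, dt):
    """Euler-Maruyama for the assumed SGF; dt = eta matches one SGD step per unit."""
    theta = theta0.copy()
    for _ in range(steps):
        r = X @ theta - y
        G = X * r[:, None]           # per-sample gradients, shape (n, d)
        g = G.mean(axis=0)           # full-batch gradient of R_n
        Gc = (G - g) / np.sqrt(n)    # centered: Gc.T @ Gc = per-sample covariance
        xi = rng.standard_normal(n)
        noise = np.sqrt(eta / b) * (Gc.T @ xi)   # sample of N(0, eta * C)
        theta += -dt * g + np.sqrt(dt) * noise
    return theta

theta0 = np.zeros(d)
print("risk after SGD:", risk(run_sgd(theta0, steps)))
print("risk after SGF:", risk(run_sgf(theta0, steps, dt=eta)))
```

With matched time scales ($\mathrm{d}t = \eta$, so SGF time $T = \eta \times \text{steps}$), the two printed risks should track each other, illustrating the regime in which the SGF is claimed to approximate multi-pass SGD.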
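For orientation, the closed system referred to above is of the self-consistent type familiar from gradient-flow DMFT, in which the $d$-dimensional dynamics is summarized by a one-dimensional \emph{effective process} with memory. A schematic form, following the gradient-flow DMFT literature rather than this paper's exact equations (which must additionally account for the SGF noise), is
\[
\dot h(t) = -\hat\lambda(t)\,h(t) + \int_0^t M_R(t,s)\,h(s)\,\mathrm{d}s + \xi(t),
\qquad
\mathbb{E}\!\left[\xi(t)\,\xi(s)\right] = M_C(t,s),
\]
where the memory kernel $M_R$, the noise correlation $M_C$, and the multiplier $\hat\lambda$ are in turn determined by statistics of the law of $h$; this self-consistent closure is what makes the description low-dimensional even as $n, d \to \infty$.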