Scaling network depth has been a central driver behind the success of modern foundation models, yet recent investigations suggest that deep layers are often underutilized. This paper revisits the default mechanism for deepening neural networks, namely residual connections, from an optimization perspective. Through rigorous analysis, we prove that the layout of residual connections can fundamentally shape convergence behavior and even induce an exponential gap in convergence rates. Prompted by this insight, we introduce adaptive neural connection reassignment (ANCRe), a principled and lightweight framework that parameterizes residual connectivity and learns it from data. ANCRe adaptively reassigns residual connections with negligible computational and memory overhead ($<1\%$) while enabling more effective utilization of network depth. Extensive experiments on pre-training large language models, diffusion models, and deep ResNets demonstrate consistently faster convergence, improved performance, and greater depth efficiency compared with conventional residual connections.
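To make the idea concrete, below is a minimal PyTorch sketch of *one* way residual connectivity could be parameterized and learned: each layer holds a small logit vector (here called `skip_logits`, a hypothetical name) and softly selects which earlier output serves as its skip source. The abstract does not disclose ANCRe's actual parameterization, so this sketch is an illustrative assumption rather than the paper's method; it does reflect the stated cost profile, since the only added state is one short logit vector per layer.

```python
import torch
import torch.nn as nn

class SoftSkipNetwork(nn.Module):
    """Illustrative only: each layer learns *where* its residual connection
    comes from, via a softmax over all earlier outputs. This is NOT the
    paper's ANCRe mechanism, whose details are not given in the abstract."""

    def __init__(self, dim: int, depth: int):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
            for _ in range(depth)
        )
        # Layer i may draw its skip input from any of outputs 0..i,
        # where outputs[0] is the network input. One small logit vector
        # per layer keeps the parameter overhead negligible.
        self.skip_logits = nn.ParameterList(
            nn.Parameter(torch.zeros(i + 1)) for i in range(depth)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        outputs = [x]
        for block, logits in zip(self.blocks, self.skip_logits):
            weights = torch.softmax(logits, dim=0)  # soft choice of skip source
            skip = sum(w * h for w, h in zip(weights, outputs))
            outputs.append(block(outputs[-1]) + skip)  # learned skip source
        return outputs[-1]
```

With the zero initialization above, each layer initially averages all earlier outputs; initializing the last logit of each vector to a large value instead recovers the conventional residual connection exactly, so learning can start from the standard baseline and reassign connections only where the data favors it.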