We present hyper-connections, a simple yet effective method that can serve as an alternative to residual connections. This approach specifically addresses common drawbacks observed in residual connection variants, such as the seesaw effect between gradient vanishing and representation collapse. Theoretically, hyper-connections allow the network to adjust the strength of connections between features at different depths and to dynamically rearrange layers. We conduct experiments focusing on the pre-training of large language models, including both dense and sparse models, where hyper-connections show significant performance improvements over residual connections. Additional experiments on vision tasks demonstrate similar improvements. We anticipate that this method will be broadly applicable and beneficial across a wide range of AI problems.
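To make the idea concrete, the following is a minimal numerical sketch of a static variant: instead of one residual stream, the network keeps n parallel hidden rows, and learnable weights control how those rows feed the layer input, mix with each other, and receive the layer output. The expansion rate `n`, the weights `alpha`, `beta`, `gamma`, and the toy `layer` function are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W):
    # Stand-in for a transformer block: any function of the layer input.
    return np.tanh(x @ W)

n, d = 4, 8  # expansion rate n, hidden width d (illustrative values)

# Connection weights (learnable in the real method; fixed here for the sketch):
alpha = np.full(n, 1.0 / n)  # depth connections: mix the n rows into the layer input
beta = np.eye(n)             # width connections: mix the n rows with one another
gamma = np.ones(n)           # how strongly the layer output is written to each row

W = rng.normal(scale=0.1, size=(d, d))
H = rng.normal(size=(n, d))  # n parallel hidden rows instead of a single residual stream

x = alpha @ H                      # layer input: weighted sum over the n rows
y = layer(x, W)                    # layer output
H = beta @ H + np.outer(gamma, y)  # residual update generalized across all n rows
```

With n = 1, alpha = beta = gamma = 1, the update reduces to the ordinary residual connection H = H + f(H); larger n and learned weights are what allow the network to reweight and rearrange connections across depth.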