We present Hierarchical Residual Networks (HiResNets), deep convolutional neural networks with long-range residual connections between layers at different hierarchical levels. HiResNets draw inspiration from the organization of the mammalian brain, replicating the direct connections from subcortical areas to the entire cortical hierarchy. We show that including hierarchical residuals in several architectures, including ResNets, yields higher accuracy and faster learning. A detailed analysis of our models reveals that they perform hierarchical compositionality by learning feature maps relative to the compressed representations provided by the skip connections.
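The core idea, long-range residuals that feed a projection of a low-level representation directly into every level of the hierarchy, can be sketched as follows. This is a minimal illustration of the general pattern, not the paper's implementation: the function names, the choice of a linear projection per level, and the tiny dimensions are all assumptions made for clarity.

```python
# Hedged sketch (illustrative, not the authors' code): each hierarchical
# level computes its usual transformation f(h), then adds a projection P(x)
# of the raw input x -- analogous to direct subcortical-to-cortex connections.

def matvec(w, x):
    """Multiply a weight matrix (list of rows) by a vector."""
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in w]

def relu(v):
    return [max(0.0, vi) for vi in v]

def vadd(a, b):
    return [ai + bi for ai, bi in zip(a, b)]

def hiresnet_forward(x, level_weights, input_projections):
    """level_weights[i] parameterizes level i's transformation;
    input_projections[i] maps the raw input x directly to level i's width
    (the hierarchical residual)."""
    h = x
    for w, p in zip(level_weights, input_projections):
        h = vadd(relu(matvec(w, h)), matvec(p, x))  # f_i(h) + P_i(x)
    return h

# Tiny deterministic example: 2-d input, two levels, identity weights.
identity = [[1.0, 0.0], [0.0, 1.0]]
out = hiresnet_forward([1.0, 2.0], [identity, identity], [identity, identity])
print(out)  # [3.0, 6.0]
```

With identity weights the residual contribution is easy to trace: level 1 produces relu(x) + x = [2, 4], and level 2 produces relu([2, 4]) + x = [3, 6], so the raw input re-enters at every level rather than only at the first.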