Deep neural networks (DNNs) have achieved great breakthroughs in many fields such as image classification and natural language processing. However, executing DNNs requires massive numbers of multiply-accumulate (MAC) operations in hardware, incurring large power consumption. To address this challenge, we propose a novel digital MAC design based on encoding. In this new design, the multipliers are replaced by simple logic gates that represent the results in a wide bit representation. The outputs of the new multipliers are added by bit-wise weighted accumulation, and the accumulation results are compatible with existing computing platforms that accelerate neural networks. Since the multiplication function is replaced by a simple logic representation, the critical paths in the resulting circuits become much shorter. Correspondingly, the pipelining stages and intermediate registers used to store partial sums in the MAC array can be reduced, leading to a significantly smaller area as well as better power efficiency. The proposed design has been synthesized and verified with ResNet18-Cifar10, ResNet20-Cifar100, ResNet50-ImageNet, MobileNetV2-Cifar10, MobileNetV2-Cifar100, and EfficientNetB0-ImageNet. The experimental results confirm a reduction of circuit area by up to 48.79% and a reduction of the power consumed by DNN execution by up to 64.41%, while the accuracy of the neural networks is still well maintained. The open-source code of this work is available on GitHub at https://github.com/Bo-Liu-TUM/EncodingNet/.
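To make the idea of bit-wise weighted accumulation concrete, the following is a minimal sketch, not the paper's actual encoding or circuit: each product is held as a wide bit-vector, the accumulator counts set bits per bit position across all products, and a single weighted sum over the position counts recovers the total. The function name, bit width, and toy inputs are illustrative assumptions.

```python
# Hypothetical sketch of bit-wise weighted accumulation (illustrative only,
# not the paper's encoding): instead of adding full product values, count the
# 1-bits in each bit position across all products, then take one weighted sum.

def bitwise_weighted_accumulate(products, width=16):
    # column_counts[i] = number of products whose bit i is set
    column_counts = [0] * width
    for p in products:
        for i in range(width):
            column_counts[i] += (p >> i) & 1
    # sum_i (2^i * count_i) equals the sum of all products
    return sum((1 << i) * c for i, c in enumerate(column_counts))

# Toy MAC example: products are precomputed weight*activation values.
products = [3 * 5, 2 * 7, 4 * 4]
assert bitwise_weighted_accumulate(products) == sum(products)
```

In hardware terms, the per-position counting corresponds to cheap population counters on each bit column, and the single weighted sum replaces a full adder tree over wide partial products, which is what shortens the critical path in the MAC array.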