The use of neural networks in edge devices is increasing, which introduces new security challenges related to the confidentiality of the neural networks. As edge devices often allow physical access, attacks targeting the hardware, such as side-channel analysis, must be considered. To enhance the performance of neural network inference, hardware accelerators are commonly employed. This work investigates the influence of parallel processing within such accelerators on correlation-based side-channel attacks that exploit power consumption. The focus is on neurons that belong to the same fully-connected layer, which run in parallel and simultaneously process the same input value. The theoretical impact of concurrent multiply-and-accumulate operations on the overall power consumption is evaluated, as well as the success rate of correlation power analysis. Based on the observed behavior, equations are derived that describe how the correlation decreases with increasing levels of parallelism. The applicability of these equations is validated using a vector-multiplication unit implemented on an FPGA.
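To make the claimed effect concrete, the following is a minimal simulation sketch (not the paper's method) of correlation power analysis against one multiply-and-accumulate result when several neurons of the same layer process the same input in parallel. It assumes a Hamming-weight leakage model on byte-wide MAC results and additive Gaussian measurement noise; the weight values, trace counts, and noise level are hypothetical illustration parameters. Under these assumptions the untargeted neurons act as algorithmic noise, so the observed correlation for the correct weight guess shrinks roughly as the square root of the number of parallel neurons, consistent with the qualitative behavior described above (the paper's derived equations may differ in detail).

```python
import numpy as np

rng = np.random.default_rng(0)

def hw(x):
    """Hamming weight of the low byte of each value."""
    return np.unpackbits(x.astype(np.uint8)[:, None], axis=1).sum(axis=1)

def cpa_correlation(n_parallel, n_traces=5000, noise_std=1.0):
    """Simulate CPA on one MAC result while n_parallel neurons
    process the same input value simultaneously."""
    inputs = rng.integers(0, 256, size=n_traces, dtype=np.uint8)
    # Hypothetical secret weight of the targeted neuron.
    weight = 173
    target = (inputs.astype(np.uint16) * weight) & 0xFF
    leak = hw(target).astype(float)
    # The remaining n_parallel - 1 neurons leak at the same instant,
    # but their MAC results are uncorrelated with the hypothesis:
    # they contribute algorithmic noise.
    for _ in range(n_parallel - 1):
        w_other = rng.integers(0, 256)
        other = (inputs.astype(np.uint16) * w_other) & 0xFF
        leak += hw(other)
    # Simulated power traces: summed leakage plus measurement noise.
    traces = leak + rng.normal(0.0, noise_std, size=n_traces)
    # Attacker's prediction for the correct weight guess.
    hypothesis = hw(target).astype(float)
    return np.corrcoef(hypothesis, traces)[0, 1]

for n in (1, 2, 4, 8, 16):
    print(f"n_parallel={n:2d}  correlation ~ {cpa_correlation(n):.3f}")
```

Running the sketch prints a correlation that falls as the degree of parallelism grows, illustrating why an attacker needs more traces to distinguish the correct weight hypothesis when more neurons share the same clock cycle.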