Memristors offer significant advantages as in-memory computing devices owing to their non-volatility, low power consumption, and history-dependent conductance. These properties are particularly valuable for neuromorphic neural-network circuits, which are currently constrained by the Von Neumann architecture and high energy demands. This study evaluates the feasibility of memristor-based in-memory processing by constructing and training three digital convolutional neural networks on the MNIST, CIFAR10, and CIFAR100 datasets, then converting them into memristive systems with MemTorch. Simulations under ideal conditions showed minimal accuracy losses of roughly 1% during inference. The study also analyzed how tile size and memristor-specific non-idealities affect performance, highlighting the practical implications of integrating memristors into neuromorphic computing systems. This exploration of memristive neural-network applications underscores the potential of MemTorch in advancing neuromorphic architectures.
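As a rough, hypothetical illustration of the principle the abstract relies on (not code from the study, and independent of the MemTorch API): a memristive crossbar computes a matrix-vector product in place by storing each weight as a device conductance, with output currents given by Ohm's and Kirchhoff's laws; a device non-ideality such as conductance variability can be modeled as a random perturbation of the stored conductances.

```python
import random

def crossbar_mvm(weights, inputs, noise_sigma=0.0, seed=0):
    """Simulate an analog crossbar matrix-vector multiply.

    weights[i][j] -- weight encoded as the conductance G of the device
                     at row i, column j (illustrative; real designs map
                     signed weights onto pairs of positive conductances).
    inputs[i]     -- voltage applied to row i.
    Column current: I[j] = sum_i V[i] * G[i][j].
    noise_sigma   -- relative Gaussian conductance noise, a simple stand-in
                     for device-to-device variability (a non-ideality).
    """
    rng = random.Random(seed)
    rows, cols = len(weights), len(weights[0])
    currents = [0.0] * cols
    for i in range(rows):
        for j in range(cols):
            g = weights[i][j] * (1.0 + rng.gauss(0.0, noise_sigma))
            currents[j] += inputs[i] * g
    return currents

# Ideal devices: exact matrix-vector product.
ideal = crossbar_mvm([[1.0, 2.0], [3.0, 4.0]], [1.0, 1.0])
# ideal == [4.0, 6.0]

# Non-ideal devices: the same product, perturbed by conductance noise.
noisy = crossbar_mvm([[1.0, 2.0], [3.0, 4.0]], [1.0, 1.0], noise_sigma=0.05)
```

Because every multiply-accumulate happens where the weight is stored, no weight movement between memory and processor is needed, which is the advantage over the Von Neumann architecture that the study quantifies at network scale.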