Probabilistic bits (p-bits) have recently been employed in neural networks (NNs) as stochastic neurons with sigmoidal probabilistic activation functions. Nonetheless, there remains a wealth of other probabilistic activation functions that has yet to be explored. Here we re-engineer the p-bit by decoupling its stochastic signal path from its input data path, giving rise to a modular p-bit that enables the realization of probabilistic neurons (p-neurons) with a range of configurable probabilistic activation functions, including probabilistic versions of the widely used Logistic Sigmoid, Tanh, and Rectified Linear Unit (ReLU) activation functions. We present spintronic (CMOS + sMTJ) designs that exhibit wide and tunable probabilistic ranges of operation. Finally, we experimentally implement digital-CMOS versions on an FPGA with stochastic unit sharing, and demonstrate an order-of-magnitude (10x) saving in required hardware resources compared to conventional digital p-bit implementations.
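The configurable probabilistic activations described above can be illustrated with a toy software model. The sketch below is a minimal assumption of the behavior: each p-neuron emits a binary sample whose firing probability follows the named activation function, rescaled or clipped to a valid probability in [0, 1]. The function name `p_neuron` and its interface are hypothetical and chosen for illustration; the paper's actual p-neurons realize this behavior in CMOS + sMTJ hardware, not software.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_neuron(x, activation="sigmoid"):
    """Toy probabilistic neuron (illustrative only): returns a binary
    sample whose probability of firing follows the chosen activation,
    mapped into [0, 1]."""
    if activation == "sigmoid":
        p = 1.0 / (1.0 + np.exp(-x))       # Logistic Sigmoid is already a probability
    elif activation == "tanh":
        p = 0.5 * (1.0 + np.tanh(x))       # Tanh rescaled from [-1, 1] to [0, 1]
    elif activation == "relu":
        p = float(np.clip(x, 0.0, 1.0))    # ReLU clipped to saturate at 1
    else:
        raise ValueError(f"unknown activation: {activation}")
    return 1.0 if rng.random() < p else 0.0

# Averaging many stochastic samples recovers the underlying activation curve:
samples = [p_neuron(0.0, "sigmoid") for _ in range(10_000)]
print(np.mean(samples))  # ~0.5, since sigmoid(0) = 0.5
```

Averaging repeated samples at a fixed input estimates the deterministic activation value, which is the sense in which these neurons are probabilistic versions of their classical counterparts.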