Chip design relies heavily on generating Boolean circuits, such as AND-Inverter Graphs (AIGs), from functional descriptions like truth tables. While recent advances in deep learning have aimed to accelerate circuit design, these efforts have mostly targeted tasks other than synthesis, and traditional heuristic methods have plateaued. In this paper, we introduce ShortCircuit, a novel transformer-based architecture that exploits the structural properties of AIGs and performs efficient space exploration. In contrast to prior approaches that attempt end-to-end generation of logic circuits with deep networks, ShortCircuit employs a two-phase process combining supervised and reinforcement learning to enhance generalization to unseen truth tables. We also propose an AlphaZero variant to handle the doubly exponential state space and the sparsity of rewards, enabling the discovery of near-optimal designs. To evaluate the generative performance of our trained model, we extract 500 truth tables from a benchmark set of 20 real-world circuits. ShortCircuit successfully generates AIGs for 84.6% of the 8-input test truth tables, and outperforms the state-of-the-art logic synthesis tool, ABC, by 14.61% in terms of circuit size.
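To make the problem setting concrete, the following is a minimal sketch (not the paper's implementation) of how an 8-input truth table and a single AIG node over it can be represented. Each wire's function is encoded as a 256-bit integer, with one bit per input assignment; an AIG node is an AND of two fan-ins, each optionally inverted.

```python
# A minimal, illustrative encoding of 8-input truth tables and AIG nodes.
# All names here are hypothetical and for exposition only.

N_INPUTS = 8
MASK = (1 << (1 << N_INPUTS)) - 1  # 256-bit mask: one bit per input pattern

def input_tt(i):
    """Truth table of primary input x_i: bit p equals bit i of the pattern p."""
    return sum(((p >> i) & 1) << p for p in range(1 << N_INPUTS))

def aig_and(tt_a, tt_b, neg_a=False, neg_b=False):
    """One AIG node: the AND of two fan-ins, each optionally inverted."""
    a = (~tt_a & MASK) if neg_a else tt_a
    b = (~tt_b & MASK) if neg_b else tt_b
    return a & b

# Example: the function x0 AND (NOT x1), expressed as a single AIG node.
tt = aig_and(input_tt(0), input_tt(1), neg_b=True)
```

Synthesis from a truth table then amounts to searching for a small sequence of such nodes whose final truth table matches the target, which is the search space the abstract's AlphaZero-style exploration operates over.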