Chip design relies heavily on generating Boolean circuits, such as AND-Inverter Graphs (AIGs), from functional descriptions like truth tables. This generation step is a key process in logic synthesis, a primary stage of chip design. While recent advances in deep learning have aimed to accelerate circuit design, these efforts have mostly focused on tasks other than synthesis, and traditional heuristic methods have plateaued. In this paper, we introduce ShortCircuit, a novel transformer-based architecture that leverages the structural properties of AIGs and performs efficient exploration of the design space. In contrast to prior approaches that attempt end-to-end generation of logic circuits with deep networks, ShortCircuit employs a two-phase process combining supervised and reinforcement learning to improve generalization to unseen truth tables. We also propose an AlphaZero variant to handle the doubly exponential state space and the sparsity of rewards, enabling the discovery of near-optimal designs. To evaluate the generative performance of our model, we extract 500 truth tables from a set of 20 real-world circuits. ShortCircuit successfully generates AIGs for $98\%$ of the 8-input test truth tables and outperforms the state-of-the-art logic synthesis tool, ABC, by $18.62\%$ in terms of circuit size.