Symbolic regression seeks to uncover physical laws from experimental data by searching for closed-form expressions, which is an important task in AI-driven scientific discovery. Yet the exponential growth of the search space of expressions renders the task computationally challenging. A promising yet underexplored direction for reducing the search space and accelerating training lies in *symbolic equivalence*: many expressions, although syntactically different, define the same function -- for example, $\log(x_1^2x_2^3)$, $\log(x_1^2)+\log(x_2^3)$, and $2\log(x_1)+3\log(x_2)$. Existing algorithms treat such variants as distinct outputs, leading to redundant exploration and slow learning. We introduce EGG-SR, a unified framework that integrates symbolic equivalence into a class of modern symbolic regression methods, including Monte Carlo Tree Search (MCTS), Deep Reinforcement Learning (DRL), and Large Language Models (LLMs). EGG-SR compactly represents equivalent expressions through the proposed EGG module (via equality graphs), accelerating learning by: (1) pruning redundant subtree exploration in EGG-MCTS, (2) aggregating rewards across equivalent generated sequences in EGG-DRL, and (3) enriching feedback prompts in EGG-LLM. Theoretically, we show the benefit of embedding EGG into learning: it tightens the regret bound of MCTS and reduces the variance of the DRL gradient estimator. Empirically, EGG-SR consistently enhances a class of symbolic regression models across several benchmarks, discovering more accurate expressions within the same time limit. The project page is at: https://nan-jiang-group.github.io/egg-sr.
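The symbolic-equivalence phenomenon the abstract describes can be illustrated with a small sketch. The snippet below uses SymPy purely as a stand-in for equivalence checking (the paper's EGG module uses equality graphs, not SymPy): the three example expressions from the abstract, though syntactically distinct, rewrite to the same canonical form, which is why treating them as separate candidates wastes search effort.

```python
import sympy as sp

# Positive symbols so logarithm rewrite rules apply unconditionally.
x1, x2 = sp.symbols("x1 x2", positive=True)

# Three syntactically different expressions of the same function.
e1 = sp.log(x1**2 * x2**3)
e2 = sp.log(x1**2) + sp.log(x2**3)
e3 = 2 * sp.log(x1) + 3 * sp.log(x2)

# expand_log rewrites each into the canonical sum-of-logs form,
# exposing the equivalence that a search algorithm would otherwise miss.
forms = [sp.expand_log(e) for e in (e1, e2, e3)]
assert forms[0] == forms[1] == forms[2]
```

A search algorithm that canonicalizes candidates this way explores each function once rather than once per syntactic variant; an e-graph achieves the same effect more generally by storing all equivalent rewrites in one shared structure.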