Certifying nonnegativity of polynomials is a well-known NP-hard problem with direct applications spanning non-convex optimization, control, robotics, and beyond. A sufficient condition for nonnegativity is the Sum of Squares (SOS) property, i.e., the polynomial can be written as a sum of squares of other polynomials. In practice, however, certifying the SOS criterion remains computationally expensive and typically involves solving a Semidefinite Program (SDP) whose size grows quadratically with the size of the monomial basis of the SOS expression; hence, various methods for reducing the size of the monomial basis have been proposed. In this work, we introduce the first learning-augmented algorithm for certifying the SOS criterion. To this end, we train a Transformer model that predicts an almost-minimal monomial basis for a given polynomial, thereby drastically reducing the size of the corresponding SDP. Our overall methodology comprises three key components: efficient generation of a training dataset of over 100 million SOS polynomials, design and training of the corresponding Transformer architecture, and a systematic fallback mechanism that guarantees correct termination, which we analyze theoretically. We validate our approach on over 200 benchmark datasets, achieving speedups of over $100\times$ compared to state-of-the-art solvers and enabling the solution of instances where competing approaches fail. Our findings provide novel insights towards transforming the practical scalability of SOS programming.
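To make the underlying mechanism concrete: a polynomial $p$ is SOS over a monomial basis $m(x)$ exactly when there exists a positive semidefinite Gram matrix $Q$ with $p(x) = m(x)^\top Q\, m(x)$; in practice $Q$ is found by an SDP solver, and the SDP's matrix dimension equals the basis size, which is why shrinking the basis pays off. The following minimal sketch (not from the paper; the polynomial $p(x) = x^4 + 2x^2 + 1$ and the hand-constructed Gram matrix are illustrative choices) verifies such a certificate for the basis $m(x) = [1, x, x^2]$:

```python
import numpy as np

# Certificate that p(x) = x^4 + 2x^2 + 1 is SOS: a PSD Gram matrix Q
# with p(x) = m(x)^T Q m(x) for the monomial basis m(x) = [1, x, x^2].
# (In practice Q is produced by an SDP solver; here it is given directly,
# reflecting p(x) = (x^2 + 1)^2.)
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# Positive semidefiniteness: all eigenvalues nonnegative (up to tolerance).
eigvals = np.linalg.eigvalsh(Q)
assert eigvals.min() >= -1e-9

# Coefficient identity p(x) = m(x)^T Q m(x), spot-checked at sample points.
for x in np.linspace(-2.0, 2.0, 9):
    m = np.array([1.0, x, x**2])
    p = x**4 + 2 * x**2 + 1
    assert abs(m @ Q @ m - p) < 1e-9

print("SOS certificate verified")
```

Note that the SDP variable here is a $3 \times 3$ matrix because the basis has three monomials; a predictor that prunes unnecessary monomials from the basis directly shrinks this matrix, which is the leverage point the abstract describes.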