Ensuring that the outputs of neural networks satisfy specific constraints is crucial for applying neural networks to real-life decision-making problems. In this paper, we consider making a batch of neural network outputs satisfy bounded and general linear constraints. We first reformulate the neural network output projection problem as an entropy-regularized linear programming problem. We show that, by the duality theorem, this problem can be equivalently transformed into an unconstrained convex optimization problem with a Lipschitz continuous gradient. Then, based on an accelerated gradient descent algorithm with numerical performance enhancement, we present our architecture, GLinSAT, to solve the problem. To the best of our knowledge, this is the first general linear satisfiability layer in which all the operations are differentiable and matrix-factorization-free. Although backpropagation can be performed explicitly via the automatic-differentiation mechanism, GLinSAT also provides an alternative approach that computes derivatives by implicit differentiation of the optimality conditions. Experimental results on constrained traveling salesman problems, partial graph matching with outliers, predictive portfolio allocation and power system unit commitment demonstrate the advantages of GLinSAT over existing satisfiability layers. Our implementation is available at \url{https://github.com/HunterTracer/GLinSAT}.
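To make the dual approach concrete, the following is a minimal illustrative sketch (not the GLinSAT implementation): it projects a score vector onto a box-constrained linear-equality set by maximizing the smooth concave dual of an entropy-regularized problem with Nesterov-accelerated gradient ascent. The binary entropy over $[0,1]$, the step size, and the iteration count are illustrative assumptions; the only operations are matrix-vector products, so no matrix factorization is needed.

```python
import numpy as np

def entropy_projection(c, A, b, eps=0.1, iters=500):
    """Approximately solve
        min_x  -c^T x + eps * sum_i [x_i log x_i + (1-x_i) log(1-x_i)]
        s.t.   A x = b,  0 <= x <= 1
    by Nesterov-accelerated gradient ascent on the unconstrained concave dual.
    The primal solution at dual point y is x_i = sigmoid((c_i - (A^T y)_i)/eps),
    and the dual gradient is A x(y) - b.  Only matvecs with A / A^T are used."""
    m = A.shape[0]
    lam = np.zeros(m)
    lam_prev = np.zeros(m)
    t = 1.0
    # sigmoid'(z) <= 1/4, so the dual gradient is (||A||_2^2 / (4 eps))-Lipschitz
    L = np.linalg.norm(A, 2) ** 2 / (4 * eps)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))
    for _ in range(iters):
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = lam + (t - 1) / t_next * (lam - lam_prev)  # momentum extrapolation
        x = sigmoid((c - A.T @ y) / eps)               # primal solution at y
        lam_prev, lam, t = lam, y + (A @ x - b) / L, t_next  # ascent step
    return sigmoid((c - A.T @ lam) / eps)

# toy usage: push 4 scores into [0,1] while making them sum to 2
c = np.array([2.0, 1.0, -1.0, 0.5])
A = np.ones((1, 4))
b = np.array([2.0])
x = entropy_projection(c, A, b)  # x is in (0,1)^4 with sum(x) close to 2
```

Because every step above is a composition of smooth elementary operations, gradients with respect to `c` can be obtained either by unrolling the iterations through automatic differentiation or, as the abstract notes, by implicitly differentiating the optimality condition at the converged solution.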