We introduce an output layer for neural networks that ensures satisfaction of convex constraints. Our approach, $Π$net, leverages operator splitting for rapid and reliable projections in the forward pass, and the implicit function theorem for backpropagation. We deploy $Π$net as a feasible-by-design optimization proxy for parametric constrained optimization problems and obtain modest-accuracy solutions faster than traditional solvers when solving a single problem, and significantly faster for a batch of problems. We surpass state-of-the-art learning approaches by orders of magnitude in terms of training time, solution quality, and robustness to hyperparameter tuning, while maintaining similar inference times. Finally, we tackle multi-vehicle motion planning with non-convex trajectory preferences and provide $Π$net as a GPU-ready package implemented in JAX.
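To make the idea concrete, here is a minimal, hypothetical sketch of such a projection output layer in JAX: raw network outputs are projected onto a convex set (here, a box intersected with a hyperplane) so the layer's output is feasible by construction. This stand-in uses simple alternating projections rather than the operator-splitting scheme and implicit-function-theorem backpropagation that $Π$net actually employs; all names below are illustrative, not the package's API.

```python
import jax
import jax.numpy as jnp

def project_box(x, l, u):
    # Euclidean projection onto the box {x : l <= x <= u}
    return jnp.clip(x, l, u)

def project_hyperplane(x, a, c):
    # Euclidean projection onto the hyperplane {x : a @ x = c}
    return x - ((a @ x - c) / (a @ a)) * a

def projection_layer(y_raw, l, u, a, c, iters=200):
    """Alternating projections onto box ∩ hyperplane (illustrative only;
    Pi-net uses operator splitting with implicit differentiation)."""
    def body(_, x):
        return project_hyperplane(project_box(x, l, u), a, c)
    x = jax.lax.fori_loop(0, iters, body, y_raw)
    # Final box projection guarantees the bound constraints exactly;
    # the hyperplane constraint then holds up to the iteration tolerance.
    return project_box(x, l, u)

# Example: constrain a raw output to 0 <= x <= 1 with sum(x) = 1.5.
l, u = jnp.zeros(3), jnp.ones(3)
a, c = jnp.ones(3), jnp.array(1.5)
y = projection_layer(jnp.array([2.0, -1.0, 0.7]), l, u, a, c)
```

For two closed convex sets with nonempty intersection, alternating projections converge to a feasible point, which is why the toy layer above lands (approximately) in the constraint set; $Π$net's operator splitting targets the same goal with stronger speed and reliability guarantees.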