Branch-and-bound (BaB) is among the most effective methods for neural network (NN) verification. However, existing works on BaB have mostly focused on NNs with piecewise linear activations, especially ReLU networks. In this paper, we develop a general framework, named GenBaB, to conduct BaB for general nonlinearities in general computational graphs based on linear bound propagation. To decide which neuron to branch on, we design a new branching heuristic that leverages linear bounds as shortcuts to efficiently estimate the potential improvement after branching. To decide nontrivial branching points for general nonlinear functions, we propose to optimize branching points offline, which can then be efficiently leveraged during verification via a lookup table. We demonstrate the effectiveness of GenBaB on verifying a wide range of NNs, including networks with activation functions such as Sigmoid, Tanh, Sine and GeLU, as well as networks involving multi-dimensional nonlinear operations such as multiplications in LSTMs and Vision Transformers. Our framework also allows the verification of general nonlinear computational graphs and enables verification applications beyond simple neural networks, particularly for AC Optimal Power Flow (ACOPF). GenBaB is part of the latest $\alpha,\!\beta$-CROWN, the winner of the 4th International Verification of Neural Networks Competition (VNN-COMP 2023).