It has been shown that a neural network's Lipschitz constant can be leveraged to derive robustness guarantees, to improve generalization via regularization, or even to construct invertible networks. Accordingly, a number of methods, varying in the tightness of their bounds and in their computational cost, have been developed to approximate the Lipschitz constant for different classes of networks. Comparatively little research exists, however, on methods for exact computation, a problem that has been shown to be NP-hard. Nonetheless, there are applications where one might readily accept the computational cost of an exact method, such as benchmarking new approximation methods or computing robustness guarantees for small models trained on sensitive data. Unfortunately, existing exact algorithms are restricted to ReLU-activated networks, which are known to come with severe downsides in the context of Lipschitz-constrained networks. We therefore propose a generalization of the LipBaB algorithm that computes exact Lipschitz constants for arbitrary piecewise linear neural networks and arbitrary $p$-norms. With our method, networks may contain traditional activations such as ReLU or LeakyReLU; activations such as GroupSort or the related MinMax and FullSort, which have been of increasing interest in the context of Lipschitz-constrained networks; or even other piecewise linear functions such as MaxPool.
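For reference, the quantity in question is the global Lipschitz constant of the network $f$ with respect to a $p$-norm; the definition below is the standard one, and the symbols $f$ and $L_p(f)$ are notation introduced here rather than taken from the original text:
$$
L_p(f) \;=\; \sup_{x \neq y} \frac{\lVert f(x) - f(y) \rVert_p}{\lVert x - y \rVert_p}.
$$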
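To make the GroupSort family concrete, the following is a minimal illustrative sketch (a hypothetical helper, not the paper's implementation): the pre-activations are partitioned into contiguous groups and each group is sorted. With a group size of 2 this is MinMax; with the group size equal to the layer width it is FullSort. On each linear region the map is a fixed permutation of its input, so it is piecewise linear and 1-Lipschitz for every $p$-norm.

```python
import numpy as np

def group_sort(x, group_size=2):
    """Illustrative GroupSort activation (hypothetical helper).

    Partitions the pre-activations into contiguous groups of size
    `group_size` and sorts each group in ascending order. group_size=2
    yields MinMax; group_size=len(x) yields FullSort. Since the output
    is a permutation of the input on each region, the map is piecewise
    linear and norm-preserving, hence 1-Lipschitz in any p-norm.
    """
    x = np.asarray(x, dtype=float)
    assert x.size % group_size == 0, "input length must be divisible by group size"
    groups = x.reshape(-1, group_size)      # one row per group
    return np.sort(groups, axis=1).ravel()  # sort within each group, flatten back

# Example: MinMax (group_size=2) on a 4-dimensional pre-activation vector
print(group_sort([3.0, -1.0, 0.5, 2.0]))   # -> [-1.  3.  0.5 2.]
```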