This paper derives a complete set of quadratic constraints (QCs) for the repeated ReLU. The complete set of QCs is described by a collection of $2^{n_v}$ matrix copositivity conditions, where $n_v$ is the dimension of the repeated ReLU. We also show that only two functions satisfy all QCs in our complete set: the repeated ReLU and a repeated "flipped" ReLU. Thus our complete set of QCs bounds the repeated ReLU as tightly as possible up to the sign invariance inherent in quadratic forms. We derive a similar complete set of incremental QCs for the repeated ReLU, which can potentially lead to less conservative Lipschitz bounds for ReLU networks than the standard LipSDP approach. Finally, we illustrate the use of the complete set of QCs to assess stability and performance for recurrent neural networks with ReLU activation functions. The stability/performance condition combines Lyapunov/dissipativity theory with the QCs for the repeated ReLU. A numerical implementation is given and demonstrated via a simple example.
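The sign invariance mentioned above can be checked numerically. The sketch below (our own illustration, with hypothetical function names not taken from the paper) verifies that each point $(v,\,\mathrm{ReLU}(v))$ on the graph of the repeated ReLU is the negation of a point on the graph of the flipped ReLU $\psi(u) = -\mathrm{ReLU}(-u)$, so every quadratic form, and hence every QC, takes the same value on both graphs:

```python
import numpy as np

def relu(v):
    # repeated (elementwise) ReLU
    return np.maximum(v, 0.0)

def flipped_relu(u):
    # "flipped" ReLU: psi(u) = -ReLU(-u), equivalently min(u, 0)
    return -relu(-u)

rng = np.random.default_rng(0)
n = 4
v = rng.standard_normal(n)
M = rng.standard_normal((2 * n, 2 * n))
M = M + M.T  # an arbitrary symmetric quadratic form

z1 = np.concatenate([v, relu(v)])            # point on the ReLU graph
z2 = np.concatenate([-v, flipped_relu(-v)])  # point on the flipped-ReLU graph

assert np.allclose(z2, -z1)                  # the graphs are negations of each other
assert np.isclose(z1 @ M @ z1, z2 @ M @ z2)  # quadratic forms cannot distinguish them
print("sign invariance verified")
```

Because $z \mapsto z^\top M z$ satisfies $(-z)^\top M (-z) = z^\top M z$, any QC satisfied by the repeated ReLU is automatically satisfied by the flipped ReLU, which is why no complete set of QCs can separate the two.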