Computational efficiency and robustness are essential in process modeling, optimization, and control for real-world engineering applications. While neural network-based approaches have gained significant attention in recent years, conventional neural networks often fail to address these two critical aspects simultaneously or even independently. Inspired by natural physical systems and established literature, input convex architectures are known to enhance computational efficiency in optimization tasks, whereas Lipschitz-constrained architectures improve robustness. However, combining these properties within a single model requires careful design, as an inappropriate method for enforcing one property can undermine the other. To overcome this, we introduce a novel network architecture, termed Input Convex Lipschitz Recurrent Neural Networks (ICL-RNNs). This architecture seamlessly integrates the benefits of convexity and Lipschitz continuity, enabling fast and robust neural network-based modeling and optimization. The ICL-RNN outperforms existing recurrent units in both computational efficiency and robustness. Additionally, it has been successfully applied to practical engineering scenarios, such as chemical process modeling and the modeling and control of Organic Rankine Cycle-based waste heat recovery systems. Source code is available at https://github.com/killingbear999/ICLRNN.
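To illustrate how the two properties can coexist in one recurrent cell, the following is a minimal NumPy sketch (not the authors' implementation; class and function names are hypothetical). It uses one standard recipe consistent with the abstract's description: nonnegative weights with a convex, nondecreasing activation (ReLU) make the hidden state convex in the inputs, while rescaling each weight matrix so its spectral norm is at most 1 bounds the cell's Lipschitz constant.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def project(W, lip=1.0):
    """Clip entries to be nonnegative (preserves input convexity under
    composition) and rescale so the spectral norm is at most `lip`
    (enforces a Lipschitz bound)."""
    W = np.maximum(W, 0.0)
    s = np.linalg.norm(W, 2)  # largest singular value
    if s > lip:
        W = W * (lip / s)
    return W

class ICLRNNCellSketch:
    """Hypothetical minimal cell: h_{t} = ReLU(W_h h_{t-1} + W_x x_t).
    Nonnegative W_h, W_x with ReLU keep each hidden unit a convex,
    nondecreasing function of the input sequence; the spectral-norm
    projection keeps the map globally Lipschitz."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = project(rng.standard_normal((n_hidden, n_in)))
        self.Wh = project(rng.standard_normal((n_hidden, n_hidden)))

    def forward(self, xs):
        h = np.zeros(self.Wh.shape[0])
        for x in xs:
            h = relu(self.Wh @ h + self.Wx @ x)
        return h
```

Convexity can be sanity-checked numerically: for any two input sequences, the hidden state at their midpoint should be elementwise no larger than the midpoint of their hidden states.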