Jacobian-Enhanced Neural Networks (JENN) are densely connected multi-layer perceptrons whose training process is modified to predict partial derivatives accurately. Their main benefit is better accuracy with fewer training points compared to standard neural networks. These attributes are particularly desirable in the field of computer-aided design, where there is often a need to replace computationally expensive, physics-based models with fast-running approximations, known as surrogate models or meta-models. Since a surrogate emulates the original model accurately in near-real time, it yields a speed benefit that can be used to carry out orders of magnitude more function calls. However, in the special case of gradient-enhanced methods, there is the additional value proposition that the partial derivatives are also accurate, a critical property for one important use case: surrogate-based optimization. This work derives the complete theory and demonstrates its superiority over standard neural networks for surrogate-based optimization.
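To make concrete what "modified training" means here, the sketch below augments the usual least-squares loss on the responses with a weighted penalty on the error of the predicted Jacobian, computed by automatic differentiation. This is a minimal illustrative sketch under assumed names (mlp, loss, lam), written in JAX for brevity; it is not the paper's actual implementation.

```python
# Minimal, illustrative sketch of a gradient-enhanced (Jacobian-enhanced)
# training loss. Names (mlp, params, lam) are assumptions for illustration,
# not an API from the paper.
import jax
import jax.numpy as jnp

def mlp(params, x):
    """Densely connected multi-layer perceptron with tanh hidden layers."""
    for w, b in params[:-1]:
        x = jnp.tanh(w @ x + b)
    w, b = params[-1]
    return w @ x + b  # linear output layer

def loss(params, x_batch, y_batch, dydx_batch, lam=1.0):
    """Least-squares error on responses plus a weighted Jacobian term."""
    # Standard term: squared error on the predicted responses.
    y_pred = jax.vmap(lambda x: mlp(params, x))(x_batch)
    value_term = jnp.mean((y_pred - y_batch) ** 2)
    # Gradient-enhanced term: squared error on the network Jacobian
    # (partials of outputs w.r.t. inputs), obtained via autodiff.
    jac = jax.vmap(lambda x: jax.jacobian(mlp, argnums=1)(params, x))(x_batch)
    jac_term = jnp.mean((jac - dydx_batch) ** 2)
    return value_term + lam * jac_term
```

Minimizing this combined loss with any gradient-based optimizer drives the network to match both the sampled responses and their partials, which is what yields accurate derivatives, at the cost of requiring gradient-labeled training data.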