We propose a simple methodology for approximating functions with prescribed asymptotic behavior using specifically constructed terms combined with an unconstrained deep neural network (DNN). The methodology extends to various asymptotic behaviors and to multiple dimensions, and it is easy to implement. In this work we demonstrate it for linear asymptotic behavior on one-dimensional examples. We apply it to function approximation and regression problems in which we measure the approximation of function values only (``Vanilla Machine Learning'', VML) or of both function and derivative values (``Differential Machine Learning'', DML). Across several examples, we observe that enforcing the given asymptotic behavior leads to better approximation and faster convergence.
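To illustrate the idea, here is a minimal sketch of one way such a construction could look for a linear asymptote: the model is the sum of a prescribed linear term and a network correction multiplied by a damping window that vanishes far from the origin, so the asymptotic behavior holds by construction. The specific damping window, the linear coefficients, and the stand-in for the network output are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def nn_stub(x):
    # Hypothetical placeholder for the output of an unconstrained DNN;
    # in practice this would be a trained network.
    return np.sin(3.0 * x)

def model(x, a=2.0, c=1.0):
    # Constructed term enforcing the prescribed linear asymptote a*x + c.
    asymptote = a * x + c
    # Damping window exp(-x^2) suppresses the network correction as
    # |x| grows, so the linear asymptotics are enforced by construction.
    return asymptote + np.exp(-x**2) * nn_stub(x)

# Far from the origin, the model matches the prescribed asymptote.
x_far = np.array([10.0, -10.0])
residual = model(x_far) - (2.0 * x_far + 1.0)
print(np.max(np.abs(residual)))
```

Near the origin the network correction is free to fit the data, while the damping window leaves the prescribed behavior untouched at infinity.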