We develop a weak adversarial approach to solving obstacle problems using neural networks. By employing (generalised) regularised gap functions and their properties, we rewrite the obstacle problem (which is an elliptic variational inequality) as a minmax problem, providing a natural formulation amenable to learning. In contrast to much of the literature, our approach does not require the elliptic operator to be symmetric. We provide an error analysis for suitable discretisations of the continuous problem, estimating in particular the approximation and statistical errors. Parametrising the solution and test function as neural networks, we apply a modified gradient descent-ascent algorithm to solve the resulting minmax problem and conclude the paper with various examples and experiments. Our algorithm is in particular able to easily handle obstacle problems featuring biactivity (or lack of strict complementarity), a situation that poses difficulty for traditional numerical methods.
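For orientation, a schematic version of this reformulation can be sketched as follows; the notation (elliptic operator $A$, source $f$, obstacle $\psi$, regularisation parameter $\rho > 0$) is illustrative and not fixed by the abstract, and the classical regularised gap function shown here stands in for the paper's more general ones. The obstacle problem seeks $u \in K := \{ v \in H^1_0(\Omega) : v \ge \psi \text{ a.e.} \}$ such that
\[
\langle Au - f,\, v - u \rangle \ge 0 \quad \text{for all } v \in K.
\]
The associated regularised gap function
\[
g_\rho(u) := \max_{v \in K} \left( \langle Au - f,\, u - v \rangle - \frac{\rho}{2} \lVert u - v \rVert^2 \right)
\]
satisfies $g_\rho \ge 0$ on $K$, and $g_\rho(u) = 0$ for $u \in K$ if and only if $u$ solves the variational inequality, so that solving the obstacle problem is equivalent to the minmax problem
\[
\min_{u \in K} \max_{v \in K} \; \langle Au - f,\, u - v \rangle - \frac{\rho}{2} \lVert u - v \rVert^2 .
\]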
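To illustrate how a descent-ascent scheme on this objective might look, here is a minimal sketch; it is not the authors' implementation, and every concrete choice (the model problem $-\Delta u$ with zero Dirichlet data on the unit square, the constant obstacle, the source term, network sizes, penalty-free feasibility via a pointwise maximum, and all hyperparameters) is an assumption made for the example.

```python
# Minimal gradient descent-ascent sketch for the regularised-gap minmax
# reformulation of an obstacle problem (illustrative, not the paper's code):
#   A = -Laplacian on Omega = (0,1)^2, u = 0 on the boundary, u >= psi.
# The obstacle psi, source f, network sizes and step sizes are assumptions.
import torch

torch.manual_seed(0)
rho = 1.0

def mlp():
    return torch.nn.Sequential(
        torch.nn.Linear(2, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 1))

u_net, v_net = mlp(), mlp()  # candidate solution and adversarial test function

def psi(x):  # illustrative constant obstacle
    return torch.full_like(x[:, :1], -0.5)

def f(x):    # illustrative source term
    return torch.ones_like(x[:, :1])

def feasible(net, x):
    # Zero Dirichlet data via the cutoff x1(1-x1)x2(1-x2); the pointwise
    # maximum with psi keeps the output inside the constraint set K.
    cutoff = x[:, :1] * (1 - x[:, :1]) * x[:, 1:] * (1 - x[:, 1:])
    return torch.maximum(psi(x), cutoff * net(x))

def grad(w, x):
    return torch.autograd.grad(w.sum(), x, create_graph=True)[0]

def gap_objective(n=1024):
    # Monte Carlo estimate of <Au - f, u - v> - (rho/2)||u - v||^2 in weak
    # form: integral of grad u . grad(u - v) - f (u - v) - (rho/2)(u - v)^2.
    x = torch.rand(n, 2, requires_grad=True)
    u, v = feasible(u_net, x), feasible(v_net, x)
    du, dv = grad(u, x), grad(v, x)
    integrand = (du * (du - dv)).sum(1, keepdim=True) \
        - f(x) * (u - v) - 0.5 * rho * (u - v) ** 2
    return integrand.mean()

opt_u = torch.optim.Adam(u_net.parameters(), lr=1e-3)
opt_v = torch.optim.Adam(v_net.parameters(), lr=1e-3)

for step in range(2000):
    # Ascent in the test function v, then descent in the candidate u.
    opt_v.zero_grad(); (-gap_objective()).backward(); opt_v.step()
    opt_u.zero_grad(); gap_objective().backward(); opt_u.step()
    if step % 500 == 0:
        print(step, float(gap_objective()))
```

The alternating single-step update shown here is the plainest form of gradient descent-ascent; the modified algorithm referred to in the abstract may differ in its update rule and stabilisation.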