The study of parametric differential equations plays a crucial role in weather forecasting and epidemiological modeling. Because such phenomena exhibit inherent memory, or hereditary, effects, they are often better represented by fractional derivatives. This paper introduces a novel scientific machine learning approach for solving parametric time-fractional differential equations by combining traditional spectral methods with neural networks. Instead of relying on the automatic differentiation techniques commonly used in Physics-Informed Neural Networks (PINNs), we propose a more efficient global discretization method based on Legendre polynomials. This approach eliminates the need to re-solve the fractional differential equation separately for each parameter value. By applying the Legendre-Galerkin weak formulation to the differential equation, we construct a loss function for training the neural network. The trial solutions are represented as linear combinations of Legendre polynomials, with the coefficients learned by the neural network. The convergence of this method is established theoretically, and the results are validated through numerical experiments on several well-known differential equations.
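The representation of trial solutions described above can be sketched as follows. This is a minimal illustrative example, not the paper's actual architecture: a toy two-layer network maps an equation parameter mu to the coefficients of a Legendre expansion, and the trial solution is evaluated with NumPy's Legendre utilities. The network sizes, weights, and function names are all assumptions for illustration.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)
N = 8        # number of Legendre modes in the trial solution (illustrative)
hidden = 16  # hidden width of the toy network (illustrative)

# Toy two-layer network mapping parameter mu -> Legendre coefficients c_0..c_{N-1}.
W1 = rng.normal(size=(hidden, 1)); b1 = np.zeros(hidden)
W2 = rng.normal(size=(N, hidden)); b2 = np.zeros(N)

def coefficients(mu):
    # In the paper's setting these coefficients would be produced by a
    # trained network; here the weights are random, for illustration only.
    h = np.tanh(W1 @ np.array([mu]) + b1)
    return W2 @ h + b2

def trial_solution(t, mu):
    # Evaluate u(t; mu) = sum_k c_k(mu) * P_k(t) at points t in [-1, 1].
    return legendre.legval(t, coefficients(mu))

t = np.linspace(-1.0, 1.0, 5)
u = trial_solution(t, mu=0.5)
print(u.shape)  # (5,)
```

During training, such an expansion would be substituted into the Legendre-Galerkin weak form of the equation to define the residual loss; the sketch above only shows how a Legendre-basis trial solution is parameterized and evaluated.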