We investigate the integration of Kolmogorov-Arnold Networks (KANs) into hard-constrained recurrent physics-informed architectures (HRPINN) to evaluate the fidelity of learned residual manifolds in oscillatory systems. Motivated by the Kolmogorov-Arnold representation theorem and preliminary gray-box results, we hypothesized that KANs would recover unknown terms more efficiently than MLPs. Through an initial sensitivity analysis over network configuration, parameter scale, and training paradigm, we found that while small KANs are competitive on univariate polynomial residuals (Duffing), they exhibit severe hyperparameter fragility, instability in deeper configurations, and consistent failure on multiplicative terms (Van der Pol), and are generally outperformed by standard MLPs. These empirical challenges highlight the limitations of the additive inductive bias in the original KAN formulation for representing state coupling, and offer preliminary empirical guidance for future hybrid modeling.
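The distinction between the two residual classes can be made concrete. The sketch below, assuming the standard parameterizations of the Duffing and Van der Pol equations (the coefficient names `beta` and `mu` and the function names are illustrative, not from the study), shows why the Duffing residual is additive and univariate while the Van der Pol residual couples two state variables multiplicatively:

```python
import numpy as np

def duffing_residual(x, beta=1.0):
    # Cubic stiffness term of the Duffing oscillator: beta * x^3.
    # Univariate and additive, so a per-input spline (as in a KAN
    # edge function) can represent it directly.
    return beta * x**3

def van_der_pol_residual(x, v, mu=1.0):
    # Nonlinear damping term of the Van der Pol oscillator:
    # mu * (1 - x^2) * v. The product of a function of x with the
    # velocity v is not expressible as a single-layer additive sum
    # of univariate functions phi_1(x) + phi_2(v).
    return mu * (1.0 - x**2) * v

# Illustrative evaluations on sample states.
x = np.array([-1.0, 0.0, 2.0])
v = np.array([0.5, 1.0, -1.0])
print(duffing_residual(x))        # depends on x alone
print(van_der_pol_residual(x, v)) # depends jointly on x and v
```

Under this reading, the reported failure on Van der Pol is consistent with the additive form of the original KAN layer, which must rely on depth (composition of layers) rather than a single layer to approximate multiplicative state coupling.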