The use of machine learning to predict wave dynamics is a topic of growing interest, but commonly used deep learning approaches suffer from the limited interpretability of the trained models. Here we present an interpretable machine learning framework for analyzing the nonlinear evolution dynamics of optical wavepackets in complex wave media. We use sparse regression to reduce microscopic discrete lattice models to simpler effective continuum models that accurately describe the dynamics of the wavepacket envelope. We apply our approach to valley-Hall domain walls in honeycomb photonic lattices of laser-written waveguides with Kerr-type nonlinearity and different boundary shapes. The reconstructed equations accurately reproduce the linear dispersion and nonlinear effects, including self-steepening and self-focusing. We show that this scheme is free of the a priori limitations imposed by the hierarchy of scales traditionally employed in asymptotic analytical methods. It represents a powerful interpretable machine learning technique of interest for advancing design capabilities in photonics and for framing the complex interaction-driven dynamics in a variety of topological materials.
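The sparse-regression step described above can be illustrated with a minimal sketch of sequentially thresholded least squares (STLSQ), the workhorse of SINDy-style model discovery. This is a generic toy example, not the paper's actual pipeline: the candidate-term library, coefficients, and threshold below are invented for illustration, and a real application would build the library from spatial derivatives of the measured envelope field.

```python
import numpy as np

def stlsq(theta, dudt, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: fit du/dt onto a library
    of candidate terms Theta, repeatedly pruning small coefficients so
    that only a sparse effective model survives."""
    xi = np.linalg.lstsq(theta, dudt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold      # prune negligible terms
        xi[small] = 0.0
        big = ~small
        if big.any():                       # refit on the surviving terms
            xi[big] = np.linalg.lstsq(theta[:, big], dudt, rcond=None)[0]
    return xi

# Synthetic demo: the "true" dynamics uses only two of five candidate terms.
rng = np.random.default_rng(0)
u = rng.normal(size=200)
theta = np.column_stack([np.ones_like(u), u, u**2, u**3, np.sin(u)])  # library
xi_true = np.array([0.0, -1.5, 0.0, 0.8, 0.0])   # sparse ground truth
dudt = theta @ xi_true + 1e-3 * rng.normal(size=u.shape)

xi = stlsq(theta, dudt, threshold=0.2)
```

With noisy data generated from a two-term model, the thresholding loop zeroes out the three spurious library terms and the refit recovers the remaining coefficients, which is the same mechanism by which an effective continuum envelope equation is distilled from lattice simulation data.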