We introduce MIO, a transformer-based model for inferring symbolic ordinary differential equations (ODEs) from multiple observed trajectories of a dynamical system. By combining multiple instance learning with transformer-based symbolic regression, the model effectively leverages repeated observations of the same system to learn more generalizable representations of the underlying dynamics. We investigate different instance aggregation strategies and show that even simple mean aggregation can substantially boost performance. MIO is evaluated on systems ranging from one to four dimensions and under varying noise levels, consistently outperforming existing baselines.
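To make the mean-aggregation idea concrete, the sketch below pools per-trajectory embeddings into a single system-level representation before symbolic decoding. This is a minimal illustration only; the function name, tensor shapes, and use of PyTorch are assumptions for exposition, not the paper's actual implementation.

```python
import torch

def aggregate_instances(instance_embeddings: torch.Tensor) -> torch.Tensor:
    # instance_embeddings: (num_trajectories, embed_dim), one embedding per
    # observed trajectory of the same system (shapes are illustrative).
    # Simple mean aggregation: average the per-trajectory embeddings into a
    # single system-level representation used to decode the symbolic ODE.
    return instance_embeddings.mean(dim=0)

# Example: three observed trajectories, each encoded to a 256-d vector
embeddings = torch.randn(3, 256)
system_repr = aggregate_instances(embeddings)
print(system_repr.shape)  # torch.Size([256])
```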