Hybrid neural ordinary differential equations (neural ODEs) integrate mechanistic models with neural ODEs, offering strong inductive bias together with flexibility, and are particularly advantageous in data-scarce healthcare settings. However, the excessive latent states and interactions inherited from mechanistic models can lead to training inefficiency and over-fitting, limiting the practical effectiveness of hybrid neural ODEs. In response, we propose a new hybrid pipeline for automatic state selection and structure optimization in mechanistic neural ODEs, combining domain-informed graph modifications with data-driven regularization to sparsify the model, improving predictive performance and stability while retaining mechanistic plausibility. Experiments on synthetic and real-world data show improved predictive performance and robustness at the desired sparsity, establishing an effective solution for hybrid model reduction in healthcare applications.
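The core idea of a hybrid neural ODE, and of sparsifying its learned component, can be sketched as follows. This is a minimal illustrative example, not the paper's architecture: the linear-decay mechanistic term, the small MLP correction, and the L1 penalty on the correction's output weights are all assumed placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mechanistic part: a simple known dynamic, dx/dt = -k * x (illustrative choice).
k = 0.5
def f_mech(x):
    return -k * x

# Neural correction: one-hidden-layer MLP; W1, W2 are hypothetical trained weights.
W1 = rng.normal(scale=0.1, size=(8, 2))
W2 = rng.normal(scale=0.1, size=(2, 8))
def f_nn(x):
    return W2 @ np.tanh(W1 @ x)

# Hybrid right-hand side: mechanistic dynamics plus a learned residual.
def hybrid_rhs(x):
    return f_mech(x) + f_nn(x)

# Forward-Euler integration of the hybrid ODE (a real system would use an
# adaptive solver with adjoint gradients).
def euler_integrate(x0, dt=0.01, steps=100):
    x = x0.copy()
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * hybrid_rhs(x)
        traj.append(x.copy())
    return np.stack(traj)

# Sparsity-inducing regularizer: an L1 penalty on the correction's output
# weights; rows of W2 driven to zero effectively prune a state's learned term.
def l1_penalty(lam=1e-2):
    return lam * np.abs(W2).sum()

traj = euler_integrate(np.array([1.0, -0.5]))
print(traj.shape)  # trajectory of 2 latent states over 101 time points
```

In training, the L1 term would be added to the prediction loss, so states whose learned dynamics contribute little are driven toward zero, which is the data-driven half of the sparsification the abstract describes.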