Autoregressive learning of time-stepping operators offers an effective approach to data-driven PDE simulation on grids. For conservation laws, however, long-horizon rollouts are often destabilized when learned updates violate global conservation and, in many applications, additional state bounds such as nonnegativity of mass or densities and concentrations confined to [0,1]. Enforcing these coupled constraints via direct next-state regression remains difficult. We introduce FluxNet, a framework for learning conservative transport operators on regular grids, inspired by lattice Boltzmann-style discrete-velocity transport representations. Instead of predicting the next state, the model outputs local transport operators that update cells through neighborhood exchanges, guaranteeing discrete conservation by construction. For bounded quantities, we parameterize transport within a capacity-constrained feasible set, enforcing bounds structurally rather than by post-hoc clipping. We validate FluxNet on 1D convection-diffusion, 2D shallow water equations, 1D traffic flow, and 2D spinodal decomposition. Experiments on shallow water equations and traffic flow show improved rollout stability and physical consistency over strong baselines. On phase-field spinodal decomposition, the method enables large time steps with long-range transport, accelerating simulation while preserving microstructure evolution in both pointwise and statistical measures.
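The conservation-by-construction idea can be illustrated with a minimal sketch. The snippet below is not the paper's architecture; it stands in for the predicted local transport operators with a per-cell softmax over random logits, where each cell redistributes exactly its own content among {left neighbor, itself, right neighbor} on a periodic 1D grid. Because every exchange only moves existing mass between cells, the total is conserved to machine precision and nonnegativity holds structurally, with no clipping.

```python
import numpy as np

rng = np.random.default_rng(0)

# State: nonnegative cell densities on a periodic 1D grid.
u = rng.random(64)

# A learned model would predict, per cell, the fractions of its content
# sent left, kept in place, and sent right. Random logits stand in for
# network outputs here (hypothetical, for illustration only).
logits = rng.normal(size=(64, 3))
frac = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax: rows sum to 1

# Each cell redistributes exactly its own content, so no mass is
# created or destroyed and every contribution is nonnegative.
send_left, stay, send_right = (frac * u[:, None]).T

# Cell j receives what cell j+1 sends left and what cell j-1 sends right.
u_next = stay + np.roll(send_left, -1) + np.roll(send_right, 1)

assert np.isclose(u.sum(), u_next.sum())  # discrete conservation by construction
assert (u_next >= 0).all()                # nonnegativity enforced structurally
```

For quantities bounded in [0,1], the abstract's capacity-constrained feasible set would additionally have to limit inflows by each receiving cell's remaining capacity, which a plain per-cell softmax like the one above does not capture.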