We noisily observe solutions of an ordinary differential equation $\dot u = f(u)$ at given times, where the state $u$ takes values in a $d$-dimensional space. The model function $f$ is unknown and belongs to a H\"older-type smoothness class with parameter $\beta$. For the nonparametric problem of estimating $f$, we provide lower bounds on the estimation error in two complementary model specifications: the snake model, with few long observed solutions, and the stubble model, with many short ones. In some settings these lower bounds are minimax optimal. They depend on various model parameters; in the optimal asymptotic regime, both models share the same rate for the squared error, characterized by the exponent $-2\beta/(2(\beta+1)+d)$ in the total number of observations $n$. To derive these results, we establish a master theorem for lower bounds in general nonparametric regression problems, which makes the proofs more comparable and should be a useful tool beyond the present setting.