We study estimation and detection of high-order moment and cumulant tensors from $n$ i.i.d.\ observations of a $p$-dimensional random vector, with performance measured in tensor spectral norm. Under sub-Gaussianity, we show that the minimax rate for estimating the order-$d$ moment and cumulant tensors is $\sqrt{p/n}\wedge 1$. In contrast to covariance estimation, the sample moment tensor is generally not rate-optimal for $d\ge 3$, and we construct an estimator that attains the minimax rate up to logarithmic factors. On the computational side, we study testing whether the $d$-th order cumulant tensor vanishes after whitening. Using the low-degree polynomial framework, we provide evidence that detection is computationally hard when $n\ll p^{d/2}$. At the same time, we identify a regime in which an efficiently computable estimator achieves error smaller than the separation at which low-degree tests can reliably distinguish the null from the alternative. This reveals an unusual reverse detection--estimation gap: computationally efficient detection can be harder than computationally efficient estimation. The underlying reason is that the relevant loss, tensor spectral norm, is itself NP-hard to compute, creating a new form of computational--statistical gap.
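To fix notation, here is a minimal NumPy sketch of the sample order-$3$ moment tensor $\widehat{M}_3 = n^{-1}\sum_{i=1}^n X_i^{\otimes 3}$ from $n$ i.i.d. $p$-dimensional draws; as the abstract notes, this naive estimator is generally not rate-optimal for $d\ge 3$, so the snippet is purely illustrative (the Gaussian sampling and dimensions are hypothetical choices, not from the paper):

```python
import numpy as np

# Draw n i.i.d. sub-Gaussian (here: standard Gaussian) samples in R^p.
rng = np.random.default_rng(0)
n, p = 1000, 5
X = rng.standard_normal((n, p))

# Sample order-3 moment tensor: (1/n) * sum_i X_i (x) X_i (x) X_i,
# computed as a single einsum contraction over the sample index i.
M3 = np.einsum('ia,ib,ic->abc', X, X, X) / n  # shape (p, p, p)

print(M3.shape)  # (5, 5, 5)
```

The resulting tensor is symmetric under any permutation of its three modes, since each summand $X_i^{\otimes 3}$ is.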