We introduce a generalized Fourier ratio, the \(\ell^1/\ell^2\) norm ratio of coefficients in an \emph{arbitrary} orthonormal system, as a single, basis-invariant measure of \emph{effective dimension} that governs fundamental limits across signal recovery, localization, and learning. First, we prove that functions with small Fourier ratio can be stably recovered from random missing samples via \(\ell^1\) minimization, extending and clarifying compressed sensing guarantees for general bounded orthonormal systems. Second, we establish a sharp \emph{localization obstruction}: any attempt to localize recovery to subslices of a product space necessarily inflates the Fourier ratio by a factor scaling with the square root of the slice count, demonstrating that global complexity cannot be distributed locally. Finally, we show that the same parameter controls key complexity-theoretic measures: it provides explicit upper bounds on Kolmogorov rate-distortion description length and on the statistical query (SQ) dimension of the associated function class. These results unify analytic, algorithmic, and learning-theoretic constraints under a single complexity parameter, revealing the Fourier ratio as a fundamental invariant in information-theoretic signal processing.
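As a minimal illustration of the quantity in question, the sketch below computes the \(\ell^1/\ell^2\) coefficient ratio in the unitary discrete Fourier basis (any orthonormal transform could be substituted); the function name and the choice of basis are illustrative, not from the paper. For a signal whose coefficients are supported on \(k\) equal-magnitude entries the ratio equals \(\sqrt{k}\), so its square acts as an effective dimension.

```python
import numpy as np

def fourier_ratio(f):
    """l1/l2 ratio of the coefficients of f in an orthonormal system.

    Illustrative sketch: the system here is the unitary discrete
    Fourier basis; any bounded orthonormal system could be used.
    """
    c = np.fft.fft(f) / np.sqrt(len(f))  # unitary normalization
    return np.linalg.norm(c, 1) / np.linalg.norm(c, 2)
```

For example, a single complex exponential (1-sparse in this basis) has ratio 1, while a sum of two equal-amplitude tones has ratio \(\sqrt{2}\), consistent with the effective-dimension reading.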