This paper addresses the estimation of signals with sublinear sparsity transmitted over the additive white Gaussian noise (AWGN) channel. This fundamental problem arises in designing denoisers used in message-passing algorithms for sublinear sparsity. The main results are direct and converse theorems in the sublinear sparsity limit, where the signal sparsity grows sublinearly in the signal dimension as the signal dimension tends to infinity. As a direct theorem, the maximum likelihood (ML) estimator is proved to achieve vanishing square error in the sublinear sparsity limit if the noise variance is below a threshold. As a converse theorem, under a mild condition, no estimator can achieve a square error smaller than the signal power if the noise variance is above another threshold. In particular, the two thresholds coincide when the non-zero signal elements have constant amplitude. These results imply the asymptotic optimality of an existing separable Bayesian estimator used in approximate message passing for sublinear sparsity.
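The setting described above can be illustrated with a minimal simulation sketch. For a $k$-sparse signal whose non-zero entries have a known constant amplitude, the ML estimator reduces to selecting the $k$ largest observations, which makes the two noise regimes easy to contrast numerically. All parameters below (dimension, sparsity exponent, noise levels) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ml_estimate(y, k):
    # For a k-sparse signal with constant amplitude 1 on its support,
    # maximizing the Gaussian likelihood over supports of size k is
    # equivalent to placing 1 at the k largest entries of y.
    x_hat = np.zeros_like(y)
    x_hat[np.argsort(y)[-k:]] = 1.0
    return x_hat

def trial(n, k, sigma):
    # k-sparse signal with unit-amplitude non-zero entries (assumption).
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = 1.0
    y = x + sigma * rng.standard_normal(n)  # AWGN channel
    x_hat = ml_estimate(y, k)
    # Square error normalized by the signal power k.
    return np.sum((x_hat - x) ** 2) / k

n = 100_000
k = int(round(n ** 0.5))           # sublinear sparsity: k = sqrt(n)
err_low = trial(n, k, sigma=0.2)   # low-noise regime
err_high = trial(n, k, sigma=3.0)  # high-noise regime
print(err_low, err_high)
```

In the low-noise regime the normalized error is small, while in the high-noise regime the selected support is essentially random and the error is on the order of the signal power, consistent with the direct/converse dichotomy stated in the abstract.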