We consider the fundamental problem of estimating a discrete distribution on a domain of size~$K$ with high probability in Kullback-Leibler divergence. We provide upper and lower bounds on the minimax estimation rate, which show that, at error probability $\delta$ and sample size $n$, the optimal rate lies between $\big(K + \ln(K)\ln(1/\delta)\big)/n$ and $\big(K\ln\ln(K) + \ln(K)\ln(1/\delta)\big)/n$; this pins down the rate up to the doubly logarithmic factor $\ln \ln K$ that multiplies $K$. Our upper bound uses techniques from online learning to construct a novel estimator via online-to-batch conversion. Perhaps surprisingly, the tail behavior of the minimax rate is worse than for the squared total variation and squared Hellinger distances, for which it is $\big(K + \ln(1/\delta)\big)/n$, i.e.\ without the $\ln K$ multiplying $\ln(1/\delta)$. As a consequence, we cannot obtain a fully tight lower bound from the usual reduction to these smaller distances. Moreover, we show that our lower bound cannot be obtained by the standard approach based on a reduction to hypothesis testing; instead, we introduce a new reduction to what we call weak hypothesis testing. We investigate the source of the gap with other divergences further in refined results, which show that the total variation rate is achievable for Kullback-Leibler divergence after all (in fact by the maximum likelihood estimator) if we rule out outcome probabilities smaller than $O(\ln(K/\delta)/n)$, a restriction that excludes a vanishing set of probabilities as $n$ increases for fixed $K$ and~$\delta$. This explains why minimax Kullback-Leibler estimation is more difficult than asymptotic estimation.
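For concreteness, the two bounds stated above can be summarized as the following sandwich on the minimax rate; the notation $\mathcal{R}_{n,K}(\delta)$ for the minimax high-probability estimation error in Kullback-Leibler divergence, and the universal constants $c, C > 0$, are introduced here purely for illustration and are not part of the statements above:
\[
  c \cdot \frac{K + \ln(K)\ln(1/\delta)}{n}
  \;\le\; \mathcal{R}_{n,K}(\delta)
  \;\le\; C \cdot \frac{K\ln\ln(K) + \ln(K)\ln(1/\delta)}{n},
\]
whereas the corresponding rate for the squared total variation and squared Hellinger distances is of order $\big(K + \ln(1/\delta)\big)/n$, without the $\ln K$ factor on $\ln(1/\delta)$.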