We investigate the high-probability estimation of discrete distributions from an \iid sample under $\chi^2$-divergence loss. Although the minimax risk in expectation is well understood, its high-probability counterpart remains largely unexplored. We provide sharp upper and lower bounds for the classical Laplace estimator, showing that it is optimal among estimators that do not depend on the confidence level. We further characterize the minimax high-probability risk over all estimators and show that it is attained by a simple smoothing strategy. Our analysis highlights an intrinsic separation between asymptotic and non-asymptotic guarantees, with the latter incurring an unavoidable overhead. This work sharpens existing guarantees and advances the theoretical understanding of divergence-based estimation.
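For concreteness, here is a minimal sketch of the two objects the abstract refers to, written in standard notation that the abstract itself does not fix: $n$ denotes the sample size, $k$ the alphabet size, and $N_i$ the number of occurrences of symbol $i$ in the sample. The classical Laplace (add-one) estimator and the $\chi^2$-divergence loss then read
\[
\hat{p}_i \;=\; \frac{N_i + 1}{n + k}, \qquad
\chi^2\!\left(p \,\middle\|\, \hat{p}\right) \;=\; \sum_{i=1}^{k} \frac{(p_i - \hat{p}_i)^2}{\hat{p}_i}.
\]
The $+1$ pseudo-count bounds every $\hat{p}_i$ below by $1/(n+k)$, which keeps the $\chi^2$ loss, whose denominator is $\hat{p}_i$, finite with probability one.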