Inspired by the connection between classical regret measures employed in universal prediction and R\'{e}nyi divergence, we introduce a new class of universal predictors that depend on a real parameter $\alpha\geq 1$. This class interpolates between two well-known families of predictors: the mixture estimators, which include the Laplace and Krichevsky-Trofimov predictors, and the Normalized Maximum Likelihood (NML) estimator. We point out some advantages of this new class of predictors and study its benefits from two complementary viewpoints: (1) we prove its optimality when the maximal R\'{e}nyi divergence is considered as a regret measure, which can be interpreted operationally as a middle ground between the standard average and worst-case regret measures; (2) we discuss how it can be employed when NML is not a viable option, as an alternative to other predictors such as Luckiness NML. Finally, we apply the $\alpha$-NML predictor to the class of discrete memoryless sources (DMS), where we derive simple formulas to compute the predictor and analyze its asymptotic performance in terms of worst-case regret.
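The interpolation described above can be sketched numerically for the simplest DMS, a Bernoulli source. The snippet below is an illustrative sketch, not the paper's own code: it assumes a uniform prior over the Bernoulli parameter $\theta$, approximates the integral by a midpoint grid, and scores each binary sequence by an $\alpha$-power mean of its likelihoods (all function names are ours). With $\alpha = 1$ this reduces to the uniform-prior mixture (Laplace) predictor, and as $\alpha$ grows the power mean approaches the supremum of the likelihood, i.e., the NML score.

```python
import itertools


def seq_prob(theta, k, n):
    # Likelihood of a Bernoulli(theta) sequence with k ones out of n symbols.
    return (theta ** k) * ((1.0 - theta) ** (n - k))


def alpha_nml(n, alpha, grid=2001):
    # Hypothetical numerical sketch of an alpha-NML-style predictor:
    # score(x^n) = ( integral of p_theta(x^n)^alpha dtheta )^(1/alpha),
    # with a uniform prior on theta, approximated by a midpoint rule.
    thetas = [(i + 0.5) / grid for i in range(grid)]
    scores = {}
    for seq in itertools.product([0, 1], repeat=n):
        k = sum(seq)
        avg = sum(seq_prob(t, k, n) ** alpha for t in thetas) / grid
        scores[seq] = avg ** (1.0 / alpha)
    # Normalize the scores into a probability assignment over all 2^n sequences.
    z = sum(scores.values())
    return {s: v / z for s, v in scores.items()}
```

As a sanity check, `alpha_nml(2, 1.0)` assigns probability $\approx 1/3$ to the all-zeros sequence, matching the exact uniform-prior mixture $\int_0^1 (1-\theta)^2\,d\theta = 1/3$ before normalization, while for large $\alpha$ the assignment approaches the NML distribution, which for $n=2$ gives $1/2.5 = 0.4$ to each constant sequence.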