This paper considers the problem of soft guessing under a logarithmic loss distortion measure while allowing errors. We characterize an optimal guessing strategy and derive single-shot upper and lower bounds on the minimal guessing moments, as well as an asymptotic expansion for i.i.d. sources. These results are extended to the case where side information is available to the guesser. Furthermore, a connection between soft guessing allowing errors and variable-length lossy source coding under logarithmic loss is demonstrated. The R\'enyi entropy, the smooth R\'enyi entropy, and their conditional versions play an important role throughout.