We investigate deterministic identification over arbitrary memoryless channels under the constraint that the error probabilities of first and second kind are exponentially small in the block length $n$, controlled by reliability exponents $E_1,E_2 \geq 0$. In contrast to the regime of slowly vanishing errors, where the identifiable message length scales as $\Theta(n\log n)$, here we find that for positive exponents linear scaling is restored, now with a rate that is a function of the reliability exponents. We give upper and lower bounds on the ensuing rate-reliability function in terms of (the logarithm of) the packing and covering numbers of the channel output set, which for small error exponents $E_1,E_2>0$ can be expanded in leading order as the product of the Minkowski dimension of a certain parametrisation of the channel output set and $\log\min\{E_1,E_2\}$. These bounds allow us to recover the previously observed slightly superlinear identification rates, and offer a different perspective for understanding them in more traditional information-theoretic terms. We further illustrate our results with a discussion of the case of dimension zero, and extend them to classical-quantum channels and to quantum channels with a tensor product input restriction.