The Principle of Maximum Entropy is a rigorous technique for estimating an unknown distribution from partial information while simultaneously minimizing bias. An important requirement for applying the principle, however, is that the available information be provided error-free (Jaynes, 1982). We relax this requirement by using a memoryless communication channel as a framework from which to derive a new, more general principle. We show that the new principle provides an upper bound on the entropy of the unknown distribution, and that the amount of information lost through the use of a given communication channel cannot be determined unless the entropy of the unknown distribution is also known. Using the new principle, we give a new interpretation of the classic principle and experimentally evaluate its performance relative to the classic principle and other generally applicable solutions.
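As a minimal sketch of the classic principle (not the paper's generalization), consider estimating a distribution over a finite support given only its mean. The maximum-entropy solution has exponential form p_i ∝ exp(λ·x_i), and the Lagrange multiplier λ can be found by bisection; the support, target mean, and function names below are illustrative assumptions.

```python
import math

def max_entropy_dist(support, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution on `support` subject to E[X] = target_mean.
    The solution is exponential-family: p_i proportional to exp(lam * x_i);
    the mean is increasing in lam, so we locate lam by bisection."""
    def mean_for(lam):
        weights = [math.exp(lam * x) for x in support]
        z = sum(weights)
        return sum(x * w for x, w in zip(support, weights)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    weights = [math.exp(lam * x) for x in support]
    z = sum(weights)
    return [w / z for w in weights]

# Illustrative example: a six-sided die constrained to have mean 4.5.
support = [1, 2, 3, 4, 5, 6]
p = max_entropy_dist(support, 4.5)
entropy = -sum(pi * math.log(pi) for pi in p)  # entropy of the estimate, in nats
```

With no constraint beyond the mean, the estimate spreads probability as evenly as the constraint allows; when the target mean is the uniform mean (3.5), the method recovers the uniform distribution, the global entropy maximizer.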