Around mean dimensions and rate-distortion functions, this paper uses tools from local entropy theory to establish the following main results: $(1)$ We prove that for non-ergodic measures associated with almost sure processes, the mean R\'enyi information dimension coincides with the information dimension rate. This answers a question posed by Gutman and \'Spiewak (Around the variational principle for metric mean dimension, \emph{Studia Math.} \textbf{261} (2021), 345--360). $(2)$ We introduce four types of rate-distortion entropies and establish their relations with the Kolmogorov-Sinai entropy. $(3)$ We show that for systems with the marker property, if the mean dimension is finite, then the supremum in Lindenstrauss-Tsukamoto's double variational principle can be taken over the set of ergodic measures. Additionally, the double variational principle holds for various other measure-theoretic $\epsilon$-entropies.