We study relationships between worst-case and random-noise properties of error-correcting codes. More concretely, we consider connections between minimum distance, list decoding radius, and block error probability on noisy channels. A recent result of Pernice, Sprumont, and Wootters established a tight connection between the list decoding radius and symmetric channel performance of linear codes. We extend this result to general codes. The proof proceeds by directly bounding the weight distribution rather than by sharp threshold techniques. We then turn to the relation between minimum distance and symmetric channel performance. The results just described imply that a $q$-ary code of relative distance $\delta$ has vanishing error probability on the symmetric channel up to the Johnson radius $J_q(\delta)$. We improve upon this bound for linear codes, for a range of $\delta$, when $q\ge 4$. In our proof we consider the \emph{erasure} properties of codes and bound their weight distribution through inequalities introduced by Samorodnitsky. The latter part of the proof yields a more general technique that bounds the symmetric channel performance of a linear code with constant relative distance and good erasure channel performance.
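For context, the Johnson radius $J_q(\delta)$ referenced above is the standard quantity (stated here for the reader's convenience; the abstract itself does not define it):
\[
J_q(\delta) \;=\; \left(1-\frac{1}{q}\right)\left(1-\sqrt{1-\frac{q\delta}{q-1}}\,\right).
\]
Every $q$-ary code of relative distance $\delta$ is combinatorially list decodable up to this radius, which is why it appears as the natural baseline for the symmetric channel bounds discussed here.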