This lecture note introduces information theory with particular emphasis on its relevance to algebraic coding theory. Building on Shannon's pioneering formulation of information, entropy, and channel capacity, it develops the mathematical foundations for quantifying uncertainty and information transmission. Examples, including the binary symmetric channel, illustrate key concepts such as entropy, conditional entropy, mutual information, and the noisy-channel model. The note also describes the principles of maximum-likelihood decoding and Shannon's noisy channel coding theorem, which characterizes the theoretical limits of reliable communication over noisy channels. Students and researchers seeking a bridge between the probabilistic framework of information theory and the structural, algebraic techniques of modern coding theory will find this work helpful.
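As a small numerical illustration of two quantities named above, the sketch below computes the binary entropy function and the resulting capacity of a binary symmetric channel. The function names and the choice of Python are this sketch's own assumptions, not notation taken from the note.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p) of a Bernoulli(p) source, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity C = 1 - H(p) of a binary symmetric channel
    with crossover probability p, in bits per channel use."""
    return 1.0 - binary_entropy(p)

print(binary_entropy(0.5))  # 1.0 bit: a fair coin is maximally uncertain
print(bsc_capacity(0.1))    # ≈ 0.531 bits per use when 10% of bits flip
```

Shannon's theorem then says that any rate below `bsc_capacity(p)` is achievable with arbitrarily small error probability, while rates above it are not.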