Error-correcting codes~(ECCs) are indispensable for reliable transmission in communication systems. Recent advances in deep learning have catalyzed the exploration of ECC decoders based on neural networks. Among these, transformer-based neural decoders have achieved state-of-the-art decoding performance. In this paper, we propose a novel Cross-attention Message-Passing Transformer~(CrossMPT). CrossMPT iteratively updates two types of input vectors (i.e., magnitude and syndrome vectors) using two masked cross-attention blocks. The mask matrices in these cross-attention blocks are determined by the code's parity-check matrix, which delineates the relationship between the magnitude and syndrome vectors. Our experimental results show that CrossMPT significantly outperforms existing neural network-based decoders, particularly in decoding low-density parity-check codes. Notably, CrossMPT also achieves a significant reduction in computational complexity: its attention layers require over 50\% less computation than those of the original transformer-based decoder, while the complexity of the remaining layers is unchanged.
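The mask construction described above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it assumes the convention that syndrome position $i$ may attend to magnitude position $j$ iff $H_{ij} = 1$ (and vice versa via $H^\top$), using the small (7,4) Hamming code's parity-check matrix and random toy embeddings purely for shape illustration.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (illustrative only;
# CrossMPT itself targets longer codes such as LDPC codes).
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])
m, n = H.shape  # m syndrome bits, n magnitude (codeword) positions

# Assumed mask convention: entry (i, j) is attendable iff H[i, j] = 1.
# Syndrome-to-magnitude attention is masked by H; the reverse block uses H^T.
mask_s2m = np.where(H == 1, 0.0, -np.inf)  # shape (m, n)
mask_m2s = mask_s2m.T                      # shape (n, m)

def masked_cross_attention(Q, K, V, mask):
    """Single-head scaled dot-product cross-attention with an additive mask."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d) + mask
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy embeddings (random; for shape illustration only).
rng = np.random.default_rng(0)
d_model = 8
syndrome_emb = rng.normal(size=(m, d_model))
magnitude_emb = rng.normal(size=(n, d_model))

# One masked cross-attention block: syndrome queries attend only to the
# magnitude positions that participate in their parity check.
out = masked_cross_attention(syndrome_emb, magnitude_emb, magnitude_emb, mask_s2m)
print(out.shape)  # (3, 8)
```

Because each row of $H$ for a code with small check degree contains few ones, most mask entries are $-\infty$, which is the source of the reduced attention-layer complexity relative to a dense self-attention over all positions.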