We enhance coarsely quantized LDPC decoding by reusing computed check node messages from previous iterations. Typically, variable and check nodes update and replace old messages every iteration. We show that, under coarse quantization, discarding old messages entails a significant loss of mutual information. The loss is avoided with additional memory, improving performance by up to 0.23 dB. We optimize quantization with a modified information bottleneck algorithm that considers the statistics of old messages. A simple merge operation reduces memory requirements. Depending on channel conditions and code rate, memory assistance enables up to 32% better area efficiency for 2-bit decoding.
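The central claim, that coarser message representations lose mutual information, follows from the data-processing inequality: merging quantizer outputs can only reduce I(X; T). The sketch below illustrates this with a hypothetical joint distribution of a code bit X and a 4-level quantized message T; the distribution values and the choice of which levels to merge are illustrative assumptions, not the paper's actual densities.

```python
import numpy as np

def mutual_information(p_xt):
    """I(X;T) in bits for a joint distribution p(x, t)."""
    px = p_xt.sum(axis=1, keepdims=True)
    pt = p_xt.sum(axis=0, keepdims=True)
    mask = p_xt > 0
    return float(np.sum(p_xt[mask] * np.log2(p_xt[mask] / (px @ pt)[mask])))

# Hypothetical joint distribution: rows are x in {0, 1},
# columns are the 4 quantizer levels of the message T.
p_xt = np.array([[0.30, 0.12, 0.05, 0.03],
                 [0.03, 0.05, 0.12, 0.30]])

fine = mutual_information(p_xt)

# Merging the two middle levels models a coarser 3-level quantizer,
# analogous in spirit to the merge operation used to cut memory cost.
merged = np.stack([p_xt[:, 0],
                   p_xt[:, 1] + p_xt[:, 2],
                   p_xt[:, 3]], axis=1)
coarse = mutual_information(merged)

# Data-processing inequality: merging levels never gains information.
assert coarse <= fine
```

An information-bottleneck-style quantizer design would search over such merges (or, more generally, over mappings to a small alphabet) to keep I(X; T) as high as possible at a fixed number of levels.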