The smoothing parameter plays a crucial role in both lattice-based and code-based cryptography, primarily because it quantifies how much noise must be added to achieve a nearly uniform distribution. Recent work by Pathegama and Barg determined the optimal smoothing bound for random codes under the R\'enyi divergence of any order $\alpha \in (1, \infty)$ \cite{pathegama2024r}. Since fully random codes lack efficient encoding and decoding algorithms, our work introduces additional algebraic structure into the coding schemes. Specifically, this paper presents a new derivation of the smoothing bound for random linear codes under the R\'enyi divergence of the same order, achieving optimality for any $\alpha \in (1, \infty)$. We then extend this framework under the KL divergence, passing from random linear codes to random self-dual codes, and subsequently to random quasi-cyclic codes, incorporating progressively more structure. As an application, we derive an average-case to average-case reduction from the Learning Parity with Noise (LPN) problem to the average-case decoding problem. This reduction matches the parameter regime of \cite{debris2022worst}, but our proof employs the R\'enyi divergence and works directly with Bernoulli noise, rather than combining ball noise with Bernoulli noise.
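For reference, the R\'enyi divergence used throughout is the standard one; for distributions $P$ and $Q$ over a finite set $\mathcal{X}$ with $Q(x) > 0$ and order $\alpha \in (1, \infty)$, it is given by (the paper's precise conventions, e.g.\ logarithm base, may differ):

```latex
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1} \log \sum_{x \in \mathcal{X}} \frac{P(x)^{\alpha}}{Q(x)^{\alpha - 1}},
```

which recovers the KL divergence $D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}$ in the limit $\alpha \to 1$, consistent with the passage from R\'enyi-divergence bounds to KL-divergence bounds described above.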