Learning discrete neural samplers is challenging due to the lack of gradients and combinatorial complexity. While stochastic optimal control (SOC) and Schrödinger bridge (SB) provide principled solutions, efficient SOC solvers like adjoint matching (AM), which excel in continuous domains, remain unexplored for discrete spaces. We bridge this gap by revealing that the core mechanism of AM is $\mathit{state}\text{-}\mathit{space~agnostic}$, and introduce $\mathbf{discrete~ASBS}$, a unified framework that extends AM and adjoint Schrödinger bridge sampler (ASBS) to discrete spaces. Theoretically, we analyze the optimality conditions of the discrete SB problem and its connection to SOC, identifying a necessary cyclic group structure on the state space to enable this extension. Empirically, discrete ASBS achieves competitive sample quality with significant advantages in training efficiency and scalability.