This paper investigates distributed joint source-channel coding (JSCC) for correlated image semantic transmission over wireless channels. In this setup, correlated images at different transmitters are separately encoded and transmitted through dedicated channels for joint recovery at the receiver. We propose a novel distributed nonlinear transform source-channel coding (D-NTSCC) framework. Unlike existing learning-based approaches that implicitly learn source correlation in a purely data-driven manner, our method explicitly models the source correlation through the joint distribution. Specifically, the correlated images are separately encoded into latent representations via an encoding transform function, followed by a JSCC encoder that produces the channel input symbols. A learned joint entropy model is introduced to determine the transmission rates; it more accurately approximates the joint distribution of the latent representations and captures source dependencies, thereby improving rate-distortion performance. At the receiver, a JSCC decoder and a decoding transform function reconstruct the images from the received signals, with each image serving as side information for recovering the other. Within this process, a transformation module is designed to align the latent representations for maximal correlation learning. Furthermore, a loss function is derived to jointly optimize the encoding, decoding, and joint entropy model, ensuring that the learned joint entropy model approximates the true joint distribution. Experiments on multi-view datasets show that D-NTSCC outperforms state-of-the-art distributed schemes, demonstrating its effectiveness in exploiting source correlation.
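The core benefit the abstract claims, that a receiver which jointly decodes both channel outputs can exploit source correlation as side information, can be illustrated with a minimal toy sketch. This is not the authors' D-NTSCC architecture (which uses learned nonlinear transforms and a learned joint entropy model); it is a linear stand-in under assumed settings: two correlated Gaussian sources, power-normalized transmission over AWGN channels, and linear MMSE decoders. All variable names and noise levels here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy correlated sources: x2 is a noisy copy of x1 (stand-in for two views).
n, d = 5000, 8
x1 = rng.normal(size=(n, d))
x2 = x1 + 0.3 * rng.normal(size=(n, d))

def encode(x):
    # Stand-in "JSCC encoder": power-normalize the channel input symbols.
    return x / np.sqrt(np.mean(x ** 2))

# Dedicated AWGN channels for each transmitter (assumed noise level).
sigma = 0.5
y1 = encode(x1) + sigma * rng.normal(size=(n, d))
y2 = encode(x2) + sigma * rng.normal(size=(n, d))

def fit_lmmse(Y, X):
    # Least-squares fit of a linear decoder X ~= Y @ W.
    W, *_ = np.linalg.lstsq(Y, X, rcond=None)
    return W

tr = n // 2  # train/test split
# Separate decoder sees only y1; joint decoder also sees y2 as side information.
W_sep = fit_lmmse(y1[:tr], x1[:tr])
W_joint = fit_lmmse(np.hstack([y1, y2])[:tr], x1[:tr])

mse_sep = np.mean((y1[tr:] @ W_sep - x1[tr:]) ** 2)
mse_joint = np.mean((np.hstack([y1, y2])[tr:] @ W_joint - x1[tr:]) ** 2)
print(f"separate decoding MSE: {mse_sep:.4f}, joint decoding MSE: {mse_joint:.4f}")
```

Because y2 carries information about x1 through the source correlation, the joint decoder achieves a strictly lower reconstruction error than the separate one; D-NTSCC pursues the same gain with learned nonlinear transforms and an explicit joint entropy model rather than linear estimators.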