Semantic communication, notable for ensuring quality of service by jointly optimizing source and channel coding, effectively extracts data semantics, reduces transmission length, and mitigates channel noise. However, most studies overlook multi-user scenarios and resource availability, limiting real-world applicability. This paper addresses this gap by focusing on downlink communication from a base station to multiple users with heterogeneous computing capacities. Users employ variants of the Swin Transformer for source decoding and a lightweight architecture for channel decoding. We propose a novel training regimen that incorporates transfer learning and knowledge distillation to improve the performance of users with low computing capacity. Extensive simulations validate the proposed methods.
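As a point of reference for the distillation component mentioned above, the sketch below shows the standard temperature-scaled knowledge-distillation objective (in the style of Hinton et al.), in which a low-capacity student decoder is trained to match the softened output distribution of a high-capacity teacher. The function names, the temperature value, and the pure-Python formulation are illustrative assumptions, not the paper's exact loss.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields a
    # softer (more uniform) distribution over classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude as T grows.
    # (Illustrative default T=4.0; the paper may use a different setting.)
    p = softmax(teacher_logits, temperature)   # soft teacher targets
    q = softmax(student_logits, temperature)   # soft student predictions
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0
    )
```

In a multi-user setting like the one described, this term would typically be added to the student's task loss, letting small decoders inherit behavior from the larger Swin-based teacher without matching its compute cost.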