Distributed computing enables scalable machine learning by spreading tasks across multiple nodes, but preserving privacy in such systems remains challenging. This paper introduces a novel private coded distributed computing model that incorporates privacy constraints to keep task assignments hidden. Leveraging placement delivery arrays (PDAs), we design an extended PDA framework that characterizes the achievable computation and communication loads under these privacy constraints. Through two constructed classes of extended PDAs, we explore the trade-off between computation and communication, showing that although privacy increases communication overhead, this overhead can be substantially reduced through optimized PDA-based coded strategies.
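To make the central combinatorial object concrete, the sketch below checks the standard PDA validity conditions (star count per column, and the crossing-star condition for repeated integers); this is the classical PDA definition, not the paper's extended privacy-aware variant, and the example array `P` is the well-known (3, 3, 1, 3) PDA used here purely as an illustration.

```python
# Minimal sketch: validity check for a classical (K, F, Z, S) placement
# delivery array (PDA). The paper's *extended* PDAs add privacy
# constraints that are not modeled here.

def is_pda(array, Z):
    """Check the classical PDA conditions on `array`, an F x K grid
    whose entries are '*' or positive integers.

    C1: '*' appears exactly Z times in every column.
    C3: if two entries hold the same integer, they lie in distinct
        rows and distinct columns, and both "crossing" entries are '*'.
    """
    F = len(array)
    K = len(array[0])
    # C1: star count per column
    for k in range(K):
        if sum(array[f][k] == '*' for f in range(F)) != Z:
            return False
    # C3: pairwise check of entries sharing the same integer
    cells = [(f, k) for f in range(F) for k in range(K) if array[f][k] != '*']
    for i in range(len(cells)):
        f1, k1 = cells[i]
        for j in range(i + 1, len(cells)):
            f2, k2 = cells[j]
            if array[f1][k1] == array[f2][k2]:
                if f1 == f2 or k1 == k2:
                    return False
                if array[f1][k2] != '*' or array[f2][k1] != '*':
                    return False
    return True

# Example: a (3, 3, 1, 3) PDA; each integer marks one coded
# transmission serving the two columns (workers) that share it.
P = [['*', 1, 2],
     [1, '*', 3],
     [2, 3, '*']]
print(is_pda(P, Z=1))  # → True
```

Each integer in the array corresponds to one multicast message, so the number of distinct integers S governs the communication load, while the star count Z governs the per-node computation load; the trade-off studied in the paper is between these two parameters.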