Training and inference with large machine learning models that far exceed the memory capacity of individual devices necessitate distributed architectures, forcing one to contend with communication constraints. We present a framework for distributed computation over a quantum network in which data is encoded into specialized quantum states. We prove that for models within this framework, inference and training via gradient descent can be performed with exponentially less communication than their classical analogs require, and with modest overhead relative to standard gradient-based methods. We show that certain graph neural networks are particularly amenable to implementation within this framework, and present empirical evidence that they perform well on standard benchmarks. To our knowledge, this is the first example of exponential quantum advantage for a generic class of machine learning problems, one that holds regardless of the data encoding cost. Moreover, we show that models in this class can encode highly nonlinear features of their inputs, and that their expressivity increases exponentially with model depth. We also delineate the space of models for which exponential communication advantages hold by showing that they cannot hold for linear classification. Our results can be combined with the natural privacy advantages of the communicated quantum states, which limit the amount of information that can be extracted from them about the data and model parameters. Taken as a whole, these findings form a promising foundation for distributed machine learning over quantum networks.