Several quantum algorithms for linear algebra problems, and in particular quantum machine learning problems, have been "dequantized" in the past few years. These dequantization results typically hold when classical algorithms can access the data via length-squared sampling. In this work we investigate how robust these dequantization results are. We introduce the notion of approximate length-squared sampling, where classical algorithms are only able to sample from a distribution close to the ideal distribution in total variation distance. While quantum algorithms are natively robust against small perturbations, current techniques in dequantization are not. Our main technical contribution is showing how many techniques from randomized linear algebra can be adapted to work under this weaker assumption as well. We then use these techniques to show that the recent low-rank dequantization framework by Chia, Gilyén, Li, Lin, Tang and Wang (JACM 2022) and the dequantization framework for sparse matrices by Gharibian and Le Gall (STOC 2022), which are both based on the Quantum Singular Value Transformation, can be generalized to the case of approximate length-squared sampling access to the input. We also apply these results to obtain a robust dequantization of many quantum machine learning algorithms, including quantum algorithms for recommendation systems, supervised clustering and low-rank matrix inversion.
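To make the access model concrete: length-squared (ℓ²) sampling of a vector v means drawing index i with probability v[i]²/‖v‖². The following is a minimal illustrative sketch (the function name and interface are our own, not from the paper); passing a non-uniform random source stands in for the "approximate" setting, where the realized distribution may deviate from the ideal one in total variation distance.

```python
import random

def length_squared_sample(v, rng=random.random):
    """Sample index i with probability v[i]^2 / ||v||^2.

    `rng` must return a float in [0, 1); supplying a biased source
    models *approximate* length-squared sampling, where the output
    distribution is only close to the ideal one in total variation.
    """
    norm_sq = sum(x * x for x in v)
    threshold = rng() * norm_sq
    acc = 0.0
    for i, x in enumerate(v):
        acc += x * x
        if threshold <= acc:
            return i
    return len(v) - 1  # guard against floating-point round-off

# For v = [3, 4]: index 0 has probability 9/25, index 1 has 16/25.
```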