Standard techniques for differentially private estimation, such as Laplace or Gaussian noise addition, require guaranteed bounds on the sensitivity of the estimator in question. But such sensitivity bounds are often large or simply unknown. Thus we seek differentially private methods that can be applied to arbitrary black-box functions. A handful of such techniques exist, but all are either inefficient in their use of data or require evaluating the function on exponentially many inputs. In this work we present a scheme that trades off between statistical efficiency (i.e., how much data is needed) and oracle efficiency (i.e., the number of evaluations). We also present lower bounds showing the near-optimality of our scheme.
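The noise-addition baseline mentioned above can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes a known global sensitivity bound and uses the standard Laplace mechanism, with a hypothetical dataset for the usage example.

```python
import math
import random

def laplace_mechanism(value, sensitivity, epsilon):
    """Release `value` with Laplace noise of scale sensitivity/epsilon.

    This gives epsilon-differential privacy when `sensitivity` upper-bounds
    how much `value` can change between neighboring datasets -- exactly the
    guaranteed bound that is often large or unknown for black-box functions.
    """
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    # Inverse-CDF sample from the Laplace(0, scale) distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return value + noise

# Example: the mean of n values in [0, 1] has global sensitivity 1/n,
# so a tight bound is available here (data is hypothetical).
data = [0.2, 0.4, 0.6, 0.8]
private_mean = laplace_mechanism(sum(data) / len(data), 1 / len(data), epsilon=1.0)
```

When no such sensitivity bound is known, the scale parameter cannot be set soundly, which is precisely the gap the black-box methods discussed in this work address.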