Existing distribution compression methods reduce dataset size by minimising the Maximum Mean Discrepancy (MMD) between original and compressed sets, but modern datasets are often large in both sample size and dimensionality. We propose Bilateral Distribution Compression (BDC), a two-stage framework that compresses along both axes while preserving the underlying distribution, with overall linear time and memory complexity in dataset size and dimension. Central to BDC is the Decoded MMD (DMMD), which quantifies the discrepancy between the original data and a compressed set decoded from a low-dimensional latent space. BDC proceeds by (i) learning a low-dimensional projection using the Reconstruction MMD (RMMD), and (ii) optimising a latent compressed set with the Encoded MMD (EMMD). We show that this procedure minimises the DMMD, guaranteeing that the compressed set faithfully represents the original distribution. Experiments show that across a variety of scenarios BDC can achieve comparable or superior performance to ambient-space compression at substantially lower cost.
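The MMD objectives at the heart of BDC (DMMD, RMMD, EMMD) all build on the same quantity: the squared MMD between two samples under a kernel. A generic sketch of the standard biased (V-statistic) estimator is below; the RBF kernel and its bandwidth are illustrative assumptions, not the paper's specific choices.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def mmd2(X, Y, gamma=1.0):
    # Biased estimate of squared MMD: E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))                  # "original" dataset
Y_same = rng.normal(size=(100, 2))             # compressed set from the same distribution
Y_shift = rng.normal(loc=3.0, size=(100, 2))   # compressed set from a shifted distribution
print(mmd2(X, Y_same))   # small: distributions match
print(mmd2(X, Y_shift))  # larger: distributions differ
```

A compressed set is obtained by treating its points as free parameters and minimising such an estimate by gradient descent; BDC's contribution is doing this jointly over samples and a learned low-dimensional latent space.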