Pairwise distance-based costs are central to self-supervised and contrastive feature learning. Mixture Density Networks (MDNs) are a widely used approach to generative modeling and density approximation, using a neural network to produce multiple centers that define a Gaussian mixture. By combining MDNs with contrastive costs, this paper proposes approximating the data density with four types of kernelized matrix costs: the scalar cost, the vector-matrix cost, the matrix-matrix cost (the trace of a Schur complement), and the SVD cost (the nuclear norm), to learn the multiple centers required to define a mixture density.
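For context, a standard MDN with $K$ centers parameterizes the density as a Gaussian mixture of the following generic form (a conventional MDN formulation, not necessarily the exact parameterization used in this paper):
\[
p(x \mid z) \;=\; \sum_{k=1}^{K} \pi_k(z)\, \mathcal{N}\!\left(x \,\middle|\, \mu_k(z),\, \sigma_k^2(z)\, I\right),
\qquad \sum_{k=1}^{K} \pi_k(z) = 1,
\]
where the mixing weights $\pi_k$, the centers $\mu_k$, and the scales $\sigma_k$ are all outputs of a neural network; the kernelized matrix costs proposed here are used to learn these centers.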