Pairwise distance-based costs are crucial for self-supervised and contrastive feature learning. Mixture Density Networks (MDNs) are a widely used approach to generative modeling and density approximation, using neural networks to produce multiple centers that define a Gaussian mixture. Combining MDNs with contrastive costs, this paper proposes data density approximation using four types of kernelized matrix costs: the scalar cost, the vector-matrix cost, the matrix-matrix cost (the trace of a Schur complement), and the SVD cost (the nuclear norm), for learning the multiple centers required to define a mixture density.
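As a minimal sketch of the MDN idea described above (not the paper's implementation), the following shows a single linear head mapping an input to K mixture centers, which then define an equal-weight isotropic Gaussian mixture density; the layer sizes, weights, and the `mixture_density` helper are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: K mixture centers in d_out dimensions, input in d_in.
K, d_in, d_out = 4, 3, 2
# A single linear layer standing in for the network that produces the centers.
W = rng.normal(scale=0.1, size=(K * d_out, d_in))
b = rng.normal(scale=0.1, size=K * d_out)

def centers(x):
    """Network output reshaped into K mixture centers mu_1..mu_K."""
    return (W @ x + b).reshape(K, d_out)

def mixture_density(y, x, sigma=1.0):
    """Equal-weight isotropic Gaussian mixture density p(y | x)."""
    mu = centers(x)                      # (K, d_out) centers from the network
    sq = np.sum((y - mu) ** 2, axis=1)   # squared distance of y to each center
    norm = (2 * np.pi * sigma ** 2) ** (d_out / 2)
    return np.mean(np.exp(-sq / (2 * sigma ** 2))) / norm

x = rng.normal(size=d_in)
y = rng.normal(size=d_out)
p = mixture_density(y, x)
```

The pairwise distances between `y` and the predicted centers are exactly the quantities that the kernelized matrix costs in the paper operate on.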