Pairwise distance-based costs are central to self-supervised and contrastive feature learning. Mixture Density Networks (MDNs) are a widely used approach to generative modeling and density approximation, in which a neural network produces multiple centers that define a Gaussian mixture. Combining MDNs with contrastive costs, this paper proposes data density approximation with four types of kernelized matrix costs for learning the multiple centers that define a mixture density: the scalar cost, the vector-matrix cost, the matrix-matrix cost (the trace of the Schur complement), and the SVD cost (the nuclear norm).
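As a rough illustration of how such kernelized matrix costs can be formed between a data batch and a set of MDN-produced centers, the sketch below computes four quantities matching the names above. It is not the paper's reference implementation: the Gaussian kernel, the bandwidth `sigma`, the regularization `eps`, the sign conventions, and the exact blocks entering each cost are assumptions made for illustration only.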
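```python
# Minimal sketch of four kernelized matrix costs between data X and centers C.
# All definitions here are illustrative assumptions, not the paper's formulas.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||A[i] - B[j]||^2 / (2 sigma^2))."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def matrix_costs(X, C, sigma=1.0, eps=1e-6):
    """Illustrative costs for data X (n x d) and centers C (m x d)."""
    K_xx = gaussian_kernel(X, X, sigma)   # data-data block
    K_xc = gaussian_kernel(X, C, sigma)   # data-center block
    K_cc = gaussian_kernel(C, C, sigma)   # center-center block
    I_m = np.eye(len(C))

    # Scalar cost: average kernel similarity between data and centers.
    scalar_cost = -K_xc.mean()

    # Vector-matrix cost: quadratic form of the mean data-center similarity
    # vector through the (regularized) center-center Gram matrix.
    v = K_xc.mean(axis=0)
    vector_matrix_cost = -v @ np.linalg.solve(K_cc + eps * I_m, v)

    # Matrix-matrix cost: trace of the Schur complement of K_cc in the
    # joint Gram matrix [[K_cc, K_cx], [K_xc, K_xx]].
    schur = K_xx - K_xc @ np.linalg.solve(K_cc + eps * I_m, K_xc.T)
    matrix_matrix_cost = np.trace(schur)

    # SVD cost: nuclear norm (sum of singular values) of the data-center block.
    svd_cost = -np.linalg.norm(K_xc, ord='nuc')

    return scalar_cost, vector_matrix_cost, matrix_matrix_cost, svd_cost

# Toy usage: random data and random centers standing in for MDN outputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(128, 2))
C = rng.normal(size=(8, 2))
print(matrix_costs(X, C))
```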