Pairwise distance-based costs are crucial for self-supervised and contrastive feature learning. Mixture Density Networks (MDNs) are a widely used approach to generative modeling and density approximation, using a neural network to produce multiple centers that define a Gaussian mixture. By combining MDNs with contrastive costs, this paper proposes approximating the data density using four types of kernelized matrix costs in a Hilbert space: the scalar cost, the vector-matrix cost, the matrix-matrix cost (the trace of a Schur complement), and the SVD cost (the nuclear norm). These costs are used to learn the multiple centers required to define a mixture density.
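The abstract names four kernelized costs but gives no formulas. As a hedged illustration only, the sketch below shows one plausible NumPy reading of two of them, computed between data points `X` and candidate mixture centers `C` under a Gaussian (RBF) kernel: a matrix-matrix cost as the trace of the Schur complement of the center block in the joint kernel matrix, and an SVD cost as the nuclear norm of the cross-kernel matrix. The function names, the kernel choice, and the `eps` regularizer are my assumptions, not the paper's definitions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def schur_trace_cost(X, C, sigma=1.0, eps=1e-6):
    """One reading of the matrix-matrix cost: trace of the Schur complement
    of K_CC in the joint kernel matrix [[K_XX, K_XC], [K_CX, K_CC]].
    eps regularizes K_CC so the solve is well conditioned (an assumption)."""
    K_xx = gaussian_kernel(X, X, sigma)
    K_xc = gaussian_kernel(X, C, sigma)
    K_cc = gaussian_kernel(C, C, sigma) + eps * np.eye(len(C))
    schur = K_xx - K_xc @ np.linalg.solve(K_cc, K_xc.T)
    return np.trace(schur)

def svd_cost(X, C, sigma=1.0):
    """One reading of the SVD cost: nuclear norm (sum of singular values)
    of the data-to-center cross-kernel matrix."""
    return np.linalg.norm(gaussian_kernel(X, C, sigma), ord="nuc")
```

Intuitively, the Schur-complement trace measures how much of the data's kernel energy is left unexplained by the centers, so minimizing it over `C` pulls the centers toward regions of high data density; this interpretation is a sketch under the stated assumptions, not the paper's derivation.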