In this paper, we present two new variants of the multi-view k-means (MVKM) algorithm for clustering multi-view data. The general idea is to measure the distance between the $h$-th view data points $x_i^h$ and the $h$-th view cluster centers $a_k^h$ differently from the standard centroid-based approach. Unlike other methods, our proposed methods learn the multi-view data by computing similarity with the Euclidean norm in a Gaussian-kernel space, yielding the multi-view k-means with exponent distance (MVKM-ED). By simultaneously tuning the stabilizer parameter $p$ and the kernel coefficients $\beta^h$, the Gaussian-kernel-based weighted distance in the Euclidean norm reduces the sensitivity of MVKM-ED; we refer to this extension as the Gaussian-kernel multi-view k-means (GKMVKM) clustering algorithm. Numerical evaluation on five real-world multi-view datasets demonstrates the robustness and efficiency of our proposed MVKM-ED and GKMVKM approaches.
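The abstract does not give the exact objective, but the phrase "Euclidean norm in the space of Gaussian-kernel" admits a standard reading: the squared Euclidean distance between two points after the implicit Gaussian-kernel feature map, $\|\phi(x)-\phi(a)\|^2 = 2\bigl(1-\exp(-\beta\|x-a\|^2)\bigr)$. A minimal sketch of that per-view distance, assuming this interpretation (the function name and the aggregation over views are illustrative, not the authors' exact formulation):

```python
import numpy as np

def kernel_distance(x, a, beta):
    """Squared Euclidean distance in the Gaussian-kernel feature space:
    ||phi(x) - phi(a)||^2 = K(x,x) + K(a,a) - 2K(x,a)
                          = 2 * (1 - exp(-beta * ||x - a||^2)),
    since K(x,x) = K(a,a) = 1 for the Gaussian (RBF) kernel."""
    return 2.0 * (1.0 - np.exp(-beta * np.linalg.norm(x - a) ** 2))

def multiview_distance(xs, cs, betas):
    """Illustrative aggregation: sum the kernel-space distances over views.
    xs, cs, betas are per-view data points, centers, and coefficients."""
    return sum(kernel_distance(x, a, b) for x, a, b in zip(xs, cs, betas))
```

Note that this distance is bounded in $[0, 2)$ for each view, which is one way the kernel mapping can dampen the influence of outlying points relative to the raw squared Euclidean distance.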