This work considers large-data asymptotics for t-distributed stochastic neighbor embedding (tSNE), a widely used non-linear dimension reduction algorithm. We identify an appropriate continuum limit of the tSNE objective function, which can be viewed as a combination of a kernel-based repulsion and an asymptotically vanishing Laplacian-type regularizer. As a consequence, we show that embeddings of the original tSNE algorithm cannot have any consistent limit as $n \to \infty$. We propose a rescaled model which mitigates the asymptotic decay of the attractive energy, and which does have a consistent limit.
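For context, the finite-sample tSNE objective referred to above is, in standard notation (the symbols $p_{ij}$, $q_{ij}$, $y_i$ are ours and not taken from this abstract), the Kullback–Leibler divergence between high- and low-dimensional affinities:
$$
C(y_1, \dots, y_n) \;=\; \mathrm{KL}(P \,\|\, Q) \;=\; \sum_{i \neq j} p_{ij} \log \frac{p_{ij}}{q_{ij}},
\qquad
q_{ij} \;=\; \frac{\left(1 + \|y_i - y_j\|^2\right)^{-1}}{\sum_{k \neq l} \left(1 + \|y_k - y_l\|^2\right)^{-1}},
$$
where the $p_{ij}$ are Gaussian-kernel affinities computed from the input data and the $y_i \in \mathbb{R}^d$ are the low-dimensional embedding points. Expanding the logarithm splits $C$ into an attractive term $-\sum_{i \neq j} p_{ij} \log \left(1 + \|y_i - y_j\|^2\right)^{-1}$ and a repulsive normalization term, which is the decomposition whose continuum limit the abstract describes.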