Popular word embedding methods such as GloVe and Word2Vec are related to the factorization of the pointwise mutual information (PMI) matrix. In this paper, we link correspondence analysis (CA) to the factorization of the PMI matrix. CA is a dimensionality reduction method based on the singular value decomposition (SVD), and we show that CA is mathematically close to a weighted factorization of the PMI matrix. In addition, we present variants of CA that turn out to be successful in the factorization of the word-context matrix, namely CA applied to matrices whose entries undergo a square-root transformation (ROOT-CA) or a root-root transformation (ROOTROOT-CA). While this study focuses on traditional static word embedding methods, to extend its contribution we also compare these methods with the transformer-based encoder BERT, i.e., contextual word embeddings. An empirical comparison of the CA- and PMI-based methods and BERT shows that the overall results of ROOT-CA and ROOTROOT-CA are slightly better than those of the PMI-based methods and are competitive with BERT.
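The CA pipeline sketched in the abstract (normalize the co-occurrence matrix, form standardized residuals, then apply SVD) can be illustrated as follows. This is a minimal sketch, not the authors' implementation; the function names and the choice of principal row coordinates as embeddings are illustrative assumptions.

```python
import numpy as np

def correspondence_analysis(F, k=2):
    """Sketch of correspondence analysis (CA) on a word-context
    co-occurrence matrix F, returning k-dimensional row coordinates.
    Illustrative only; not the paper's exact implementation."""
    P = F / F.sum()                      # correspondence matrix
    r = P.sum(axis=1)                    # row masses
    c = P.sum(axis=0)                    # column masses
    # standardized residuals: D_r^{-1/2} (P - r c^T) D_c^{-1/2}
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    # principal row coordinates serve as word embeddings
    return (U[:, :k] * sv[:k]) / np.sqrt(r)[:, None]

def root_ca(F, k=2):
    """ROOT-CA: CA applied to square-root-transformed counts."""
    return correspondence_analysis(np.sqrt(F), k)

def rootroot_ca(F, k=2):
    """ROOTROOT-CA: CA applied to root-root-transformed counts."""
    return correspondence_analysis(np.sqrt(np.sqrt(F)), k)
```

The square-root and root-root transformations dampen the influence of very frequent word-context pairs before the SVD, analogous in spirit to the log weighting implicit in PMI-based factorization.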