\texttt{Mixture-Models} is an open-source Python library for fitting Gaussian Mixture Models (GMMs) and their variants, such as Parsimonious GMMs, Mixtures of Factor Analyzers, MClust models, and Mixtures of Student's $t$ distributions. It streamlines the implementation and analysis of these models using first- and second-order optimization routines, such as Gradient Descent and Newton-CG, built on automatic differentiation (AD) tools. This enables extending these models to high-dimensional data, a capability that is the first of its kind among Python libraries. The library also provides user-friendly model-evaluation tools, such as BIC, AIC, and log-likelihood estimation. The source code is released under the MIT license and can be accessed at \url{https://github.com/kasakh/Mixture-Models}. The package is highly extensible, allowing users to incorporate new distributions and optimization techniques with ease. We conduct a large-scale simulation study comparing the performance of various gradient-based approaches against Expectation Maximization across a wide range of settings, and identify the best-suited approach for each.
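To make the gradient-based alternative to Expectation Maximization concrete, the sketch below fits a two-component one-dimensional GMM by directly optimizing its log-likelihood under an unconstrained reparameterization (softmax weights, log-scale standard deviations), which is the general idea that AD tools automate for the richer models above. This is a minimal illustration using NumPy and SciPy with numerical gradients, not the \texttt{Mixture-Models} API; all names here are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data from a known two-component mixture (illustrative only).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 1.0, 300)])

def unpack(theta):
    # Unconstrained parameterization: softmax for weights, exp for stddevs,
    # so any real-valued theta yields valid mixture parameters.
    logits, mu, log_sigma = theta[:2], theta[2:4], theta[4:6]
    w = np.exp(logits - logits.max())
    return w / w.sum(), mu, np.exp(log_sigma)

def neg_log_lik(theta):
    w, mu, sigma = unpack(theta)
    # Per-point Gaussian density under each component, then the mixture sum.
    dens = w * np.exp(-0.5 * ((data[:, None] - mu) / sigma) ** 2) \
        / (sigma * np.sqrt(2.0 * np.pi))
    return -np.log(dens.sum(axis=1)).sum()

# A real AD-based implementation would supply exact gradients here;
# BFGS with finite differences stands in for that machinery.
theta0 = np.array([0.0, 0.0, -1.0, 1.0, 0.0, 0.0])
res = minimize(neg_log_lik, theta0, method="BFGS")
weights, means, sigmas = unpack(res.x)
```

The same reparameterization trick is what lets generic first- and second-order optimizers handle mixture constraints (positive variances, weights on the simplex) without any constrained-optimization machinery.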