Variable selection remains a difficult problem, especially for generalized linear mixed models (GLMMs). While some frequentist approaches exist for jointly selecting fixed and random effects, primarily through penalization, existing Bayesian approaches for GLMMs cover only special cases, such as logistic regression. In this work, we extend the Stochastic Search Variable Selection (SSVS) approach for the joint selection of fixed and random effects, proposed in Yang et al. (2020) for linear mixed models, to Bayesian GLMMs. We show that, although computational challenges remain, SSVS is a feasible and effective approach for jointly selecting fixed and random effects. We demonstrate the effectiveness of the proposed methodology on both simulated and real data. Furthermore, we study the role hyperparameters play in model selection.