Sample selection models are a widely used approach for correcting the bias caused by data that are missing not at random. Their formulation requires specifying which variables influence the outcome and which drive the selection process. This specification is often based on expert knowledge, which can lead to the inclusion of irrelevant variables or the omission of important ones. Moreover, to avoid inferential problems such as practical non-identifiability, practitioners frequently impose exclusion restrictions, that is, model specifications in which certain variables predict selection but have no effect on the outcome of interest. A recent proposal employs the adaptive LASSO to select the variables that enter the outcome and selection equations, but its performance depends on the so-called covariance assumption, which can be violated in small to moderate samples. To address these challenges, we propose two families of spike-and-slab priors for Bayesian variable selection in sample selection models. These prior structures yield a Gibbs sampler with tractable full conditionals that scales to problem dimensions of practical interest. We illustrate the performance of the proposed methodology through a simulation study and compare it against the adaptive LASSO and stepwise selection. We also provide two applications using publicly available real data.
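To make the setting concrete, the sketch below writes out a generic Heckman-type sample selection model with point-mass spike-and-slab priors on the regression coefficients. All symbols and hyperparameters here (y_i, s_i, x_i, w_i, beta, alpha, z_j, v_k, tau, q) are illustrative assumptions introduced for exposition, not notation taken from the paper; the two prior families proposed in the paper may differ in the slab distribution and hyperprior choices.

```latex
% Generic sample selection (Heckman-type) model -- illustrative notation only.
% Selection equation: s_i indicates whether the outcome y_i is observed.
\begin{align*}
  s_i &= \mathbb{1}\{ w_i^{\top} \alpha + u_i > 0 \}, \\
  y_i &= x_i^{\top} \beta + \varepsilon_i, \quad \text{observed only if } s_i = 1, \\
  (\varepsilon_i, u_i)^{\top} &\sim \mathcal{N}_2\!\left( \mathbf{0},
    \begin{pmatrix} \sigma^2 & \rho\sigma \\ \rho\sigma & 1 \end{pmatrix} \right).
\end{align*}

% Point-mass spike-and-slab priors: each coefficient is either exactly zero
% (spike) or drawn from a Gaussian slab, with Bernoulli inclusion indicators.
\begin{align*}
  \beta_j \mid z_j &\sim (1 - z_j)\,\delta_0 + z_j\,\mathcal{N}(0, \tau_\beta^2),
    & z_j &\sim \mathrm{Bernoulli}(q_\beta), \\
  \alpha_k \mid v_k &\sim (1 - v_k)\,\delta_0 + v_k\,\mathcal{N}(0, \tau_\alpha^2),
    & v_k &\sim \mathrm{Bernoulli}(q_\alpha).
\end{align*}
```

In a formulation of this kind, the conjugacy of the Gaussian slab with a latent-variable augmented likelihood is what typically makes the full conditionals tractable for Gibbs sampling, and a covariate with z_j = 0 in the outcome equation but v_j = 1 in the selection equation corresponds to a data-driven exclusion restriction.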