Discrete random probability measures are central to Bayesian inference, particularly as priors for mixture modeling and clustering. A broad and unifying class is that of proper species sampling processes (SSPs), which encompasses many Bayesian nonparametric priors. We show that any proper SSP admits an exact two-stage finite-mixture representation built from a latent truncation index and a simple reweighting of the atoms. For each realized truncation index, the representation has finitely many atoms, and averaging over the induced law of that index recovers the original SSP setwise. This yields two main consequences: (i) an exact two-stage finite construction for arbitrary SSPs, without user-chosen truncation levels; and (ii) posterior inference in SSP mixture models via standard finite-mixture machinery, leading to tractable MCMC algorithms without ad hoc truncations. We explore these consequences by deriving explicit total-variation bounds on the approximation error when the truncation level is fixed, and by studying practical performance in mixture modeling, with emphasis on Dirichlet and geometric SSPs.
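To fix ideas, the following is a minimal illustrative sketch of the generic two-stage idea for one familiar SSP, the Dirichlet process: draw a random truncation level, keep finitely many stick-breaking atoms, and reweight them to sum to one. This is not the paper's exact construction; in particular, the geometric law used here for the truncation index and the renormalization step are illustrative assumptions, whereas the paper derives the index law that makes the representation exact.

```python
# Illustrative sketch, NOT the paper's exact two-stage representation.
# Stage 1: draw a latent truncation index N (geometric law assumed here
#          purely for illustration).
# Stage 2: keep the first N stick-breaking atoms of a Dirichlet process
#          and reweight them so the finite mixture weights sum to one.
import numpy as np

def truncated_dp_sample(alpha, n_draws, rng):
    # Stage 1: random truncation level (illustrative choice of law).
    N = rng.geometric(p=0.1) + 1
    # Stage 2: stick-breaking weights V_k ~ Beta(1, alpha), stopped at N.
    V = rng.beta(1.0, alpha, size=N)
    w = V * np.concatenate(([1.0], np.cumprod(1.0 - V)[:-1]))
    w /= w.sum()  # reweight the finitely many atoms to a proper mixture
    # Atoms drawn i.i.d. from a base measure (standard normal assumed).
    atoms = rng.normal(0.0, 1.0, size=N)
    # Sample from the resulting finite discrete random measure.
    idx = rng.choice(N, size=n_draws, p=w)
    return atoms[idx]

rng = np.random.default_rng(0)
samples = truncated_dp_sample(alpha=2.0, n_draws=5, rng=rng)
```

The key point the sketch conveys is that, conditionally on the realized index, everything is a standard finite mixture, which is what enables off-the-shelf finite-mixture MCMC in the full construction.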