We study the self-normalized concentration of vector-valued stochastic processes. We focus on bounds for "sub-$\psi$" processes, a quite general class that encompasses a wide variety of standard tail conditions (including sub-exponential, sub-Gaussian, sub-gamma, and sub-Poisson, as well as several heavy-tailed settings without a moment generating function, such as symmetric or bounded second or third moments). Our results recover and generalize the influential bound of de la Peña et al. [20] (proved again in Abbasi-Yadkori et al. [2]) in the sub-Gaussian case. Further, we close a gap in the literature between determinant-based bounds and more recent bounds based on condition numbers. As applications, we prove a Bernstein inequality for random vectors satisfying a moment condition (a strictly weaker condition than boundedness), and we provide the first dimension-free self-normalized empirical Bernstein inequality. Our techniques are based on the variational (PAC-Bayes) approach to concentration.
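For concreteness, the sub-Gaussian self-normalized bound referenced above (Theorem 1 of Abbasi-Yadkori et al. [2], building on de la Peña et al. [20]) can be stated as follows; the notation here is a standard presentation of that result, not taken verbatim from this paper. Let $S_t = \sum_{s=1}^{t} \varepsilon_s x_s$, where each $\varepsilon_s$ is conditionally $\sigma$-sub-Gaussian and the $x_s \in \mathbb{R}^d$ are predictable, and let $V_t = V + \sum_{s=1}^{t} x_s x_s^\top$ for a fixed positive definite matrix $V$. Then, with probability at least $1-\delta$, simultaneously for all $t \geq 0$,
\[
\|S_t\|_{V_t^{-1}}^2 \;\leq\; 2\sigma^2 \log\!\left( \frac{\det(V_t)^{1/2} \det(V)^{-1/2}}{\delta} \right),
\]
where $\|x\|_{A}^2 = x^\top A x$. The $\log \det(V_t)$ term on the right is what makes this a "determinant-based" bound in the sense discussed above.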