Microcanonical gradient descent is a sampling procedure for energy-based models that enables efficient sampling of high-dimensional distributions. It works by transporting samples from a high-entropy distribution, such as Gaussian white noise, to a low-energy region via gradient descent. We place this model in the framework of normalizing flows, showing that it can often overfit by losing an unnecessary amount of entropy during the descent. As a remedy, we propose a mean-field microcanonical gradient descent that samples several weakly coupled data points simultaneously, allowing for better control of the entropy loss while sacrificing little in likelihood fit. We study these models in the context of financial time series, illustrating the improvements on both synthetic and real data.
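The transport step described above can be sketched in a few lines: draw Gaussian white noise and descend the energy by plain gradient steps. This is a minimal illustration, not the paper's implementation; the quadratic energy `E(x) = ||x||^2 / 2` and the step size are hypothetical choices for demonstration only.

```python
import numpy as np

def microcanonical_gradient_descent(grad_E, x0, step=0.1, n_steps=100):
    """Transport samples from a high-entropy initialization toward a
    low-energy region by gradient descent on the energy E."""
    x = x0.copy()
    for _ in range(n_steps):
        x = x - step * grad_E(x)  # one descent step on the energy
    return x

# Toy quadratic energy E(x) = 0.5 * ||x||^2 (hypothetical, for illustration);
# its gradient is simply x.
grad_E = lambda x: x

rng = np.random.default_rng(0)
x0 = rng.standard_normal((8, 32))  # 8 samples of Gaussian white noise
x = microcanonical_gradient_descent(grad_E, x0)
```

After the descent, the samples sit in a much lower-energy region than the initial white noise, which is exactly the entropy-for-energy trade the abstract discusses.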