Regression on function spaces is typically limited to models with Gaussian process priors. We introduce the notion of universal functional regression, in which we aim to learn a prior distribution over non-Gaussian function spaces that remains mathematically tractable for functional regression. To do this, we develop Neural Operator Flows (OpFlow), an infinite-dimensional extension of normalizing flows. OpFlow is an invertible operator that maps the (potentially unknown) data function space into a Gaussian process, allowing for exact likelihood estimation of functional point evaluations. OpFlow enables robust and accurate uncertainty quantification by drawing posterior samples from the Gaussian process and subsequently mapping them back into the data function space. We empirically study the performance of OpFlow on regression and generation tasks, using data generated from Gaussian processes with known posterior forms, data from non-Gaussian processes, and real-world earthquake seismograms, whose distribution has no known closed form.
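The exact-likelihood property described above rests on the standard change-of-variables formula used by normalizing flows: an invertible map sends data into a tractable base distribution, and the log-density of a data point is the base log-density of its image plus the log-determinant of the map's Jacobian. The sketch below illustrates this idea in the simplest finite-dimensional setting with a one-dimensional affine map; the map and its parameters (`a`, `b`) are hypothetical illustrations, not OpFlow's architecture, which operates on function spaces with a Gaussian process base.

```python
import numpy as np

# Change-of-variables illustration: an invertible map g sends samples x
# to a standard Gaussian z, so
#   log p_X(x) = log N(g(x); 0, 1) + log |g'(x)|.
# Here g is a toy affine map z = a*x + b; OpFlow's actual invertible
# operator acts on functions, with a Gaussian process as the base.

def g(x, a=2.0, b=-1.0):
    """Invertible affine map z = a*x + b (requires a != 0)."""
    return a * x + b

def log_likelihood(x, a=2.0, b=-1.0):
    """Exact log-density of x under the flow-induced distribution."""
    z = g(x, a, b)
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))  # standard normal log-pdf at z
    log_det = np.log(np.abs(a))                   # log |Jacobian| of the affine map
    return log_base + log_det

x = np.array([0.0, 0.5, 1.0])
print(log_likelihood(x))
```

Because the map is invertible, the same machinery runs in reverse: sample (or condition) in the Gaussian base, then apply the inverse map to obtain samples in data space, which is how the abstract's posterior sampling proceeds.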