The Active Subspace (AS) method is a widely used technique for identifying the directions in a high-dimensional input space that most influence the output of a computational model. The standard AS algorithm requires a sufficient number of gradient evaluations (samples) of the input-output map to achieve a quasi-optimal reconstruction of the active subspace; this can incur a significant computational cost when the samples are affected by numerical discretization errors that must be kept sufficiently small. To address this issue, we propose a multilevel version of the Active Subspace method (MLAS) that utilizes samples computed with different accuracies and yields different active subspaces across the accuracy levels. MLAS can match the accuracy of single-level AS at a reduced computational cost, making it suitable for downstream tasks such as function approximation. In particular, we propose to perform the latter via optimally-weighted least-squares polynomial approximation in the different active subspaces, and we present an adaptive algorithm that dynamically chooses the dimensions of the active subspaces and polynomial spaces. We demonstrate the practical viability of the MLAS method with polynomial approximation through numerical experiments based on random partial differential equations (PDEs).
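For reference, the single-level AS construction that MLAS builds on is commonly based on the eigendecomposition of the gradient covariance matrix. The sketch below uses generic notation ($f$ for the input-output map, $\rho$ for the input density, $M$ for the number of gradient samples, $r$ for the active-subspace dimension), which may differ from the notation adopted later in the paper.
\[
  C \;=\; \int \nabla f(x)\,\nabla f(x)^{\top}\,\rho(x)\,\mathrm{d}x
  \;\approx\; \widehat{C} \;=\; \frac{1}{M}\sum_{i=1}^{M} \nabla f(x_i)\,\nabla f(x_i)^{\top},
  \qquad x_i \overset{\mathrm{iid}}{\sim} \rho ,
\]
\[
  \widehat{C} \;=\; W \Lambda W^{\top}, \qquad
  \Lambda = \mathrm{diag}(\lambda_1 \ge \cdots \ge \lambda_d \ge 0), \qquad
  W = [\,W_1 \; W_2\,],
\]
where the active subspace is spanned by the first $r$ eigenvectors collected in $W_1$, and $f$ is subsequently approximated by a ridge-type function of the reduced variable, $f(x) \approx g(W_1^{\top} x)$. In the multilevel setting, the gradient samples entering $\widehat{C}$ are computed at different discretization accuracies, so that each accuracy level carries its own estimated active subspace.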