In offline multi-objective optimization (MOO), we leverage an offline dataset of designs and their associated labels to simultaneously minimize multiple objectives. This setting mirrors complex real-world problems more closely than single-objective optimization does. Recent works mainly employ evolutionary algorithms and Bayesian optimization, with limited attention paid to the generative modeling capabilities inherent in such data. In this study, we explore generative modeling for offline MOO through flow matching, noted for its effectiveness and efficiency. We introduce ParetoFlow, specifically designed to guide flow sampling to approximate the Pareto front. Traditional predictor (classifier) guidance is inadequate for this purpose because it models only a single objective. In response, we propose a multi-objective predictor guidance module that assigns each sample a weight vector, representing a weighted distribution across multiple objective predictions. A local filtering scheme is introduced to handle non-convex Pareto fronts. The weight vectors uniformly cover the entire objective space, effectively directing sample generation toward the Pareto front. Since distributions with similar weights tend to generate similar samples, we introduce a neighboring evolution module to foster knowledge sharing among neighboring distributions. This module generates offspring from these distributions and selects the most promising one for the next iteration. Our method achieves state-of-the-art performance across various tasks.
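The core idea of weighted multi-objective guidance can be illustrated with a minimal sketch. This is not the paper's implementation: the objective predictors `f1` and `f2`, the base velocity field, and all hyperparameters below are toy stand-ins, used only to show how a weight vector over objective gradients steers Euler-integrated flow samples toward different trade-offs.

```python
import numpy as np

def f1(x):
    """Toy objective predictor 1 (to be minimized)."""
    return np.sum((x - 1.0) ** 2)

def f2(x):
    """Toy objective predictor 2 (to be minimized)."""
    return np.sum((x + 1.0) ** 2)

def grad(f, x, eps=1e-5):
    """Central finite-difference gradient of a scalar predictor."""
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def base_velocity(x, t):
    """Toy stand-in for a learned flow-matching velocity field."""
    return -x

def guided_sample(x0, weights, steps=50, dt=0.02, guidance=0.5):
    """Euler integration of the flow, steered by the weighted
    negative gradients of the objective predictors."""
    x = x0.copy()
    for k in range(steps):
        t = k * dt
        # Weight vector mixes the per-objective gradients into one
        # guidance direction; different weights target different
        # regions of the (approximate) Pareto front.
        g = weights[0] * grad(f1, x) + weights[1] * grad(f2, x)
        x = x + dt * (base_velocity(x, t) - guidance * g)
    return x

# Two extreme weight vectors produce samples favoring each objective.
xa = guided_sample(np.zeros(2), weights=(1.0, 0.0))
xb = guided_sample(np.zeros(2), weights=(0.0, 1.0))
```

Sampling with a family of weight vectors spread uniformly over the simplex, as the abstract describes, would then yield a set of samples covering the trade-off surface rather than a single optimum.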