Performance indicators are essential tools for assessing the convergence behavior of multi-objective optimization algorithms, particularly when the true Pareto front is unknown or difficult to approximate. Classical reference-based metrics such as hypervolume and inverted generational distance are widely used, but may suffer from scalability limitations and sensitivity to parameter choices in many-objective scenarios. Indicators derived from Karush--Kuhn--Tucker (KKT) optimality conditions provide an intrinsic alternative by quantifying stationarity without relying on external reference sets. This paper revisits an entropy-inspired KKT-based convergence indicator and proposes a robust adaptive reformulation based on quantile normalization. The proposed indicator preserves the stationarity-based interpretation of the original formulation while improving robustness to heterogeneous distributions of stationarity residuals, a recurring issue in many-objective optimization.
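To make the idea concrete, the following is a minimal illustrative sketch of a quantile-normalized, entropy-inspired indicator over per-solution stationarity residuals. This is a hypothetical reconstruction for intuition only, not the paper's actual formulation: the residual values are assumed given (e.g., norms of weighted KKT gradient combinations), and the quantile level `q`, the clipping, and the entropy aggregation are all assumptions.

```python
import numpy as np

def quantile_normalized_indicator(residuals, q=0.9, eps=1e-12):
    """Illustrative sketch (assumed formulation, not the paper's):
    normalize KKT stationarity residuals by their q-th quantile, then
    aggregate them into an entropy-inspired score.

    residuals : per-solution stationarity residuals (nonnegative floats)
    q         : quantile used as a robust scale (assumption)
    returns   : (mean normalized residual, normalized entropy), both in [0, 1]
    """
    r = np.asarray(residuals, dtype=float)
    # Robust scale from the q-th quantile: outliers above it cannot
    # dominate the normalization, unlike max-based scaling.
    scale = np.quantile(r, q) + eps
    s = np.clip(r / scale, 0.0, 1.0)
    # Entropy-inspired aggregation over the normalized residual mass,
    # scaled by log(n) so the value lies in [0, 1].
    p = (s + eps) / np.sum(s + eps)
    entropy = -np.sum(p * np.log(p)) / np.log(len(r))
    return float(np.mean(s)), float(entropy)
```

Because both the residuals and the scale grow by the same factor under rescaling of the objectives, the indicator is (up to `eps`) scale-invariant, which is one plausible reading of the robustness to heterogeneous residual distributions mentioned above.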