Hyper-heuristics have become a popular approach for solving dynamic flexible job shop scheduling (DFJSS) problems. They use gradient-free optimization techniques such as Genetic Programming (GP) to evolve non-differentiable heuristics. However, conventional GP methods tend to converge slowly because they rely solely on evolutionary search to find good heuristics. Existing multitask GP methods can solve multiple tasks simultaneously and accelerate the search by transferring knowledge across similar tasks; however, they mostly exchange heuristic building blocks without truly generating heuristics conditioned on task information. In this paper, we aim to accelerate convergence and enable task-specific heuristic generation by incorporating a task-conditioned Transformer model. The Transformer works in two ways. First, it learns the distribution of elite heuristics, biasing the search toward promising regions of the heuristic space. Second, through conditional generation, it produces heuristics tailored to specific tasks, allowing the model to handle multiple scheduling tasks at once and improving overall optimization efficiency. Based on these ideas, we propose TransGP, a Task-Conditioned Transformer-Guided GP framework. This evolutionary paradigm integrates generative modeling with GP, enabling efficient multitask heuristic learning and knowledge transfer. We evaluate TransGP on a range of DFJSS scenarios. Experimental results show that TransGP consistently outperforms multitask GP baselines, widely used handcrafted heuristics, and a pure Transformer model, achieving faster convergence, superior solution quality, and enhanced robustness.
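To make the two mechanisms concrete, the following toy sketch (not the paper's actual Transformer; all names and primitives here are hypothetical) illustrates the core idea with a simple estimation-of-distribution stand-in: a per-task distribution over heuristic primitives is fitted to elite heuristics, and new heuristics are then sampled conditioned on a task identifier.

```python
import random

# Hypothetical terminal/function symbols for DFJSS priority heuristics,
# e.g. processing time (PT), waiting time (WT), slack, and operators.
PRIMITIVES = ["PT", "WT", "NPT", "SLACK", "+", "-", "*", "max"]

def random_heuristic(length=6):
    """Baseline GP-style random sampling of a heuristic (as a symbol list)."""
    return [random.choice(PRIMITIVES) for _ in range(length)]

def fit_task_model(elites_by_task):
    """Mechanism 1: learn, per task, the distribution of symbols that
    appear in elite heuristics (Laplace-smoothed frequency counts)."""
    model = {}
    for task, elites in elites_by_task.items():
        counts = {p: 1.0 for p in PRIMITIVES}  # smoothing: never zero out a symbol
        for heuristic in elites:
            for sym in heuristic:
                counts[sym] += 1.0
        total = sum(counts.values())
        model[task] = {p: c / total for p, c in counts.items()}
    return model

def sample_conditioned(model, task, length=6):
    """Mechanism 2: 'conditional generation' — sample a new heuristic
    under the distribution learned for the given task, biasing the
    search toward symbols that elites for that task favor."""
    probs = model[task]
    syms = list(probs)
    weights = [probs[s] for s in syms]
    return random.choices(syms, weights=weights, k=length)

# Sketch of one guided generation step for two hypothetical tasks:
elites = {
    "min_mean_flowtime": [["PT", "PT", "PT", "+", "WT", "PT"]],
    "min_max_tardiness": [["SLACK"] * 6],
}
model = fit_task_model(elites)
offspring = sample_conditioned(model, "min_mean_flowtime")
```

In the actual framework, the frequency model would be replaced by a task-conditioned Transformer that generates heuristic token sequences, and the sampled heuristics would be mixed with standard GP variation operators inside the evolutionary loop.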