This paper presents DeepMTL2R, an open-source deep learning framework for Multi-task Learning to Rank (MTL2R), where multiple relevance criteria must be optimized simultaneously. DeepMTL2R integrates heterogeneous relevance signals into a unified, context-aware model by leveraging the self-attention mechanism of transformer architectures, enabling effective learning across diverse and potentially conflicting objectives. The framework includes 21 state-of-the-art multi-task learning algorithms and supports multi-objective optimization to identify Pareto-optimal ranking models. By capturing complex dependencies and long-range interactions among items and labels, DeepMTL2R provides a scalable and expressive solution for modern ranking systems and facilitates controlled comparisons across MTL strategies. We demonstrate its effectiveness on a publicly available dataset, report competitive performance, and visualize the resulting trade-offs among objectives. DeepMTL2R is available at \href{https://github.com/amazon-science/DeepMTL2R}{https://github.com/amazon-science/DeepMTL2R}.
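As a compact sketch of the multi-task learning-to-rank objective described above (notation is ours and hypothetical, not taken from the paper): with $T$ relevance criteria, shared model parameters $\theta$, and per-task ranking losses $\mathcal{L}_t$, a standard scalarized formulation is
\[
\min_{\theta} \; \sum_{t=1}^{T} w_t \, \mathcal{L}_t(\theta),
\qquad w_t \ge 0, \quad \sum_{t=1}^{T} w_t = 1,
\]
and a solution $\theta^{*}$ is Pareto-optimal when no $\theta$ exists with $\mathcal{L}_t(\theta) \le \mathcal{L}_t(\theta^{*})$ for all $t$ and strict inequality for at least one $t$. Varying the weights $w_t$ (or using the framework's multi-objective optimizers) traces out the trade-off frontier among the ranking objectives.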