Graph-based multi-task learning at billion scale presents a significant challenge, as different tasks correspond to distinct billion-scale graphs. Traditional multi-task learning methods often neglect these graph structures, relying solely on individual user and item embeddings; discarding graph structure, however, forgoes substantial potential performance gains. In this paper, we introduce the Macro Graph of Experts (MGOE) framework, the first approach capable of leveraging macro graph embeddings to capture task-specific macro features while modeling the correlations between task-specific experts. Specifically, we propose the concept of a Macro Graph Bottom, which, for the first time, enables multi-task learning models to effectively incorporate graph information, and we design a Macro Prediction Tower that dynamically integrates macro knowledge across tasks. MGOE has been deployed at scale, powering multi-task learning for a leading billion-scale recommender system at Alibaba. Extensive offline experiments on three public benchmark datasets demonstrate its superiority over state-of-the-art multi-task learning methods, establishing MGOE as a breakthrough in graph-based multi-task recommendation. Online A/B tests further confirm the superiority of MGOE in billion-scale recommender systems.
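Since the abstract only names the two components, the following is a minimal, hypothetical PyTorch sketch of how a Macro Graph Bottom and a Macro Prediction Tower might be wired together: per-task macro graph embeddings are fused with a shared base embedding, task-specific experts transform each fused input, and a per-task softmax gate mixes all experts' outputs so each task head can draw on cross-task knowledge. All class names' internals, shapes, and the fusion/gating choices here are assumptions for illustration, not the paper's actual implementation.

import torch
import torch.nn as nn

class MacroGraphBottom(nn.Module):
    """Hypothetical bottom: fuses the shared user/item embedding with a
    per-task macro graph embedding (shapes and fusion are assumptions)."""
    def __init__(self, emb_dim: int, num_tasks: int):
        super().__init__()
        self.fuse = nn.ModuleList(
            [nn.Linear(2 * emb_dim, emb_dim) for _ in range(num_tasks)])

    def forward(self, base_emb, macro_embs):
        # base_emb: [B, D]; macro_embs: list of num_tasks tensors, each [B, D]
        return [f(torch.cat([base_emb, m], dim=-1))
                for f, m in zip(self.fuse, macro_embs)]

class MacroPredictionTower(nn.Module):
    """Hypothetical tower: one expert per task plus a per-task softmax
    gate that mixes all experts' outputs before the task head."""
    def __init__(self, emb_dim: int, num_tasks: int):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(emb_dim, emb_dim), nn.ReLU())
             for _ in range(num_tasks)])
        self.gates = nn.ModuleList(
            [nn.Linear(emb_dim, num_tasks) for _ in range(num_tasks)])
        self.heads = nn.ModuleList(
            [nn.Linear(emb_dim, 1) for _ in range(num_tasks)])

    def forward(self, task_inputs):
        # task_inputs: list of num_tasks tensors, each [B, D]
        expert_out = torch.stack(
            [e(x) for e, x in zip(self.experts, task_inputs)], dim=1)  # [B, T, D]
        logits = []
        for t in range(len(task_inputs)):
            w = torch.softmax(self.gates[t](task_inputs[t]), dim=-1)   # [B, T]
            mixed = (w.unsqueeze(-1) * expert_out).sum(dim=1)          # [B, D]
            logits.append(self.heads[t](mixed).squeeze(-1))            # [B]
        return logits  # one prediction logit per task

# Toy usage: 2 tasks, batch of 4, embedding dim 8.
bottom = MacroGraphBottom(emb_dim=8, num_tasks=2)
tower = MacroPredictionTower(emb_dim=8, num_tasks=2)
base = torch.randn(4, 8)
macros = [torch.randn(4, 8), torch.randn(4, 8)]  # stand-ins for macro graph embeddings
print([y.shape for y in tower(bottom(base, macros))])  # [torch.Size([4]), torch.Size([4])]

The gating step is what lets each task-specific head weight the other tasks' experts, which is one plausible reading of "modeling the correlations between task-specific experts" above.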