Multi-domain recommendation and multi-task recommendation have demonstrated their effectiveness in leveraging common information across different domains and objectives for comprehensive user modeling. However, practical recommender systems usually face multiple domains and tasks simultaneously, which existing methods cannot address well. To this end, we introduce M3oE, an adaptive Multi-domain Multi-task Mixture-of-Experts recommendation framework. M3oE integrates multi-domain information, maps knowledge across domains and tasks, and optimizes multiple objectives. We leverage three mixture-of-experts modules to learn common, domain-aspect, and task-aspect user preferences, respectively, addressing the complex dependencies among multiple domains and tasks in a disentangled manner. Additionally, we design a two-level fusion mechanism for precise control over feature extraction and fusion across diverse domains and tasks. The framework's adaptability is further enhanced by an AutoML technique that allows dynamic structure optimization. To the best of our knowledge, M3oE is the first effort to solve multi-domain multi-task recommendation self-adaptively. Extensive experiments on two benchmark datasets against diverse baselines demonstrate M3oE's superior performance. The implementation code is available to ensure reproducibility.
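The three-module design described above can be sketched in a few lines of numpy. This is a minimal toy illustration, not the paper's actual implementation: the class names (`ExpertGroup`, `M3oESketch`), layer sizes, and the fixed softmax fusion weights are all assumptions made for clarity, and the learnable fusion/gating parameters that M3oE optimizes via AutoML are stood in for by static arrays.

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


class ExpertGroup:
    """One mixture-of-experts module: linear experts mixed by a softmax gate."""

    def __init__(self, in_dim, out_dim, n_experts, rng):
        self.W = rng.normal(0, 0.1, (n_experts, in_dim, out_dim))  # per-expert weights
        self.G = rng.normal(0, 0.1, (in_dim, n_experts))           # gate weights

    def __call__(self, x):
        gates = softmax(x @ self.G)                   # (batch, n_experts)
        outs = np.einsum("bi,eio->beo", x, self.W)    # (batch, n_experts, out_dim)
        return np.einsum("be,beo->bo", gates, outs)   # gated mixture of experts


class M3oESketch:
    """Toy M3oE-style forward pass: shared, domain-aspect, and task-aspect
    expert groups, fused by a second-level softmax weighting (the weights
    an AutoML search would tune are fixed logits here)."""

    def __init__(self, in_dim=8, hid=4, n_experts=3, seed=0):
        rng = np.random.default_rng(seed)
        self.shared = ExpertGroup(in_dim, hid, n_experts, rng)
        self.domain = ExpertGroup(in_dim, hid, n_experts, rng)
        self.task = ExpertGroup(in_dim, hid, n_experts, rng)
        self.fusion_logits = np.zeros(3)  # over {shared, domain, task}

    def forward(self, x):
        w = softmax(self.fusion_logits)   # fusion weights, sum to 1
        parts = [self.shared(x), self.domain(x), self.task(x)]
        return sum(wi * p for wi, p in zip(w, parts))
```

In this sketch the first fusion level is the per-module softmax gate over experts, and the second level is the softmax over the three modules' outputs, mirroring the two-level mechanism the abstract describes.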