Mixture-of-Experts (MoE) has recently become the de facto standard in multi-domain recommendation (MDR) due to its powerful expressive ability. However, such MoE-based methods typically employ all experts for each instance, leading to scalability issues and low discriminability between domains and experts. Furthermore, the commonly used design of domain-specific networks exacerbates the scalability issues. To tackle these challenges, we propose CESAA, a novel method consisting of a Conditional Expert Selection (CES) module and an Adaptive Expert Aggregation (AEA) module. Specifically, CES combines a sparse gating strategy with domain-shared experts. AEA then applies a mutual information loss to strengthen the correlation between experts and specific domains, significantly improving the distinction between experts. As a result, only the domain-shared experts and the selected domain-specific experts are activated for each instance, striking a balance between computational efficiency and model performance. Experimental results on both public ranking and industrial retrieval datasets verify the effectiveness of our method on MDR tasks.
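The CES design described above (always-on shared experts plus a sparse gate over domain-specific experts) can be sketched as follows. This is an illustrative sketch only, not the paper's implementation; the class name `SparseSharedMoE` and the parameters `n_shared`, `n_specific`, and `k` are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseSharedMoE(nn.Module):
    """Sketch of a CES-style layer: shared experts run for every instance,
    while a sparse top-k gate selects among domain-specific experts."""
    def __init__(self, dim, n_shared=1, n_specific=8, k=2):
        super().__init__()
        make_expert = lambda: nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.shared = nn.ModuleList(make_expert() for _ in range(n_shared))
        self.specific = nn.ModuleList(make_expert() for _ in range(n_specific))
        self.gate = nn.Linear(dim, n_specific)
        self.k = k

    def forward(self, x):
        # Shared experts are always activated.
        out = sum(e(x) for e in self.shared)
        # Sparse gate: keep only the top-k domain-specific experts per instance,
        # renormalising their weights with a softmax over the selected logits.
        logits = self.gate(x)                                  # [B, n_specific]
        topv, topi = logits.topk(self.k, dim=-1)               # [B, k]
        w = torch.zeros_like(logits).scatter(-1, topi, F.softmax(topv, dim=-1))
        # For clarity every specific expert is evaluated here; an efficient
        # version would dispatch only instances routed to each selected expert.
        for i, e in enumerate(self.specific):
            out = out + w[:, i:i + 1] * e(x)
        return out
```

Because `w` is zero outside the top-k positions, each instance's output depends only on the shared experts and its k selected specific experts, which is the source of the claimed efficiency gain.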
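The AEA module's mutual information loss can be approximated at the batch level as shown below. This is a hedged sketch under the assumption that the loss maximises I(E; D) between the gate's expert distribution and the domain label; the function name and this particular plug-in estimator are our assumptions, not the paper's exact formulation.

```python
import torch

def expert_domain_mi_loss(gate_probs, domain_ids, n_domains, eps=1e-9):
    """Negative mutual information I(E; D) between expert assignment and
    domain, estimated from one batch. Minimising this loss pushes each
    domain toward its own confident subset of experts.

    gate_probs: [B, n_experts] gate distributions per instance
    domain_ids: [B] integer domain label per instance
    """
    # H(E): entropy of the marginal expert-usage distribution.
    p_e = gate_probs.mean(dim=0)
    h_e = -(p_e * (p_e + eps).log()).sum()
    # H(E|D) = sum_d p(d) * H(E | D = d), using per-domain mean gates.
    h_e_given_d = gate_probs.new_zeros(())
    for d in range(n_domains):
        mask = domain_ids == d
        if mask.any():
            p_d = mask.float().mean()
            p_e_d = gate_probs[mask].mean(dim=0)
            h_e_given_d = h_e_given_d - p_d * (p_e_d * (p_e_d + eps).log()).sum()
    return -(h_e - h_e_given_d)  # negate: minimising maximises I(E; D)
```

When domains route to distinct experts, H(E|D) drops while H(E) stays high, so the mutual information grows and the loss decreases; a uniform gate yields I(E; D) near zero.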