In recent years, various methods have been proposed for mesh analysis, each offering distinct advantages and often excelling on different object classes. We present a novel Mixture of Experts (MoE) framework designed to harness the complementary strengths of these diverse approaches. We propose a new gate architecture that encourages each expert to specialize in the classes in which it excels. Our design is guided by two key ideas: (1) random walks over the mesh surface effectively capture the regions that individual experts attend to, and (2) an attention mechanism enables the gate to focus on the areas most informative for each expert's decision-making. To further enhance performance, we introduce a dynamic loss-balancing scheme that adjusts the trade-off between diversity and similarity losses throughout training, where diversity promotes expert specialization and similarity enables knowledge sharing among the experts. Our framework achieves state-of-the-art results on mesh classification, retrieval, and semantic segmentation tasks. Our code is available at: https://github.com/amirbelder/MME-Mixture-of-Mesh-Experts.
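To make the dynamic loss-balancing idea more concrete, the sketch below shows one plausible way such a schedule could be realized: an auxiliary term that interpolates between a similarity loss (knowledge sharing) and a diversity loss (specialization) as training progresses. The schedule shape, the auxiliary weighting, and all names (`alpha_schedule`, `balanced_loss`, `trade_off_weight`) are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a dynamic diversity/similarity loss balance.
# The linear schedule and the names used here are assumptions for
# illustration only; the paper's actual scheme may differ.

def alpha_schedule(step: int, total_steps: int) -> float:
    """Linearly shift the auxiliary weight from similarity (shared
    knowledge) early in training toward diversity (expert
    specialization) later on."""
    return min(1.0, step / max(1, total_steps))

def balanced_loss(task_loss: float,
                  diversity_loss: float,
                  similarity_loss: float,
                  step: int,
                  total_steps: int,
                  trade_off_weight: float = 0.1) -> float:
    """Combine the main task loss with a scheduled convex combination
    of the diversity and similarity auxiliary losses."""
    alpha = alpha_schedule(step, total_steps)
    aux = alpha * diversity_loss + (1.0 - alpha) * similarity_loss
    return task_loss + trade_off_weight * aux
```

In practice these scalars would be framework tensors (e.g., PyTorch) so the combined objective remains differentiable; the snippet only illustrates how the trade-off between the two auxiliary losses can be re-weighted over the course of training.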