Ensemble learning has proven effective in improving predictive performance and estimating uncertainty in neural networks. However, conventional ensemble methods often suffer from redundant parameter usage and computational inefficiencies due to entirely independent network training. To address these challenges, we propose the Divergent Ensemble Network (DEN), a novel architecture that combines shared representation learning with independent branching. DEN employs a shared input layer to capture common features across all branches, followed by divergent, independently trainable layers that form an ensemble. This shared-to-branching structure reduces parameter redundancy while maintaining ensemble diversity, enabling efficient and scalable learning.
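The shared-to-branching structure described above can be sketched in PyTorch as follows. This is a minimal illustration, not the authors' implementation: the layer sizes, branch count, and the choice of MLP branches are assumptions, and the variance across branch outputs is used here only as a simple proxy for ensemble uncertainty.

```python
import torch
import torch.nn as nn


class DEN(nn.Module):
    """Minimal sketch of a Divergent Ensemble Network: one shared
    input layer feeds several independently parameterized branches
    whose outputs are aggregated as an ensemble."""

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int,
                 num_branches: int = 4):
        super().__init__()
        # Shared input layer: captures features common to all branches,
        # so these parameters are stored once rather than per ensemble member.
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        # Divergent branches: each has its own independently trainable layers.
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
                nn.Linear(hidden_dim, out_dim),
            )
            for _ in range(num_branches)
        )

    def forward(self, x: torch.Tensor):
        h = self.shared(x)
        # Stack branch outputs: shape (num_branches, batch, out_dim).
        preds = torch.stack([branch(h) for branch in self.branches])
        # Ensemble mean as the prediction; variance across branches
        # as a crude uncertainty estimate.
        return preds.mean(dim=0), preds.var(dim=0)


# Example usage with hypothetical dimensions.
model = DEN(in_dim=16, hidden_dim=32, out_dim=3)
mean, var = model(torch.randn(8, 16))
```

Because the shared layer is counted once, the parameter cost of adding a branch is only that of the branch itself, which is the source of the parameter savings relative to training fully independent networks.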