Ensemble learning has proven effective in improving predictive performance and estimating uncertainty in neural networks. However, conventional ensemble methods often suffer from parameter redundancy and computational inefficiency because each member network is trained entirely independently. To address these challenges, we propose the Divergent Ensemble Network (DEN), a novel architecture that combines shared representation learning with independent branching. DEN employs a shared input layer to capture features common to all branches, followed by divergent, independently trainable layers that form the ensemble. This shared-to-branching structure reduces parameter redundancy while maintaining ensemble diversity, enabling efficient and scalable learning.
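The shared-to-branching structure described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the layer sizes, the ReLU shared layer, the number of branches, and the use of branch variance as the uncertainty estimate are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

class DEN:
    """Minimal sketch of a Divergent Ensemble Network (hypothetical):
    one shared input layer feeding K independently parameterised branches."""

    def __init__(self, in_dim, hidden_dim, out_dim, n_branches=4):
        # Shared input layer: learns features common to all branches,
        # so its parameters are stored only once.
        self.W_shared = rng.normal(0.0, 0.1, (in_dim, hidden_dim))
        # Divergent branches: each holds its own independently trainable
        # weights, which is what preserves ensemble diversity.
        self.branches = [rng.normal(0.0, 0.1, (hidden_dim, out_dim))
                         for _ in range(n_branches)]

    def forward(self, x):
        # Shared representation (ReLU chosen arbitrarily for the sketch).
        h = np.maximum(x @ self.W_shared, 0.0)
        # Each branch maps the shared features to its own prediction.
        preds = np.stack([h @ W for W in self.branches])  # (K, batch, out)
        # Ensemble mean as the prediction; per-branch variance as a
        # simple disagreement-based uncertainty estimate (an assumption).
        return preds.mean(axis=0), preds.var(axis=0)

den = DEN(in_dim=8, hidden_dim=16, out_dim=3, n_branches=4)
x = rng.normal(size=(5, 8))
mean, var = den.forward(x)
print(mean.shape, var.shape)  # (5, 3) (5, 3)
```

Compared with a conventional deep ensemble of K fully independent networks, this layout stores the input layer once instead of K times, which is the source of the parameter savings the abstract describes.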