We propose a new ensemble framework for supervised learning, called machine collaboration (MaC), which uses a collection of base machines for prediction tasks. Unlike bagging and stacking (parallel and independent frameworks) and boosting (a sequential, top-down framework), MaC is a circular and interactive learning framework: the base machines transfer information circularly and update their structures and parameters accordingly. A theoretical risk bound for the MaC estimator shows that this circular and interactive feature helps MaC reduce risk through a parsimonious ensemble. We conduct extensive experiments on MaC using both simulated data and 119 benchmark real datasets. The results demonstrate that in most cases, MaC performs significantly better than several other state-of-the-art methods, including classification and regression trees, neural networks, stacking, and boosting.
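The abstract does not spell out MaC's algorithm, but the circular and interactive idea can be illustrated with a minimal sketch. Below, two hypothetical base machines (a linear model and a piecewise-constant "stump" fit, standing in for a tree) repeatedly refit the residual left by the other, so information flows around the loop and both machines update their parameters each round. This is an assumed backfitting-style circular scheme for illustration only, not the paper's actual MaC procedure; all function names are ours.

```python
import numpy as np

# Hypothetical sketch of a circular & interactive ensemble (NOT the
# paper's exact MaC algorithm): two base machines pass residuals to
# each other in a loop, each refitting on what the other leaves over.

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.standard_normal(200)

def fit_linear(X, t):
    """Least-squares fit with intercept; returns the coefficient vector."""
    A = np.column_stack([X, np.ones(len(X))])
    return np.linalg.lstsq(A, t, rcond=None)[0]

def predict_linear(X, w):
    return np.column_stack([X, np.ones(len(X))]) @ w

def fit_stumps(x, t, n_bins=8):
    """Piecewise-constant fit on one feature (a crude tree stand-in)."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    means = np.array([t[idx == b].mean() if np.any(idx == b) else 0.0
                      for b in range(n_bins)])
    return edges, means

def predict_stumps(x, model):
    edges, means = model
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1,
                  0, len(means) - 1)
    return means[idx]

# Circular updating: each machine repeatedly refits the residual left
# by the other, so information travels around the loop.
f_lin = np.zeros(len(y))
f_stp = np.zeros(len(y))
for _ in range(5):
    w = fit_linear(X, y - f_stp)
    f_lin = predict_linear(X, w)
    m = fit_stumps(X[:, 0], y - f_lin)
    f_stp = predict_stumps(X[:, 0], m)

mse_single = np.mean((y - predict_linear(X, fit_linear(X, y))) ** 2)
mse_circ = np.mean((y - (f_lin + f_stp)) ** 2)
print(f"linear alone: {mse_single:.4f}, circular pair: {mse_circ:.4f}")
```

On this synthetic nonlinear target, the circular pair fits better than either machine alone, which mirrors (in toy form) the abstract's claim that circular information transfer can reduce risk with a small, parsimonious ensemble.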