Neural networks have long striven to emulate the learning capabilities of the human brain. While deep neural networks (DNNs) draw inspiration from the brain in their neuron design, their training methods diverge from biological foundations. Backpropagation, the primary training method for DNNs, requires substantial computational resources and fully labeled datasets, presenting major bottlenecks in development and deployment. This work demonstrates that by returning to biomimicry, specifically by mimicking how the brain learns through pruning, we can solve a range of classical machine learning problems while using orders of magnitude less computation and no labels. Our experiments successfully personalized multiple speech recognition and image classification models, including ResNet50 on ImageNet, achieving approximately 70\% sparsity while simultaneously improving model accuracy to around 90\%, all without the limitations of backpropagation. This biologically inspired approach offers a promising avenue for efficient, personalized machine learning models in resource-constrained environments.
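The abstract does not specify the pruning rule used; as a hedged, generic illustration of the kind of label-free, gradient-free criterion such an approach could rely on, the sketch below applies magnitude-based pruning to a weight list until a target sparsity is reached. All names and values here are illustrative, not the paper's actual method.

```python
# Illustrative sketch only: generic magnitude-based pruning, a common
# label-free criterion. The paper's actual pruning rule is not given
# in the abstract; function and parameter names are hypothetical.

def prune_by_magnitude(weights, target_sparsity=0.7):
    """Zero out the smallest-magnitude weights until roughly the target
    fraction of entries is zero. Needs no labels and no gradients."""
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * target_sparsity)          # how many weights to remove
    threshold = flat[k - 1] if k > 0 else float("-inf")
    return [0.0 if abs(w) <= threshold else w for w in weights]

pruned = prune_by_magnitude(
    [0.9, -0.05, 0.4, 0.01, -0.7, 0.2, 0.03, -0.6, 0.5, 0.08]
)
sparsity = pruned.count(0.0) / len(pruned)        # fraction of zeroed weights
```

Because the criterion depends only on the weights themselves, it can run on-device with a single pass over the parameters, which is consistent with the abstract's claim of personalization without backpropagation.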