When trained on large-scale object classification datasets, certain artificial neural network models begin to approximate core object recognition behaviors and neural response patterns in the primate brain. While recent machine learning advances suggest that scaling compute, model size, and dataset size improves task performance, the impact of scaling on brain alignment remains unclear. In this study, we explore scaling laws for modeling the primate visual ventral stream by systematically evaluating over 600 models trained under controlled conditions on benchmarks spanning V1, V2, V4, IT, and behavior. We find that while behavioral alignment continues to scale with larger models, neural alignment saturates. This observation holds across model architectures and training datasets, even though models with stronger inductive biases and datasets with higher-quality images are more compute-efficient. Increased scaling is especially beneficial for higher-level visual areas, where small models trained on few samples exhibit poor alignment. Our results suggest that while scaling current architectures and datasets might suffice for alignment with human core object recognition behavior, it will not yield improved models of the brain's visual ventral stream, highlighting the need for novel strategies in building brain models.
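To make the saturation claim concrete, the sketch below fits a saturating power-law curve, in which alignment approaches a ceiling as training compute grows, to a set of benchmark scores. This is a minimal illustration only: the functional form, data points, parameter names, and initial guesses are assumptions for demonstration and are not the paper's actual fitting procedure or data.

```python
# Hypothetical sketch: fit alignment = y_max - a * compute**(-b), a curve that
# saturates at y_max as compute grows. All numbers below are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def saturating_power_law(compute, y_max, a, b):
    """Alignment approaches the ceiling y_max as compute increases."""
    return y_max - a * np.power(compute, -b)

# Hypothetical (compute, alignment-score) pairs in arbitrary compute units.
compute = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])
alignment = np.array([0.30, 0.38, 0.42, 0.44, 0.45])

# Fit the three parameters; p0 is a rough initial guess.
params, _ = curve_fit(
    saturating_power_law, compute, alignment, p0=[0.5, 0.2, 0.5], maxfev=10000
)
y_max, a, b = params
print(f"estimated saturation ceiling y_max ~ {y_max:.3f}, exponent b ~ {b:.3f}")
```

Under this kind of fit, a neural benchmark whose estimated ceiling sits close to the scores already reached by mid-sized models would be consistent with the saturation reported in the abstract, whereas a behavioral benchmark would show scores still rising toward a higher ceiling.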

