Accounting for inter-individual variability in brain function is key to precision medicine. Here, by considering functional inter-individual variability as meaningful data rather than noise, we introduce VarCoNet, an enhanced self-supervised framework for robust functional connectome (FC) extraction from resting-state fMRI (rs-fMRI) data. VarCoNet employs self-supervised contrastive learning to exploit inherent functional inter-individual variability, serving as a brain-function encoder that generates FC embeddings readily applicable to downstream tasks, even in the absence of labeled data. Contrastive learning is facilitated by a novel augmentation strategy based on segmenting rs-fMRI signals. At its core, VarCoNet integrates a 1D-CNN-Transformer encoder for advanced time-series processing, enhanced through robust Bayesian hyperparameter optimization. We evaluate VarCoNet on two downstream tasks: (i) subject fingerprinting, using rs-fMRI data from the Human Connectome Project, and (ii) autism spectrum disorder (ASD) classification, using rs-fMRI data from the ABIDE I and ABIDE II datasets. Across different brain parcellations, extensive testing against state-of-the-art methods, including 13 deep learning methods, demonstrates VarCoNet's superiority, robustness, interpretability, and generalizability. Overall, VarCoNet provides a versatile and robust framework for FC analysis in rs-fMRI.
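The following is a minimal sketch of the segment-based contrastive setup described above, assuming a PyTorch implementation: two random segments of the same subject's rs-fMRI time series form a positive pair, and an NT-Xent-style loss contrasts them against segments from other subjects. The segment length, temperature, and the placeholder linear encoder (standing in for the 1D-CNN-Transformer) are illustrative assumptions, not the VarCoNet implementation.

```python
import torch
import torch.nn.functional as F

def random_segments(ts, seg_len):
    """Crop two random (possibly overlapping) segments from a (ROIs, time) signal."""
    T = ts.shape[-1]
    s0, s1 = torch.randint(0, T - seg_len + 1, (2,)).tolist()
    return ts[..., s0:s0 + seg_len], ts[..., s1:s1 + seg_len]

def nt_xent(z1, z2, tau=0.1):
    """NT-Xent contrastive loss over two batches of embeddings, each of shape (N, dim)."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)    # (2N, d), unit norm
    sim = z @ z.t() / tau                                  # cosine-similarity logits
    n = z1.shape[0]
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool), float('-inf'))  # exclude self-pairs
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])    # index of each positive
    return F.cross_entropy(sim, targets)

# Toy usage: 8 subjects, 100 ROIs, 1200 time points; the encoder here is a
# placeholder linear projection, not the 1D-CNN-Transformer used by VarCoNet.
batch = torch.randn(8, 100, 1200)
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.LazyLinear(128))
views = [random_segments(ts, seg_len=300) for ts in batch]
z1 = encoder(torch.stack([v[0] for v in views]))
z2 = encoder(torch.stack([v[1] for v in views]))
loss = nt_xent(z1, z2)
```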