Deformable image registration is crucial for non-linearly aligning medical images across modalities, establishing precise spatial correspondence between varying anatomical structures. This paper presents NestedMorph, a novel network that uses a Nested Attention Fusion approach to improve intra-subject deformable registration between T1-weighted (T1w) MRI and diffusion MRI (dMRI) data. NestedMorph integrates high-resolution spatial details from the encoder with semantic information from the decoder in a multi-scale framework, enhancing both local and global feature extraction. Our model notably outperforms existing methods, including CNN-based approaches such as VoxelMorph, MIDIR, and CycleMorph; Transformer-based models such as TransMorph and ViT-V-Net; and traditional techniques such as NiftyReg and SyN. Evaluations on the Human Connectome Project (HCP) dataset demonstrate that NestedMorph achieves superior performance across key metrics, attaining the highest SSIM (0.89) and the lowest HD95 (2.5) and SDlogJ (0.22). These results highlight NestedMorph's ability to capture both local and global image features effectively, leading to superior registration performance. These promising outcomes underscore NestedMorph's potential to significantly advance deformable medical image registration, providing a robust framework for future research and clinical applications. The source code and our implementation are available at: https://bit.ly/3zdVqcg
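The abstract describes fusing high-resolution encoder features with semantic decoder features via attention, but does not detail the Nested Attention Fusion block itself. As a loose illustration of the general idea only (not the paper's architecture), the sketch below uses a simple sigmoid attention gate to blend two feature maps; the function name `attention_fuse` and the gating form are our assumptions:

```python
import numpy as np

def attention_fuse(enc, dec):
    """Hypothetical attention-gated fusion of an encoder feature map
    (fine spatial detail) with a decoder feature map (semantic context).
    Both are (C, H, W); the gate weights each channel/location."""
    gate = 1.0 / (1.0 + np.exp(-(enc + dec)))  # sigmoid attention gate in (0, 1)
    return gate * enc + (1.0 - gate) * dec     # convex per-voxel blend

enc = np.random.rand(4, 8, 8)   # toy encoder features
dec = np.random.rand(4, 8, 8)   # toy decoder features
fused = attention_fuse(enc, dec)
assert fused.shape == (4, 8, 8)
```

Because the gate is in (0, 1), each fused value is a convex combination of the two inputs, so the output stays within the elementwise range of the encoder and decoder features.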
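Of the reported metrics, SDlogJ is the least self-explanatory: it is the standard deviation of the log Jacobian determinant of the deformation, a standard smoothness measure (lower means a smoother, less folded field). A minimal NumPy sketch for a dense 2D displacement field (the function name `sd_log_j` and the clipping guard are ours):

```python
import numpy as np

def sd_log_j(disp):
    """SDlogJ for a 2D displacement field disp of shape (2, H, W):
    std. dev. of log det J, where phi(x) = x + u(x) and J = I + grad u."""
    du_y = np.gradient(disp[0], axis=(0, 1))  # derivatives of the y-displacement
    du_x = np.gradient(disp[1], axis=(0, 1))  # derivatives of the x-displacement
    j11 = 1.0 + du_y[0]; j12 = du_y[1]
    j21 = du_x[0];       j22 = 1.0 + du_x[1]
    det = j11 * j22 - j12 * j21
    det = np.clip(det, 1e-9, None)  # guard against folded (non-positive) voxels
    return float(np.std(np.log(det)))

# identity transform: det J = 1 everywhere, so SDlogJ is exactly 0
assert sd_log_j(np.zeros((2, 16, 16))) == 0.0
```

A pure translation (constant displacement) also yields 0, since the Jacobian is the identity everywhere; only spatially varying deformations raise the score.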