Transformers have revolutionized medical image restoration, but their quadratic complexity still limits their application to high-resolution medical images. The recent advent of RWKV in the NLP field has attracted much attention for its ability to process long sequences efficiently. To leverage its advanced design, we propose Restore-RWKV, the first RWKV-based model for medical image restoration. Since the original RWKV model is designed for 1D sequences, we make two necessary modifications for modeling spatial relations in 2D images. First, we present a recurrent WKV (Re-WKV) attention mechanism that captures global dependencies with linear computational complexity. Re-WKV incorporates bidirectional attention as the basis for a global receptive field and applies attention recurrently along multiple scan directions to effectively model 2D dependencies. Second, we develop an omnidirectional token shift (Omni-Shift) layer that enhances local dependencies by shifting tokens from all directions and across a wide context range. These adaptations make the proposed Restore-RWKV an efficient and effective model for medical image restoration. Extensive experiments demonstrate that Restore-RWKV achieves superior performance across various medical image restoration tasks, including MRI image super-resolution, CT image denoising, PET image synthesis, and all-in-one medical image restoration. Code is available at: \href{https://github.com/Yaziwel/Restore-RWKV.git}{https://github.com/Yaziwel/Restore-RWKV}.
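To make the bidirectional WKV idea concrete, the following is a minimal numpy sketch of distance-decayed bidirectional attention over a 1D token sequence. It is an illustration only, not the authors' implementation: the function name `bi_wkv` and the exact decay form are assumptions, and this naive version is O(T^2), whereas the actual Re-WKV achieves linear complexity via a recurrent formulation.

```python
import numpy as np

def bi_wkv(k, v, w, u):
    """Illustrative bidirectional WKV (not the paper's exact code).

    Each token t attends to every token i with a weight that decays
    exponentially with the distance |t - i|; the token itself receives
    a learned bonus u instead of decay, following the RWKV convention.
    k, v: (T, C) key/value sequences; w, u: (C,) per-channel parameters.
    """
    T, C = k.shape
    out = np.zeros_like(v, dtype=float)
    positions = np.arange(T)
    for t in range(T):
        dist = np.abs(positions - t)[:, None].astype(float)  # (T, 1)
        logits = k - dist * w          # decay grows with distance
        logits[t] = k[t] + u           # self token: bonus instead of decay
        logits -= logits.max(axis=0)   # numerical stability
        weights = np.exp(logits)
        # per-channel convex combination of values
        out[t] = (weights * v).sum(axis=0) / weights.sum(axis=0)
    return out
```

Because the weights are normalized per channel, each output token is a convex combination of all value tokens, i.e. a global receptive field with linear-attention-style aggregation.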
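As a rough intuition for the Omni-Shift layer, one plausible realization of "shifting tokens from all directions across a wide context range" is a per-channel (depthwise) aggregation over a k×k neighborhood. The sketch below is a hypothetical illustration under that assumption, not the paper's implementation; in the real model the mixing weights would be learned, whereas here they are random and merely normalized.

```python
import numpy as np

def omni_shift(x, kernel=5, rng=None):
    """Hypothetical sketch of an omnidirectional token shift.

    Mixes each token with its kernel x kernel spatial neighborhood using
    per-channel weights, so every token carries context from all
    directions before attention. x: (H, W, C) feature map.
    """
    rng = rng or np.random.default_rng(0)
    H, W, C = x.shape
    r = kernel // 2
    # stand-in for learned weights; normalized so mixing preserves scale
    w = rng.random((kernel, kernel, C))
    w /= w.sum(axis=(0, 1), keepdims=True)
    # edge padding keeps border tokens inside the feature distribution
    pad = np.pad(x, ((r, r), (r, r), (0, 0)), mode="edge")
    out = np.empty_like(x, dtype=float)
    for i in range(H):
        for j in range(W):
            out[i, j] = (pad[i:i + kernel, j:j + kernel] * w).sum(axis=(0, 1))
    return out
```

Since the weights are normalized per channel, a constant input passes through unchanged; in practice such a layer is typically implemented as a depthwise convolution rather than explicit loops.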