There is growing interest in efficient quality assessment algorithms for image super-resolution (SR). However, automatically evaluating the visual quality of SR images with deep learning techniques, especially dual-branch algorithms, remains challenging. Existing SR image quality assessment (IQA) metrics based on two-stream networks lack interaction between branches. To address this, we propose a novel full-reference IQA (FR-IQA) method for SR images. Our key observation is that producing SR images and evaluating how close they are to their corresponding high-resolution (HR) references are separate processes. Based on this consideration, we construct a deep Bi-directional Attention Network (BiAtten-Net) that dynamically deepens visual attention to distortions in both processes, which aligns well with the human visual system (HVS). Experiments on public SR quality databases demonstrate the superiority of our proposed BiAtten-Net over state-of-the-art quality assessment methods. In addition, the visualization results and an ablation study confirm the effectiveness of bi-directional attention.
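To make the idea of interaction between the two branches concrete, the following is a minimal sketch of one plausible form of bi-directional cross-attention between an SR feature branch and an HR reference feature branch: each branch queries the other and fuses the result with a residual connection. All function names, shapes, and the residual fusion are illustrative assumptions for this sketch, not the actual BiAtten-Net architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, key_feats):
    """Attend from one branch (queries) to the other (keys/values).

    query_feats: (Nq, d) tokens of the querying branch
    key_feats:   (Nk, d) tokens of the other branch
    Returns (Nq, d) features aggregated from the other branch.
    """
    d = query_feats.shape[-1]
    scores = query_feats @ key_feats.T / np.sqrt(d)  # (Nq, Nk) similarities
    weights = softmax(scores, axis=-1)               # rows sum to 1
    return weights @ key_feats

def bidirectional_attention(sr_feats, hr_feats):
    """Each branch queries the other; residual addition keeps its own features."""
    sr_out = sr_feats + cross_attention(sr_feats, hr_feats)
    hr_out = hr_feats + cross_attention(hr_feats, sr_feats)
    return sr_out, hr_out

rng = np.random.default_rng(0)
sr = rng.standard_normal((16, 32))  # 16 spatial tokens, 32-dim SR features (toy sizes)
hr = rng.standard_normal((16, 32))  # matching HR reference tokens
sr_out, hr_out = bidirectional_attention(sr, hr)
print(sr_out.shape, hr_out.shape)
```

The symmetric queries are what make the attention bi-directional: the SR branch highlights HR regions relevant to its distortions, while the HR branch highlights where the SR output deviates from the reference.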