Supervised learning in reproducing kernel Hilbert space (RKHS) and vector-valued RKHS (vvRKHS) has been investigated for more than 30 years. In this paper, we provide a new twist to this rich literature by generalizing supervised learning in RKHS and vvRKHS to reproducing kernel Hilbert $C^*$-module (RKHM), and show how to construct effective positive-definite kernels by considering the perspective of $C^*$-algebra. Unlike the cases of RKHS and vvRKHS, we can use $C^*$-algebras to enlarge representation spaces. This enables us to construct RKHMs whose representation power goes beyond RKHSs, vvRKHSs, and existing methods such as convolutional neural networks. Our framework is suitable, for example, for effectively analyzing image data by allowing the interaction of Fourier components.
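As a toy illustration (not the paper's actual construction), the idea of a $C^*$-algebra-valued positive-definite kernel can be sketched with the commutative $C^*$-algebra of circulant matrices, which the discrete Fourier transform simultaneously diagonalizes, so algebra multiplication acts directly on Fourier components. The helper names `circ` and `kernel` below are hypothetical; the kernel is the algebra-valued analogue of the linear kernel, $k(x, y) = C(x)^* C(y)$, whose block Gram matrix factors as $B^*B$ and is therefore positive semidefinite.

```python
import numpy as np

def circ(v):
    # Circulant matrix whose first column is v; these matrices form a
    # commutative C*-algebra under matrix multiplication.
    d = len(v)
    return np.stack([np.roll(v, j) for j in range(d)], axis=1)

def kernel(x, y):
    # Algebra-valued analogue of the linear kernel: k(x, y) = C(x)* C(y).
    # Its values lie in the circulant algebra, not in R.
    return circ(x).conj().T @ circ(y)

# Positive definiteness in the C*-module sense: the block Gram matrix
# [k(x_i, x_j)]_{ij} equals B* B with B = [C(x_1) ... C(x_n)],
# hence is Hermitian and positive semidefinite.
rng = np.random.default_rng(0)
xs = rng.standard_normal((3, 4))  # three sample vectors in R^4
G = np.block([[kernel(xi, xj) for xj in xs] for xi in xs])
```

A richer, noncommutative choice of algebra (as the paper advocates) would let distinct Fourier components interact, which is what the abstract's claim about image analysis refers to.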