$C^*$-algebra-valued kernels could pave the way for the next generation of kernel machines. To further our fundamental understanding of learning with $C^*$-algebraic kernels, we propose a new class of positive definite kernels based on spectral truncation. We focus on kernels whose inputs and outputs are vectors or functions, and we generalize typical kernels by introducing noncommutativity into the products appearing in the kernels. This noncommutativity induces interactions along the domain of the data functions. We show that the proposed kernels fill the gap between existing separable and commutative kernels. We also propose a deep learning perspective that yields a more flexible framework. The flexibility of the proposed class of kernels allows us to go beyond previous separable and commutative kernels, addressing two of the foremost issues in learning in vector-valued reproducing kernel Hilbert spaces (RKHSs): the choice of the kernel and the computational cost.
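To illustrate the noncommutativity that spectral truncation introduces, the following minimal sketch (the functions and truncation level are illustrative choices, not taken from the paper) truncates functions on the circle to their first $n$ Fourier modes, which yields $n\times n$ Toeplitz matrices. While pointwise multiplication of the functions commutes, the products of their truncations generally do not:

```python
import numpy as np

def toeplitz_truncation(fourier_coeffs, n):
    """Truncate a function on the circle to an n x n Toeplitz matrix.

    fourier_coeffs maps an integer Fourier mode m to its coefficient f_m;
    the truncation has entries T[j, k] = f_{j-k}.
    """
    T = np.zeros((n, n), dtype=complex)
    for j in range(n):
        for k in range(n):
            T[j, k] = fourier_coeffs.get(j - k, 0.0)
    return T

# Two illustrative functions on the circle with a few Fourier modes each:
f = {0: 1.0, 1: 0.5, -1: 0.5}       # 1 + cos(theta)
g = {0: 1.0, 2: -0.3j, -2: 0.3j}    # 1 + 0.6 sin(2 theta)

A = toeplitz_truncation(f, 4)
B = toeplitz_truncation(g, 4)

# f * g = g * f pointwise, yet the truncated products differ:
print(np.allclose(A @ B, B @ A))  # → False
```

At finite truncation level the matrix products retain information about the ordering of the factors, which is the kind of interaction along the data function domain that commutative (pointwise-product) kernels discard.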