$C^*$-algebra-valued kernels could pave the way for the next generation of kernel machines. To further our fundamental understanding of learning with $C^*$-algebraic kernels, we propose a new class of positive definite kernels based on spectral truncation. We focus on kernels whose inputs and outputs are vectors or functions, and generalize typical kernels by introducing noncommutativity into the products appearing in the kernels. The noncommutativity induces interactions along the data function domain. We show that noncommutativity is a governing factor in performance enhancement: it lets us balance representation power against model complexity. We also propose a deep learning perspective to increase the representation capacity of spectral truncation kernels. The flexibility of the proposed class of kernels allows us to go beyond previous commutative kernels, addressing two of the foremost issues in learning with vector-valued reproducing kernel Hilbert spaces (RKHSs), namely the choice of the kernel and the computational cost.
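To make the notion of spectral truncation concrete, the following is a minimal, illustrative sketch, not the paper's exact construction: a function on the circle is replaced by the $n \times n$ Toeplitz matrix of its Fourier coefficients (a truncated multiplication operator), and products of such truncations generally do not commute. The function names and the polynomial-type kernel form below are assumptions for illustration only.

```python
import numpy as np

def toeplitz_truncation(fourier_coeffs, n):
    """n x n Toeplitz truncation of the function x(t) = sum_k c_k e^{ikt}.

    fourier_coeffs: dict mapping integer frequency k to coefficient c_k.
    Entry (i, j) of the matrix is c_{i-j}. (Illustrative sketch; the
    paper's construction may differ in details.)
    """
    T = np.zeros((n, n), dtype=complex)
    for i in range(n):
        for j in range(n):
            T[i, j] = fourier_coeffs.get(i - j, 0.0)
    return T

def poly_truncation_kernel(cx, cy, n, q=2):
    """Matrix-valued, polynomial-type kernel sketch built from spectral
    truncations: k(x, y) = (R_n(x)^* R_n(y))^q, where R_n is the
    Toeplitz truncation above. (Hypothetical form for illustration.)"""
    Rx = toeplitz_truncation(cx, n)
    Ry = toeplitz_truncation(cy, n)
    return np.linalg.matrix_power(Rx.conj().T @ Ry, q)

# Two simple functions on the circle: x(t) = e^{it}, y(t) = 1 + e^{-it}
cx = {1: 1.0}
cy = {0: 1.0, -1: 1.0}
n = 3
Rx = toeplitz_truncation(cx, n)
Ry = toeplitz_truncation(cy, n)

# The truncated operators do not commute for these choices, which is
# the source of the interactions along the function domain:
print(np.allclose(Rx @ Ry, Ry @ Rx))  # False

K = poly_truncation_kernel(cx, cy, n)  # an n x n (operator-valued) output
print(K.shape)  # (3, 3)
```

Note how the kernel output is itself a matrix rather than a scalar, which is what makes it suitable for learning in vector-valued RKHSs, and how swapping the order of the truncated factors changes the product.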