We present a novel framework for learning system design with neural feature extractors. First, we introduce the feature geometry, which unifies statistical dependence and feature representations in a function space equipped with inner products. This connection induces function-space concepts for statistical dependence, such as norms, orthogonal projections, and spectral decompositions, each with a clear operational meaning. In particular, we associate each learning setting with a dependence component and formulate the learning task as finding the corresponding feature approximation. We then propose a nesting technique, which provides systematic algorithm designs for learning the optimal features from data samples using off-the-shelf network architectures and optimizers. We further demonstrate multivariate learning applications, including conditional inference and multimodal learning, where we present the optimal features and reveal their connections to classical approaches.
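To make the inner-product structure concrete, here is a minimal numerical sketch. It assumes X is standard normal and uses two hand-picked feature functions f and g; these are illustrative choices only, not the learned neural feature extractors or the paper's actual construction. The inner product is the standard one on the function space, ⟨f, g⟩ = E[f(X)g(X)], estimated empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)           # samples of X (assumed standard normal)

f = lambda t: t                       # illustrative feature: identity
g = lambda t: t**2 - 1                # illustrative feature: centered quadratic

inner = np.mean(f(x) * g(x))          # empirical inner product <f, g> = E[f(X) g(X)]
norm_f = np.sqrt(np.mean(f(x) ** 2))  # induced norm ||f|| = sqrt(<f, f>)
```

For this choice of distribution and features, ⟨f, g⟩ = E[X³ − X] = 0, so the two features are orthogonal in the function space, and ||f|| = 1; the empirical estimates approach these values as the sample size grows.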