This work characterizes equivariant polynomial functions from tuples of tensor inputs to tensor outputs. Loosely motivated by physics, we focus on functions that are equivariant with respect to the diagonal action of the orthogonal group on tensors. We show how to extend this characterization to other linear algebraic groups, including the Lorentz and symplectic groups. Our goal in deriving these characterizations is to define equivariant machine learning models. In particular, we focus on the sparse vector estimation problem. This problem has been broadly studied in the theoretical computer science literature, and explicit spectral methods, derived via sum-of-squares techniques, can be shown to recover sparse vectors under certain assumptions. Our numerical results show that the proposed equivariant machine learning models can learn spectral methods that outperform the best theoretically known spectral methods in some regimes. The experiments also suggest that learned spectral methods can solve the problem in settings that have not yet been theoretically analyzed. This is an example of a promising direction in which theory can inform machine learning models and machine learning models can inform theory.
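To make the notion of equivariance concrete, the following sketch (not from the paper; the specific map `f` is a hypothetical example) numerically checks that a simple polynomial map built from the orthogonally invariant inner product satisfies the equivariance condition f(Qx) = Q f(x) for an orthogonal matrix Q:

```python
# Minimal numerical sketch: check O(d)-equivariance of a simple
# polynomial map. This is an illustrative example, not the paper's model.
import numpy as np

def f(x):
    # A degree-3 polynomial map scaling x by the invariant <x, x>.
    # Since Q preserves inner products, f(Qx) = <x, x> Qx = Q f(x).
    return (x @ x) * x

rng = np.random.default_rng(0)
d = 5
x = rng.standard_normal(d)

# Sample a random orthogonal matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

print(np.allclose(f(Q @ x), Q @ f(x)))  # equivariance holds
```

The same template applies to other groups by replacing the invariant: for the Lorentz group one would use the Minkowski inner product, and for the symplectic group the symplectic form.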