The generalized orthogonal Procrustes problem (GOPP) plays a fundamental role in several scientific disciplines, including statistics, imaging science, and computer vision. Despite its tremendous practical importance, finding the least squares estimator is in general NP-hard. We study the semidefinite relaxation (SDR) and an iterative method named the generalized power method (GPM) for finding the least squares estimator, and investigate their performance under a signal-plus-noise model. We show that, provided the signal-to-noise ratio is large, the SDR recovers the least squares estimator exactly and, moreover, the generalized power method with a proper initialization converges linearly to the global minimizer of the SDR. The main technique consists of showing that the nonlinear mapping involved in the GPM is essentially a local contraction mapping, so that applying the well-known Banach fixed-point theorem finishes the proof. In addition, we analyze the low-rank factorization algorithm and show that the corresponding optimization landscape is free of spurious local minimizers under nearly the same conditions that enable the success of the SDR approach. A highlight of our work is that the theoretical guarantees are purely algebraic and do not assume any statistical priors on the additive adversaries; thus they apply to various interesting settings.
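To make the GPM iteration concrete, the following is a minimal sketch under an illustrative setup, not the paper's exact formulation: observations `A_i = O_i @ A + noise` for unknown orthogonal matrices `O_i`, a naive identity initialization in place of the proper (e.g. spectral) initialization the abstract refers to, and hypothetical helper names. Each step sums the cross-correlations with the current estimates and projects back onto the orthogonal group via the polar factor of an SVD.

```python
import numpy as np

def polar(M):
    """Projection onto the orthogonal group: the polar factor U @ Vt of M."""
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt

def gpm(A_list, iters=200):
    """Generalized power method sketch for the GOPP (illustrative, not the
    paper's exact algorithm).

    A_list: n observed d x m point clouds A_i = O_i @ A + noise.
    Returns estimates R_i of the orthogonal matrices O_i, up to a common
    global rotation, by maximizing sum_{i,j} tr(R_i^T A_i A_j^T R_j).
    """
    n, d = len(A_list), A_list[0].shape[0]
    R = [np.eye(d) for _ in range(n)]  # naive init (assumption; see lead-in)
    for _ in range(iters):
        # Simultaneous update: sum cross-correlations, then project to O(d)
        R = [polar(sum(A_list[i] @ A_list[j].T @ R[j] for j in range(n)))
             for i in range(n)]
    return R
```

Since the block matrix with (i, j)-block `A_i @ A_j.T` is positive semidefinite (it is `B @ B.T` for the stacked observations `B`), each iteration is a projected power step, and in the noiseless case the aligned clouds `R_i.T @ A_i` typically coincide at convergence.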