This paper presents two novel algorithms for approximately projecting symmetric matrices onto the positive semidefinite (PSD) cone using randomized numerical linear algebra (RNLA). Classical PSD projection relies on a full deterministic eigen-decomposition, which is computationally prohibitive for large-scale problems. Our approach uses RNLA to construct a low-rank approximation before projecting, significantly reducing the computational cost. The first algorithm uses random sampling to form a low-rank approximation and then performs a standard eigen-decomposition on the resulting smaller matrix. The second algorithm refines this process with a scaling step that aligns the leading singular values with the positive eigenvalues, ensuring that the low-rank approximation captures the information about the positive spectrum needed for PSD projection. Both methods trade accuracy for computational speed and are supported by probabilistic error bounds. To demonstrate the practical benefits of our approach, we integrate the randomized projection methods into a first-order semidefinite programming (SDP) solver. Numerical experiments, including SDPs derived from sum-of-squares (SOS) programming problems, validate the effectiveness of our methods, especially on problems that are intractable for traditional deterministic approaches.
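The first algorithm can be illustrated with a minimal NumPy sketch: a randomized range finder builds a small subspace capturing the dominant spectrum, a standard eigen-decomposition is performed on the compressed matrix, and negative eigenvalues are clipped. This is only a schematic instance of the general approach described above, not the authors' implementation; the function name, the target rank `k`, and the oversampling parameter `p` are illustrative assumptions.

```python
import numpy as np

def randomized_psd_projection(A, k, p=10, seed=0):
    """Approximate projection of symmetric A onto the PSD cone via a
    randomized low-rank approximation (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Randomized range finder: probe the range of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, k + p))   # k = target rank, p = oversampling
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis for the sampled range
    # Compress A into the subspace and eigen-decompose the small matrix.
    B = Q.T @ A @ Q                           # (k+p) x (k+p), symmetric
    w, V = np.linalg.eigh(B)
    w = np.maximum(w, 0.0)                    # clip negative eigenvalues (PSD projection)
    U = Q @ V
    return (U * w) @ U.T

# Usage: project a random symmetric indefinite matrix.
rng = np.random.default_rng(1)
G = rng.standard_normal((200, 200))
A = (G + G.T) / 2
A_psd = randomized_psd_projection(A, k=50)
```

The returned matrix is symmetric and PSD up to floating-point tolerance; accuracy relative to the exact projection depends on how quickly the spectrum of `A` decays, which is the trade-off the probabilistic error bounds quantify.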