Nonsmooth composite optimization with orthogonality constraints is crucial in statistical learning and data science, yet it is challenging: the objective is nonsmooth, and the constraints are nonconvex and computationally expensive to handle. In this paper, we propose a new approach called \textbf{OBCD}, which leverages Block Coordinate Descent (BCD) to address these challenges. \textbf{OBCD} is a feasible method (every iterate satisfies the constraints) with a small computational footprint. In each iteration, it updates $k$ rows of the solution matrix, where $k \geq 2$, by globally solving a small nonsmooth optimization problem under orthogonality constraints. We prove that \textbf{OBCD} converges to block-$k$ stationary points, which satisfy a stronger optimality condition than standard critical points. Notably, \textbf{OBCD} is the first greedy descent method with monotonicity for this problem class. Under the Kurdyka-Łojasiewicz (KL) inequality, we further establish strong limit-point convergence. We also extend \textbf{OBCD} with breakpoint-searching methods for solving the subproblems and with greedy strategies for working-set selection. Comprehensive experiments demonstrate the superior performance of our approach across various tasks.
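To make the iteration scheme concrete, the following is a minimal Python sketch of an OBCD-style step under stated assumptions: the working set is drawn at random rather than greedily, and the small $k \times k$ subproblem is approximated by sampling random orthogonal candidates instead of the paper's global breakpoint-searching solver. The names `obcd_step`, `F`, and `n_candidates` are illustrative, not from the paper. The sketch does exhibit the two properties highlighted above: every iterate stays feasible (left-multiplying $k$ rows of a matrix with orthonormal columns by an orthogonal $V$ preserves $X^\top X = I$), and the objective is monotonically non-increasing.

```python
import numpy as np

def obcd_step(X, F, k=2, n_candidates=50, rng=None):
    """One OBCD-style step (illustrative): pick a working set B of k rows
    of X and replace X[B] with V @ X[B] for a k-by-k orthogonal V that
    does not increase the objective F. Feasibility is preserved because
    replacing X[B] by V @ X[B] leaves X^T X unchanged when V is orthogonal."""
    rng = np.random.default_rng() if rng is None else rng
    B = rng.choice(X.shape[0], size=k, replace=False)  # random working set
    best_V, best_val = np.eye(k), F(X)                 # V = I keeps X as-is
    for _ in range(n_candidates):
        # Random orthogonal candidate (a crude stand-in for solving the
        # small subproblem globally, as the paper's method does).
        Q, _ = np.linalg.qr(rng.standard_normal((k, k)))
        X_try = X.copy()
        X_try[B] = Q @ X[B]
        val = F(X_try)
        if val < best_val:
            best_V, best_val = Q, val
    X_new = X.copy()
    X_new[B] = best_V @ X[B]
    return X_new, best_val                             # monotone by construction

# Usage: minimize the nonsmooth objective ||X||_1 over matrices with
# orthonormal columns, starting from a feasible point.
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((8, 3)))       # X^T X = I_3
F = lambda X: np.abs(X).sum()
for _ in range(200):
    X, val = obcd_step(X, F, k=2, rng=rng)
print(val, np.allclose(X.T @ X, np.eye(3)))            # lower objective, still feasible
```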