We consider a class of structured fractional minimization problems in which the numerator comprises a differentiable function, a simple nonconvex nonsmooth function, a concave nonsmooth function, and a convex nonsmooth function composed with a linear operator, while the denominator is a continuous function that is either weakly convex or has a weakly convex square root. Such problems arise widely in machine learning and data science. Existing methods are mainly based on subgradient methods and smoothing proximal gradient methods, which can suffer from slow convergence and numerical instability. In this paper, we introduce {\sf FADMM}, the first Alternating Direction Method of Multipliers tailored to this class of problems. {\sf FADMM} decouples the original problem into linearized proximal subproblems and comes in two variants: one based on Dinkelbach's parametric method ({\sf FADMM-D}) and the other on the quadratic transform method ({\sf FADMM-Q}). By introducing a novel Lyapunov function, we establish that {\sf FADMM} converges to $\epsilon$-approximate critical points of the problem within an oracle complexity of $\mathcal{O}(1/\epsilon^{3})$. Experiments on synthetic and real-world data for sparse Fisher discriminant analysis, robust Sharpe ratio minimization, and robust sparse recovery demonstrate the effectiveness of our approach.

Keywords: Fractional Minimization, Nonconvex Optimization, Proximal Linearized ADMM, Nonsmooth Optimization, Convergence Analysis
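The problem class described above can be sketched as the following template; the symbols $f$, $g$, $h$, $p$, $\mathbf{A}$, and $q$ are illustrative names chosen here and need not match the paper's own notation:

```latex
\min_{\mathbf{x} \in \mathbb{R}^n}\;
\frac{f(\mathbf{x}) + g(\mathbf{x}) + h(\mathbf{x}) + p(\mathbf{A}\mathbf{x})}{q(\mathbf{x})},
```

where $f$ is differentiable, $g$ is a simple nonconvex nonsmooth function, $h$ is concave and nonsmooth, $p$ is convex and nonsmooth with $\mathbf{A}$ a linear operator, and $q$ is continuous and either weakly convex or has a weakly convex square root (with $q$ assumed positive so the ratio is well defined).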