We propose a new self-adaptive, double-loop smoothing algorithm for solving composite, nonsmooth, and constrained convex optimization problems. Our algorithm is based on Nesterov's smoothing technique via general Bregman distance functions. It self-adaptively selects the number of inner-loop iterations to achieve a desired complexity bound, without requiring the target accuracy to be set a priori as in variants of the augmented Lagrangian method (ALM). We prove an O(1/k) convergence rate on the last iterate of the outer sequence in both the unconstrained and constrained settings, in contrast to the ergodic rates common in the ALM and alternating direction method of multipliers (ADMM) literature. Compared to existing inexact ALM or quadratic penalty methods, our analysis does not rely on worst-case bounds for the subproblem solved by the inner loop. Our algorithm can therefore be viewed either as a restarting technique applied to the ASGARD method of Tran-Dinh et al. (SIAM J Optim 28(1):96-134, 2018), but with rigorous theoretical guarantees, or as an inexact ALM with explicit inner-loop termination rules and adaptive parameters. The algorithm only requires its parameters to be initialized once, and it updates them automatically during the iterations without further tuning. We illustrate the advantages of our methods over the state-of-the-art on several numerical examples.
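As a point of reference, the display below sketches the standard Nesterov smoothing with a Bregman prox-function that this abstract builds on. The symbols here (the linear map A, the conjugate g*, the nonnegative prox-function b with strong-convexity modulus sigma_b, and the smoothness parameter mu) are generic placeholders and need not match the paper's notation.

% Standard Nesterov smoothing of a nonsmooth term g(Ax) via a Bregman prox-function b
% (illustrative notation; b is nonnegative and sigma_b-strongly convex on the dual set Y).
\[
  g_{\mu}(Ax) \;=\; \max_{y \in \mathcal{Y}} \Big\{ \langle Ax, y \rangle - g^{*}(y) - \mu\, b(y) \Big\}.
\]
% The smoothed term has a Lipschitz gradient with constant \|A\|^2/(\mu \sigma_b) in x,
% and it approximates g uniformly:
\[
  0 \;\le\; g(Ax) - g_{\mu}(Ax) \;\le\; \mu \max_{y \in \mathcal{Y}} b(y).
\]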
In this paper we consider the general cone programming problem and propose primal-dual convex (smooth and/or nonsmooth) minimization reformulations for it. We then discuss first-order methods suitable for solving these reformulations, namely, Nesterov's optimal method (Nesterov in Doklady AN SSSR 269:543-547, 1983; Math Program 103:127-152, 2005), Nesterov's smooth approximation scheme (Nesterov in Math Program 103:127-152, 2005), and Nemirovski's prox-method (Nemirovski in SIAM J Optim 15:229-251, 2005), and we propose a variant of Nesterov's optimal method that outperformed the original in our computational experiments. We also derive iteration-complexity bounds for these first-order methods applied to the proposed primal-dual reformulations of the cone programming problem. The performance of these methods is then compared on a set of randomly generated linear programming and semidefinite programming instances. We also compare the approach based on the variant of Nesterov's optimal method with the low-rank method proposed by Burer and Monteiro (Math Program Ser B 95:329-357, 2003; Math Program 103:427-444, 2005) for solving a set of randomly generated SDP instances.
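For context, the display below gives a minimal sketch of the conic primal-dual pair from which such reformulations are typically built. The notation (operator A, cone K and its dual cone K*) is generic rather than the paper's, and the final line only indicates one common way of recasting the optimality conditions as a single convex minimization, not the specific reformulations proposed in the paper.

% Generic conic primal-dual pair (illustrative notation; K is a closed convex cone, K* its dual).
\[
  \text{(P)}\ \ \min_{x}\ \langle c, x \rangle\ \ \text{s.t.}\ \ \mathcal{A}x = b,\ x \in \mathcal{K};
  \qquad
  \text{(D)}\ \ \max_{y,\,s}\ \langle b, y \rangle\ \ \text{s.t.}\ \ \mathcal{A}^{*}y + s = c,\ s \in \mathcal{K}^{*}.
\]
% One common primal-dual reformulation minimizes a convex (possibly nonsmooth) measure of
% primal and dual infeasibility together with the duality gap over the cone constraints, e.g.
\[
  \min_{x \in \mathcal{K},\, s \in \mathcal{K}^{*},\, y}\ \ \|\mathcal{A}x - b\| \;+\; \|\mathcal{A}^{*}y + s - c\| \;+\; \big|\langle c, x \rangle - \langle b, y \rangle\big|.
\]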