The inertial proximal method is extended to minimize the sum of finitely many separable nonconvex and possibly nonsmooth objective functions and a smooth, nonseparable (possibly nonconvex) function. Here, we propose two new algorithms. The first is an inertial proximal coordinate subgradient algorithm, which updates the variables by employing the proximal subgradients of each separable function at the current point. The second is an inertial proximal block coordinate method, which updates the variables by using the subgradients of the separable functions at the partially updated points. Global convergence is guaranteed under the Kurdyka-Łojasiewicz (KŁ) property and some additional mild assumptions. The convergence rate is derived from the Łojasiewicz exponent. Two numerical examples are given to illustrate the effectiveness of the algorithms. (c) 2024 Published by Elsevier Ltd.
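The coordinate update described above can be illustrated with a minimal sketch. This is not the paper's exact scheme: for illustration the separable nonsmooth terms are taken to be ℓ1 penalties (whose proximal map is soft thresholding), and all names, step sizes, and the inertial weight are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * |.| (elementwise soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_prox_coordinate(grad_H, x0, lam=0.1, step=0.01, beta=0.3, iters=300):
    """Illustrative sketch: minimize H(x) + lam * sum_i |x_i| by cycling
    through coordinates, each update combining an inertial extrapolation
    with a proximal (soft-threshold) step on that coordinate."""
    x = x0.copy()
    x_prev = x0.copy()
    for _ in range(iters):
        x_old = x.copy()
        for i in range(len(x)):
            # Inertial extrapolation on coordinate i, then prox of lam*|.|.
            y = x[i] + beta * (x[i] - x_prev[i])
            # Recomputing the full gradient per coordinate is wasteful but
            # keeps the sketch short; a real implementation would update it.
            g = grad_H(x)[i]
            x[i] = soft_threshold(y - step * g, step * lam)
        x_prev = x_old
    return x
```

For a smooth coupling term such as H(x) = 0.5‖Ax − b‖², a step size below 1/‖AᵀA‖ keeps the iteration stable in practice.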
We investigate the convergence of a forward-backward-forward proximal-type algorithm with inertial and memory effects when minimizing the sum of a nonsmooth function with a smooth one in the absence of convexity. The convergence is obtained provided an appropriate regularization of the objective satisfies the Kurdyka-Łojasiewicz inequality, which is for instance fulfilled for semi-algebraic functions.
We propose a forward-backward proximal-type algorithm with inertial/memory effects for minimizing the sum of a nonsmooth function with a smooth one in the nonconvex setting. Every sequence of iterates generated by the algorithm converges to a critical point of the objective function provided an appropriate regularization of the objective satisfies the Kurdyka-Łojasiewicz inequality, which is for instance fulfilled for semi-algebraic functions. We illustrate the theoretical results by considering two numerical experiments: the first concerns the ability to recover local optimal solutions of nonconvex optimization problems, while the second refers to the restoration of a noisy blurred image.
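A minimal sketch of an inertial forward-backward iteration of this type (in the style of iPiano-like methods) follows. The function names, step size, and inertial weight are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def inertial_forward_backward(grad_f, prox_g, x0, step, beta=0.3, iters=300):
    """Sketch of an inertial forward-backward step for minimizing f + g:
    x_{k+1} = prox_{step*g}(x_k - step*grad_f(x_k) + beta*(x_k - x_{k-1})).
    Assumes grad_f is Lipschitz continuous and prox_g(v, t) evaluates the
    proximal map of t*g at v."""
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(iters):
        inertia = beta * (x - x_prev)          # memory term from the last step
        x_next = prox_g(x - step * grad_f(x) + inertia, step)
        x_prev, x = x, x_next
    return x
```

With f(x) = 0.5(x − 2)² and g = |·| (prox = soft thresholding), the iteration settles at the unique minimizer x = 1.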
In this work we propose an accelerated algorithm that combines various techniques, such as inertial proximal algorithms, Tseng's splitting algorithm, and more, for solving the common variational inclusion problem in real Hilbert spaces. We establish a strong convergence theorem for the algorithm under standard and suitable assumptions, and illustrate the applicability and advantages of the new scheme on a signal recovery problem arising in compressed sensing.
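The combination of an inertial step with Tseng's forward-backward-forward splitting for an inclusion 0 ∈ A(x) + B(x) can be sketched as follows. This is a generic finite-dimensional illustration, not the paper's accelerated scheme; parameter names and defaults are assumptions.

```python
import numpy as np

def inertial_tseng_fbf(B, resolvent_A, x0, lam, theta=0.3, iters=500):
    """Sketch of Tseng's forward-backward-forward splitting with an inertial
    extrapolation, for 0 in A(x) + B(x) with B single-valued and Lipschitz.
    resolvent_A evaluates J_{lam*A}; lam should be below 1/Lip(B)."""
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(iters):
        w = x + theta * (x - x_prev)           # inertial extrapolation
        y = resolvent_A(w - lam * B(w))        # forward-backward step
        x_next = y + lam * (B(w) - B(y))       # correcting forward step
        x_prev, x = x, x_next
    return x
```

For A = ∂(μ|·|), the resolvent J_{lam*A} is soft thresholding with threshold lam·μ, which connects this template to the compressed-sensing application mentioned in the abstract.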