We study the applicability of the Peaceman-Rachford (PR) splitting method for solving nonconvex optimization problems. When applied to minimizing the sum of a strongly convex Lipschitz differentiable function and a proper closed function, we show that if the strongly convex function has a sufficiently large strong convexity modulus and the step-size parameter is chosen below a computable threshold, then any cluster point of the generated sequence, if one exists, is a stationary point of the optimization problem. We also give sufficient conditions guaranteeing boundedness of the generated sequence. We then discuss one way to split the objective so that the proposed method can be applied to optimization problems with a coercive objective that is the sum of a (not necessarily strongly) convex Lipschitz differentiable function and a proper closed function; this setting covers a large class of nonconvex feasibility problems and constrained least squares problems. Finally, we illustrate the proposed algorithm numerically.
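To make the splitting structure concrete, here is a minimal Python sketch, not the paper's algorithm: it assumes a toy strongly convex smooth term f(x) = (mu/2)*||x - b||^2 and takes g as the indicator of a cardinality constraint (a proper closed nonconvex function). The ordering of the proximal steps and the step-size gamma are generic placeholders; the paper's computable threshold on gamma is not enforced here.

import numpy as np

def prox_f(v, gamma, b, mu):
    # proximal map of f(x) = (mu/2)*||x - b||^2 (the strongly convex smooth part)
    return (v + gamma * mu * b) / (1.0 + gamma * mu)

def prox_g(v, s):
    # proximal map of the indicator of {x : ||x||_0 <= s} is the projection that
    # keeps the s largest-magnitude entries (a proper closed, nonconvex g)
    x = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-s:]
    x[keep] = v[keep]
    return x

def pr_splitting(b, mu, s, gamma, iters=500):
    z = np.zeros_like(b)
    for _ in range(iters):
        x = prox_f(z, gamma, b, mu)     # prox step on the strongly convex part
        y = prox_g(2.0 * x - z, s)      # prox step on the nonconvex part at the reflected point
        z = z + 2.0 * (y - x)           # full (Peaceman-Rachford) reflection update
    return y

rng = np.random.default_rng(0)
b = rng.standard_normal(20)
print(np.round(pr_splitting(b, mu=5.0, s=3, gamma=0.05), 3))

The full reflection step z + 2*(y - x) is what distinguishes Peaceman-Rachford from Douglas-Rachford, which instead uses z + (y - x).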
In this paper, we propose generalized splitting methods for solving a class of nonconvex optimization problems. The new methods extend the classic Douglas-Rachford and Peaceman-Rachford splitting methods. In some special cases, the admissible range of step-sizes can even be enlarged by a factor of two. The new methods can also be used to solve convex optimization problems. In particular, for convex problems, we propose more relaxed conditions on the step-sizes and other parameters and prove global convergence and iteration complexity without any additional assumptions. Under a strong convexity assumption on the objective function, a linear convergence rate can be derived easily.
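A hedged sketch of the relaxed update that interpolates between the two classic methods, on a toy convex problem min |x| + 0.5*(x - a)^2; the relaxation parameter lam and the instance are illustrative and not taken from the paper.

import numpy as np

def soft(v, t):
    # proximal map of t*|.| (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def generalized_splitting(a, gamma, lam, iters=200):
    # Relaxed splitting for min |x| + 0.5*(x - a)^2:
    # lam = 1 recovers Douglas-Rachford, lam = 2 recovers Peaceman-Rachford,
    # intermediate values give the generalized family.
    z = 0.0
    for _ in range(iters):
        x = soft(z, gamma)                             # prox of gamma*|.|
        y = (2.0 * x - z + gamma * a) / (1.0 + gamma)  # prox of gamma*0.5*(.-a)^2 at 2x - z
        z = z + lam * (y - x)
    return x

for lam in (1.0, 1.5, 2.0):
    print(lam, generalized_splitting(a=3.0, gamma=1.0, lam=lam))
# each run approaches sign(a)*max(|a| - 1, 0) = 2.0, the minimizer of |x| + 0.5*(x - 3)^2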
A regularization term is introduced into the approximate Hessian update in the stochastic Broyden-Fletcher-Goldfarb-Shanno (BFGS) method for convex stochastic optimization problems to help avoid near-singularity issues. Additionally, Nesterov acceleration, with a momentum coefficient that dynamically adjusts between a constant value and zero based on the objective function, has been incorporated to enhance convergence speed. However, the inflexibility of the constant momentum coefficient may still lead to overshooting, and evaluating objective functions on large datasets is computationally costly. Moreover, this approach presents challenges for nonconvex optimization problems. To address these challenges, we propose a regularized stochastic BFGS method that integrates Nesterov acceleration with an adaptive momentum coefficient, designed for solving nonconvex stochastic optimization problems. This coefficient adjusts flexibly between a decreasing value and zero based on selected dataset samples, helping to avoid overshooting and reduce computational cost. We demonstrate almost sure convergence to stationary points and analyze the complexity. Numerical results on convex and nonconvex classification problems using a support vector machine show that our method outperforms existing approaches.
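The following Python sketch only illustrates the ingredients named above on a toy least squares problem; the regularization term (here delta*s added to the gradient difference), the adaptive momentum rule (here a decreasing coefficient switched to zero when the sampled loss increases), the step-size, and the data are generic stand-ins, not the paper's exact scheme.

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((500, 10))
b = A @ rng.standard_normal(10) + 0.1 * rng.standard_normal(500)

def batch_loss(w, idx):
    r = A[idx] @ w - b[idx]
    return 0.5 * np.mean(r * r)

def batch_grad(w, idx):
    return A[idx].T @ (A[idx] @ w - b[idx]) / len(idx)

def reg_stochastic_bfgs(iters=200, batch=32, delta=0.1, alpha=0.05, beta0=0.9):
    n = A.shape[1]
    w = np.zeros(n)
    w_prev = w.copy()
    H = np.eye(n)                                   # approximate inverse Hessian
    for k in range(1, iters + 1):
        idx = rng.choice(len(b), batch, replace=False)
        # adaptive Nesterov momentum: a decreasing coefficient, switched off
        # when the loss on the sampled batch increased
        beta = beta0 / (1.0 + 0.01 * k)
        if batch_loss(w, idx) > batch_loss(w_prev, idx):
            beta = 0.0
        v = w + beta * (w - w_prev)
        g = batch_grad(v, idx)
        w_prev, w = w, v - alpha * (H @ g)
        # regularized BFGS update: delta*s is added to the gradient difference
        # so the curvature pair stays well-conditioned (avoids near-singular H)
        s = w - v
        y = batch_grad(w, idx) - g + delta * s
        sy = s @ y
        if sy > 1e-12:
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
    return w

print(np.round(reg_stochastic_bfgs(), 2))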
The problem of numerically finding a Nash equilibrium in a 3-player polymatrix game is considered. Such a game can be completely described by six matrices, and it turns out to be equivalent to solving a nonconvex optimization problem with a bilinear structure in the objective function. Special methods of local and global search for the optimization problem are proposed and investigated. The results of computationally solving a test game are presented and analyzed.
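As a generic illustration of the bilinear payoff structure (randomly generated matrices, not the paper's reduction), the Python sketch below evaluates the "Nash gap" of a mixed-strategy profile; the gap vanishes exactly at a Nash equilibrium, and minimizing it after rewriting the maxima with auxiliary variables gives an optimization problem with bilinear terms.

import numpy as np

rng = np.random.default_rng(2)
n = 3  # number of pure strategies per player (illustrative)
# the six payoff matrices: A[(i, j)] couples player i's strategy with player j's
A = {(i, j): rng.standard_normal((n, n)) for i in range(3) for j in range(3) if i != j}

def payoff(i, strat):
    # player i's expected payoff: sum_j x_i^T A_ij x_j, bilinear in the strategies
    return sum(strat[i] @ A[(i, j)] @ strat[j] for j in range(3) if j != i)

def nash_gap(strat):
    # total gain available from unilateral deviations to best pure responses;
    # the gap is zero exactly at a Nash equilibrium
    gap = 0.0
    for i in range(3):
        marginal = sum(A[(i, j)] @ strat[j] for j in range(3) if j != i)
        gap += marginal.max() - strat[i] @ marginal
    return gap

uniform = [np.full(n, 1.0 / n) for _ in range(3)]
print([round(payoff(i, uniform), 3) for i in range(3)], round(nash_gap(uniform), 3))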
The linear-linear and quadratic-linear bilevel programming problems are considered. Their optimistic statement is reduced to a nonconvex mathematical programming problem with a bilinear structure. Approximate algorithms for local and global search in the resulting problems are proposed. Results of computationally solving randomly generated test problems are given and analyzed.
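For orientation (generic notation, not the paper's), the optimistic linear-linear bilevel problem and the source of the bilinear structure can be sketched as:

\[
\min_{x \in X,\; y}\;\; \langle c_{1}, x\rangle + \langle c_{2}, y\rangle
\quad\text{s.t.}\quad
y \in \operatorname*{Arg\,min}_{v}\bigl\{\langle d, v\rangle \;:\; Ax + Bv \le b\bigr\}.
\]

Replacing the inner linear program by its optimality (KKT) conditions yields a single-level problem in which the complementarity constraint $\lambda^{\top}(b - Ax - By) = 0$, with dual variables $\lambda \ge 0$, is bilinear in $(x, y, \lambda)$; this is the bilinear structure referred to above.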
Consider the following Bolza problem: minimize $\int h(x,u)\,dt$ subject to $\dot{x} = F(x) + u\,G(x)$, $|u| \le 1$, $x \in \Omega \subset \mathbb{R}^{2}$, $x(0) = x_{0}$, $x(1) = x_{1}$. We show that, under suitable assumptions on $F$, $G$, $h$, all optimal trajectories are bang-bang. The proof relies on a geometrical approach that works on every smooth two-dimensional manifold. As a corollary, we obtain existence results for nonconvex optimization problems.
Quasi-Newton methods are among the most effective first-derivative methods for solving unconstrained optimization problems. The Broyden family plays an important role among quasi-Newton algorithms. However, the convergence of the classical Broyden family method is still not fully understood, whereas for its special case, the BFGS method, there have been abundant results. Yuan et al. (Appl Math Model. 47:811-825, 2017) presented a modified weak Wolfe-Powell line search and obtained the convergence of the BFGS method for general functions under this line search. Motivated by their work, a new modified weak Wolfe-Powell line search technique is proposed for unconstrained problems. We assume that the objective function is nonconvex and establish the global convergence of the restricted Broyden family method. Preliminary numerical results, including classical optimization problems and the Muskingum model, show that the presented algorithm is promising.
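A minimal Python sketch of the restricted Broyden family (phi in [0, 1] interpolating between BFGS and DFP) driven by a standard weak Wolfe bisection line search; the paper's modified weak Wolfe-Powell rule is not reproduced, and the test function and parameters are illustrative.

import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0] ** 2)])

def weak_wolfe(f, g, x, d, c1=1e-4, c2=0.9):
    # bisection search for a step satisfying the weak Wolfe conditions
    lo, hi, t = 0.0, np.inf, 1.0
    fx, gxd = f(x), g(x) @ d
    for _ in range(50):
        if f(x + t * d) > fx + c1 * t * gxd:       # Armijo condition fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif g(x + t * d) @ d < c2 * gxd:          # curvature condition fails: grow
            lo = t
            t = 2.0 * t if hi == np.inf else 0.5 * (lo + hi)
        else:
            break
    return t

def restricted_broyden(f, g, x0, phi=0.2, iters=200, tol=1e-8):
    # phi in [0, 1] interpolates between BFGS (phi = 0) and DFP (phi = 1)
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))
    for _ in range(iters):
        grad = g(x)
        if np.linalg.norm(grad) < tol:
            break
        d = -np.linalg.solve(B, grad)
        t = weak_wolfe(f, g, x, d)
        s = t * d
        y = g(x + s) - grad
        x = x + s
        sBs, ys = s @ B @ s, y @ s
        if ys > 1e-12:                             # keep B positive definite
            w = y / ys - (B @ s) / sBs
            B = (B - np.outer(B @ s, B @ s) / sBs
                 + np.outer(y, y) / ys
                 + phi * sBs * np.outer(w, w))
    return x

# the Rosenbrock minimizer is at (1, 1)
print(np.round(restricted_broyden(rosenbrock, rosenbrock_grad, [-1.2, 1.0]), 3))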
A polynomial optimization problem whose objective function is represented as a sum of positive and even powers of polynomials, called a polynomial least squares problem, is considered. Methods to transform a polynomial least squares problem into polynomial semidefinite programs in order to reduce the degrees of the polynomials are discussed. The computational efficiency of solving the original polynomial least squares problem and the transformed polynomial semidefinite programs is compared. Numerical results on selected polynomial least squares problems show better computational performance of the transformed polynomial semidefinite programs, especially when the degrees of the polynomials are large.
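As a small worked example of the degree-reducing idea (possibly differing in detail from the transformations studied in the paper), minimizing a fourth power of a polynomial $p$ can be rewritten via Schur complements as a polynomial semidefinite program whose matrix entries have degree $\deg p$:

\[
\min_{x}\; p(x)^{4}
\;=\;
\min_{x,\,s,\,t}\; t
\quad\text{s.t.}\quad
\begin{pmatrix} s & p(x) \\ p(x) & 1 \end{pmatrix} \succeq 0,
\qquad
\begin{pmatrix} t & s \\ s & 1 \end{pmatrix} \succeq 0,
\]

where the two $2 \times 2$ constraints are Schur-complement reformulations of $p(x)^{2} \le s$ and $s^{2} \le t$, so the polynomial entries have degree $\deg p$ instead of $4\deg p$.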
A quadratic-linear bilevel programming problem is considered. Its optimistic statement is reduced to a nonconvex mathematical programming problem with a quadratic-bilinear structure. An approximate local search algorithm for the resulting problem is proposed, justified, and tested.
A quadratic-linear bilevel programming problem is considered. Its optimistic statement is reduced to a series of nonconvex single-level problems. An approximate algorithm for global search in the reduced problems is proposed. Numerical solutions of randomly generated test problems are given and analyzed.