In this paper, nonlinear minimax problems with inequality constraints are discussed. Based on the idea of a simple sequential quadratically constrained quadratic programming (SQCQP) algorithm for smooth constrained optimization, an alternative algorithm for solving the discussed problems is proposed. Unlike previous work, at each iteration a feasible direction of descent, called the main search direction, is obtained by solving only one subprogram, which consists of a convex quadratic objective function and simple quadratic inequality constraints and does not require the second derivatives of the constraint functions. Then a high-order correction direction, used to avoid the Maratos effect, is computed by updating the main search direction through a system of linear equations. The proposed algorithm possesses global convergence under the weak Mangasarian-Fromovitz constraint qualification and superlinear convergence under suitable conditions, including upper-level strict complementarity. Finally, some preliminary numerical results are reported.
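As an illustration of the kind of direction-finding subproblem described above (not the paper's exact formulation), the following Python sketch solves one subprogram with a convex quadratic objective and simple quadratic inequality constraints. The specific constraint form with the 0.5*rho*||d||^2 term, the dummy data, and the use of SciPy's SLSQP solver are assumptions made for the sketch.

```python
import numpy as np
from scipy.optimize import minimize

def direction_subproblem(f_vals, f_grads, g_vals, g_grads, H, rho=1.0):
    """Solve  min_{d,z}  z + 0.5 * d^T H d
       s.t.   f_i(x) + grad_f_i(x)^T d - z <= 0                       (objective pieces)
              g_j(x) + grad_g_j(x)^T d + 0.5 * rho * ||d||^2 <= 0     (assumed 'simple quadratic' form)
    No second derivatives of the constraint functions are used."""
    n = H.shape[0]

    def obj(v):                       # convex quadratic objective in (d, z)
        d = v[:n]
        return v[n] + 0.5 * d @ H @ d

    cons = []
    for fi, gfi in zip(f_vals, f_grads):
        cons.append({"type": "ineq",
                     "fun": lambda v, fi=fi, gfi=gfi: -(fi + gfi @ v[:n] - v[n])})
    for gj, ggj in zip(g_vals, g_grads):
        cons.append({"type": "ineq",
                     "fun": lambda v, gj=gj, ggj=ggj:
                         -(gj + ggj @ v[:n] + 0.5 * rho * (v[:n] @ v[:n]))})

    res = minimize(obj, np.zeros(n + 1), constraints=cons, method="SLSQP")
    return res.x[:n], res.x[n]        # main search direction d and auxiliary variable z

# Tiny hypothetical example in R^2: two objective pieces, one constraint, H = I.
d, z = direction_subproblem(f_vals=[1.0, 0.5],
                            f_grads=[np.array([1.0, 0.0]), np.array([0.0, 1.0])],
                            g_vals=[-0.5],
                            g_grads=[np.array([1.0, 1.0])],
                            H=np.eye(2))
print(d, z)
```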
In this paper, we design a low-cost feasible projection algorithm for variational inequalities by replacing the projection onto the feasible set with the projection onto a ball. In each iteration, the algorithm only needs to evaluate the mapping once and compute the projection onto a ball contained in the feasible set (which has an explicit expression), so it is both easy to implement and feasible. The convergence of the algorithm is proved when the Slater condition holds for the feasible set and the mapping is pseudomonotone and Lipschitz continuous. Finally, some numerical examples are given to illustrate the effectiveness of the algorithm.
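For reference, the explicit expression of the projection onto a ball mentioned in the abstract is easy to state; the short Python sketch below implements it (the function name and the example data are illustrative).

```python
import numpy as np

def project_onto_ball(x, center, radius):
    """Euclidean projection of x onto the ball {y : ||y - center|| <= radius}.
    If x lies inside the ball, it is its own projection; otherwise the projection
    is the point on the sphere along the ray from the center towards x."""
    diff = x - center
    dist = np.linalg.norm(diff)
    if dist <= radius:
        return x
    return center + (radius / dist) * diff

# Example: project (3, 4) onto the unit ball centered at the origin -> (0.6, 0.8).
print(project_onto_ball(np.array([3.0, 4.0]), np.zeros(2), 1.0))
```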
The projection is often used in solving variational inequalities. When the projection onto the feasible set is not easy to calculate, projection algorithms are replaced by relaxed projection algorithms. However, these relaxed projection algorithms are not feasible, and to ensure their convergence one must assume not only basic conditions, such as the Slater condition for the feasible set and pseudomonotonicity and Lipschitz continuity of the mapping, but also additional conditions that impose a relationship between the mapping and the feasible set. In this paper, by replacing the projection onto the feasible set with the projection onto a ball (which changes from iteration to iteration) contained in the feasible set, a new feasible moving ball projection algorithm for pseudomonotone variational inequalities is obtained. Since the projection onto a ball has an explicit expression, the algorithm is easy to implement. At the same time, all the balls are contained in the feasible set, so the iterates generated by the algorithm remain in the feasible set, which ensures its feasibility. The convergence of the algorithm is proved when the Slater condition holds for the feasible set and the mapping is pseudomonotone and Lipschitz continuous. The fundamental difference between this moving ball projection algorithm and the previous relaxed projection algorithms lies in the fact that the previous relaxed projection algorithms project onto a half-space containing the feasible set, whereas the moving ball projection algorithm projects onto a ball contained in the feasible set. In particular, this algorithm does not need to assume any additional conditions relating the mapping and the feasible set. Finally, some numerical examples are given to illustrate the effectiveness of the algorithm.
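The following Python sketch illustrates the general idea of replacing the feasible-set projection with a projection onto a ball contained in the feasible set, so that all iterates stay feasible. The extragradient-style update and the placeholder ball construction make_ball are assumptions made for illustration, not the paper's specific moving ball algorithm.

```python
import numpy as np

def project_onto_ball(x, center, radius):
    diff = x - center
    dist = np.linalg.norm(diff)
    return x if dist <= radius else center + (radius / dist) * diff

def moving_ball_step(x, F, make_ball, step=0.1):
    """One feasible projection step: build a ball B_k contained in the feasible set
    at the current iterate, then perform an extragradient-type update in which every
    projection uses the explicit ball-projection formula."""
    center, radius = make_ball(x)                              # ball B_k inside the feasible set
    y = project_onto_ball(x - step * F(x), center, radius)     # trial point
    return project_onto_ball(x - step * F(y), center, radius)  # next (feasible) iterate

# Hypothetical example: F(x) = x (monotone), feasible set = unit ball,
# make_ball simply returns that ball; the VI solution is x* = 0.
F = lambda x: x
make_ball = lambda x: (np.zeros(2), 1.0)
x = np.array([0.9, 0.0])
for _ in range(50):
    x = moving_ball_step(x, F, make_ball)
print(x)   # approaches the solution x* = 0
```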
An algorithm for smooth nonlinear constrained optimization problems is described, in which a sequence of feasible iterates is generated by solving a trust-region sequential quadratic programming (SQP) subproblem at each iteration and by perturbing the resulting step to retain feasibility of each iterate. By retaining feasibility, the algorithm avoids several complications of other trust-region SQP approaches: the objective function can be used as a merit function, and the SQP subproblems are feasible for all choices of the trust-region radius. Global convergence properties are analyzed under various assumptions on the approximate Hessian. Under additional assumptions, superlinear convergence to points satisfying second-order sufficient conditions is proved.
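A generic sketch of the trust-region acceptance logic alluded to above, with the raw objective f serving as the merit function since every trial step preserves feasibility. The function names, thresholds, and radius-update rule are illustrative assumptions rather than the paper's exact rules.

```python
import numpy as np

def trust_region_update(f, x, step, predicted_reduction, radius,
                        eta1=0.1, eta2=0.75, shrink=0.5, grow=2.0):
    """Accept or reject a feasibility-preserving trial step and update the radius,
    comparing actual and predicted reduction of the objective (the merit function)."""
    actual_reduction = f(x) - f(x + step)
    rho = actual_reduction / max(predicted_reduction, 1e-16)
    if rho < eta1:                       # poor agreement: reject the step, shrink the radius
        return x, shrink * radius, False
    x_new = x + step                     # sufficient agreement: accept the step
    if rho > eta2 and np.linalg.norm(step) >= 0.9 * radius:
        radius = grow * radius           # very good agreement on a near-boundary step: expand
    return x_new, radius, True
```

Because the iterates remain feasible, no penalty parameter or filter is needed in this acceptance test: the objective value alone decides whether a step is taken.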
The gradient sampling (GS) algorithm for minimizing a nonconvex, nonsmooth function was proposed by Burke et al. (SIAM J Optim 15:751-779, 2005); its most interesting feature is the use of randomly sampled gradients instead of subgradients. In this paper, combining the GS technique with the sequential quadratic programming (SQP) method, we present a feasible SQP-GS algorithm that extends the GS algorithm to nonconvex, nonsmooth constrained optimization. The proposed algorithm generates a sequence of feasible iterates and guarantees that the objective function is monotonically decreasing. Global convergence is proved in the sense that, with probability one, every cluster point of the iterative sequence is stationary for the improvement function. Finally, some preliminary numerical results show that the proposed algorithm is effective.
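A minimal Python sketch of the core gradient-sampling step: gradients are evaluated at randomly sampled points near the current iterate, and a search direction is taken as the negative of the minimum-norm element of their convex hull, found here by a small QP over simplex weights. The sampling scheme (a box instead of a ball, for brevity), the solver choice, and the example function are simplifications and assumptions, not the authors' SQP-GS implementation.

```python
import numpy as np
from scipy.optimize import minimize

def gs_direction(grad_f, x, eps=0.1, m=10, rng=np.random.default_rng(0)):
    """Gradient-sampling direction: min-norm vector in the convex hull of
    gradients sampled at points near x (used in place of subgradients)."""
    n = len(x)
    # sample m points near x (a box of radius eps here for brevity)
    pts = [x] + [x + eps * rng.uniform(-1.0, 1.0, size=n) for _ in range(m)]
    G = np.array([grad_f(p) for p in pts])            # sampled gradients, one per row

    def norm_sq(w):                                   # || G^T w ||^2 over simplex weights w
        v = G.T @ w
        return v @ v

    k = len(pts)
    res = minimize(norm_sq, np.full(k, 1.0 / k), method="SLSQP",
                   bounds=[(0.0, 1.0)] * k,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return -(G.T @ res.x)                             # descent direction

# Hypothetical example: f(x) = |x1| + |x2| (nonsmooth at 0), gradient away from the kinks.
grad_f = lambda p: np.sign(p)
print(gs_direction(grad_f, np.array([0.3, -0.2])))
```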