We apply a modified subgradient algorithm (MSG) to the dual of a nonlinear and nonconvex optimization problem. The dual scheme we consider uses the sharp augmented Lagrangian. A desirable feature of this method is primal convergence, which means that every accumulation point of the primal sequence (generated automatically during the process) is a primal solution. This feature does not hold in general for the available variants of MSG. We propose two new variants of MSG that enjoy both primal and dual convergence, as long as the dual optimal set is nonempty. These variants use a very simple choice of stepsizes. Moreover, we also establish primal convergence when the dual optimal set is empty. Finally, our second variant of MSG converges in a finite number of steps.
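A minimal sketch of one MSG-style pass on the sharp augmented Lagrangian L(x,u,c) = f(x) + c||h(x)|| - <u, h(x)> is given below. The toy problem, the Nelder-Mead inner solver, and the diminishing stepsize are illustrative assumptions, not the exact variants analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):                                   # nonconvex toy objective (assumed)
    return (x[0]**2 - 1.0)**2 + x[1]**2

def h(x):                                   # equality constraint h(x) = 0 (assumed)
    return np.array([x[0] + x[1] - 0.5])

def sharp_lagrangian(x, u, c):
    hx = h(x)
    return f(x) + c * np.linalg.norm(hx) - u.dot(hx)

u, c = np.zeros(1), 1.0
x = np.array([2.0, 2.0])                    # primal sequence starts here
for k in range(50):
    # inner step: (approximately) minimize the sharp augmented Lagrangian in x
    x = minimize(lambda z: sharp_lagrangian(z, u, c), x,
                 method="Nelder-Mead",
                 options={"xatol": 1e-10, "fatol": 1e-10}).x
    hx = h(x)
    if np.linalg.norm(hx) < 1e-6:           # primal point is feasible: stop
        break
    s = 1.0 / (k + 1)                       # simple diminishing stepsize (assumed rule)
    u = u - s * hx                          # dual step along the subgradient -h(x)
    c = c + 2.0 * s * np.linalg.norm(hx)    # strictly increase the penalty parameter
print("primal point:", x, " violation:", np.linalg.norm(h(x)))
```

Each iterate x is an element of the primal sequence whose accumulation points the paper's convergence results concern.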
In this article, we continue to study the modified subgradient (MSG) algorithm previously suggested by Gasimov for solving sharp augmented Lagrangian dual problems. The most important features of this algorithm are that it guarantees a global optimum for a wide class of non-convex optimization problems, generates a strictly increasing sequence of dual values (a property not shared by other subgradient methods), and guarantees convergence. The main drawbacks of the MSG algorithm, which are typical of many subgradient algorithms, are that it uses an unconstrained global minimum of the augmented Lagrangian function and requires an approximate upper bound on the optimal value of the initial problem to update the stepsize parameters. In this study we introduce a new algorithm based on so-called feasible values and give convergence theorems. The new algorithm does not require the optimal value to be known in advance; it seeks this value iteratively, starting from an arbitrary number. It is also not necessary to find a global minimum of the augmented Lagrangian to update the stepsize parameters. A collection of test problems is used to demonstrate the performance of the new algorithm.
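The sketch below illustrates the idea of a Gasimov-style stepsize driven by a trial ("feasible") value in place of a known upper bound on the optimal value. The adjustment rule and the constants are simplified assumptions, not the paper's exact test.

```python
import numpy as np

def feasible_value_stepsize(q_k, h_k, V, alpha=0.5):
    """Gasimov-style stepsize using a trial value V instead of a known
    upper bound.  If the current dual value q_k already exceeds V, the
    trial value was too low and is raised (assumed adjustment rule)."""
    if q_k >= V:
        V = q_k + 1.0
    s = alpha * (V - q_k) / float(np.dot(h_k, h_k))
    return s, V

# usage: dual value 3.2, constraint residual h(x_k) = (0.4, -0.1), trial value 5.0
s, V = feasible_value_stepsize(3.2, np.array([0.4, -0.1]), 5.0)
print(s, V)   # -> roughly 5.29 and the unchanged trial value 5.0
```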
In the literature, economic dispatch problems are generally categorized as either convex or nonconvex optimization problems. In this study, a solution is proposed for the economic dispatch problem with valve-point effect, which is a nonconvex optimization problem. The hybrid approach used to solve this problem combines the modified subgradient (MSG) and harmony search (HS) algorithms. This approach (MSG-HS) is applied to three lossy test systems (the three-machine 6-bus, IEEE 5-machine 14-bus, and IEEE 6-machine 30-bus systems) that have been solved with different methods in the literature. System losses are calculated using the B loss matrix. The resulting optimal solution values are compared with those reported in the literature and the results are discussed.
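The sketch below shows the two model ingredients such a hybrid works with: the nonconvex valve-point fuel cost and the B-matrix transmission loss. The 3-unit data are made up for illustration and are not the paper's test systems; the B0/B00 loss terms are omitted for brevity.

```python
import numpy as np

a    = np.array([0.008, 0.009, 0.007])      # $/MW^2 (assumed data)
b    = np.array([7.0,   6.3,   6.8])        # $/MW
c    = np.array([200.0, 180.0, 140.0])      # $
e    = np.array([300.0, 200.0, 150.0])      # valve-point amplitude
fv   = np.array([0.035, 0.042, 0.063])      # valve-point frequency (1/MW)
Pmin = np.array([100.0, 50.0,  80.0])       # MW
B = 1e-4 * np.array([[0.218, 0.093, 0.028],
                     [0.093, 0.228, 0.017],
                     [0.028, 0.017, 0.179]])  # simplified B loss matrix

def fuel_cost(P):
    # quadratic cost plus the rectified-sine valve-point term
    return np.sum(a*P**2 + b*P + c + np.abs(e * np.sin(fv * (Pmin - P))))

def power_balance(P, demand):
    loss = P @ B @ P                         # transmission loss P^T B P
    return np.sum(P) - demand - loss         # equality constraint residual

P = np.array([300.0, 250.0, 200.0])          # candidate dispatch vector
print(fuel_cost(P), power_balance(P, demand=700.0))
```

In a hybrid of this kind, the nonconvex cost is minimized heuristically while dual updates drive the power-balance residual toward zero; the exact division of labor in MSG-HS is not detailed in the abstract.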
In this paper, a novel sharp augmented Lagrangian-based global optimization method is developed for solving constrained non-convex optimization problems. The algorithm consists of outer and inner loops. At each inner iteration, the discrete gradient method is applied to minimize the sharp augmented Lagrangian function. Depending on the solution found, the algorithm either stops, updates the dual variables in the inner loop, or updates the upper or lower bounds by moving to the outer loop. Convergence results for the proposed method are presented. The performance of the method is demonstrated using a wide range of nonlinear smooth and non-smooth constrained optimization test problems from the literature.
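A structural skeleton of such an outer/inner scheme is sketched below under simplifying assumptions: the discrete gradient method is replaced by Nelder-Mead, the problem is a 1-D toy, and the bound update is a plain bisection test on a trial optimal value rather than the paper's exact criteria.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):  return np.cos(x[0]) + 0.1 * x[0]**2        # nonconvex toy objective
def h(x):  return np.array([x[0] - 1.0])               # equality constraint

def L(x, u, c):
    hx = h(x)
    return f(x) + c * np.linalg.norm(hx) - u.dot(hx)

lb, ub = -10.0, 10.0                  # assumed initial bounds on the optimal value
x = np.array([3.0])
while ub - lb > 1e-2:                 # outer loop: shrink the value bracket
    V = 0.5 * (lb + ub)               # trial optimal value
    u, c = np.zeros(1), 0.1
    feasible = False
    for _ in range(30):               # inner loop: MSG-style dual updates
        x = minimize(lambda z: L(z, u, c), x, method="Nelder-Mead",
                     options={"xatol": 1e-9, "fatol": 1e-9}).x
        hx, q = h(x), L(x, u, c)
        if np.linalg.norm(hx) < 1e-6:
            feasible = True
            break
        if q > V:                     # dual value passed the trial value
            break
        s = 0.5 * (V - q) / max(hx.dot(hx), 1e-8)   # safeguarded stepsize
        u = u - s * hx
        c = c + 1.5 * s * np.linalg.norm(hx)
    if feasible and f(x) <= V + 1e-6:
        ub = f(x)                     # a feasible point certifies an upper bound
    else:
        lb = V                        # simplified test: treat V as a lower bound
print("optimal value bracket:", (lb, ub), " x =", x)
```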
The real structured singular value (RSSV, or real mu) is a useful measure for analyzing the robustness of linear systems subject to structured real parametric uncertainty, and a valuable design tool for control systems engineers. We formulate the RSSV problem as a nonlinear programming problem and use a new computation technique, the F-modified subgradient (F-MSG) algorithm, to compute its lower bound. The F-MSG algorithm can handle a large class of nonconvex optimization problems and requires no differentiability. RSSV computation is a well-known NP-hard problem. Several approaches provide lower and upper bounds for the RSSV, but with the existing approaches the gap between these bounds is large for many problems, which significantly reduces the benefit of using the RSSV. Although the F-MSG algorithm aims to solve nonconvex programming problems exactly, its performance depends on the quality of the standard solvers used for the subproblems arising at each iteration. Even when it does not find the optimal solution, it produces, in practice, a very tight lower bound. Since the RSSV problem can be discontinuous, the F-MSG algorithm is well suited to it. We also provide examples demonstrating the validity of our approach.
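For a real diagonal uncertainty structure, the lower-bound problem can be posed as minimizing max_i |delta_i| subject to det(I - M diag(delta)) = 0; any feasible delta certifies real mu >= 1/max_i |delta_i|, and this constrained nonconvex program is the kind of formulation the F-MSG algorithm is applied to. The sketch below only checks such a certificate; the matrix M and the candidate delta are hand-picked for illustration, and the actual uncertainty structure may also contain repeated or full blocks.

```python
import numpy as np

def certified_lower_bound(M, delta, tol=1e-8):
    """Return the lower bound on the real mu certified by delta, or 0.0 if
    delta does not make I - M*diag(delta) singular."""
    n = len(delta)
    if abs(np.linalg.det(np.eye(n) - M @ np.diag(delta))) < tol:
        return 1.0 / np.max(np.abs(delta))
    return 0.0

M = np.array([[0.0,  2.0],
              [-1.5, 0.5]])
print(certified_lower_bound(M, np.array([-0.5, 0.5])))   # prints 2.0 for this M
```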