In this paper, we discuss the convergence of the Broyden algorithms with a revised search direction. Under suitable inexact line searches, we prove that the algorithms are globally convergent for continuously differentiable functions and that their local rate of convergence is one-step superlinear and n-step second-order for uniformly convex objective functions.
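
A minimal sketch of the setting (not the paper's revised search direction): the BFGS update, one member of the Broyden family, combined with an inexact backtracking (Armijo) line search; the test function and all names are illustrative.

```python
import numpy as np

def armijo(f, x, d, g, beta=0.5, sigma=1e-4):
    """Inexact line search: accept the first step with sufficient decrease."""
    t = 1.0
    while f(x + t * d) > f(x) + sigma * t * (g @ d):
        t *= beta
    return t

def bfgs(f, grad, x, tol=1e-8, max_iter=200):
    n = x.size
    H = np.eye(n)                      # inverse Hessian approximation
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        t = armijo(f, x, d, g)
        s = t * d
        y = grad(x + s) - g
        rho = 1.0 / (y @ s)            # curvature condition y^T s > 0 assumed
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS update of H = B^{-1}
        x = x + s
    return x

# Example: a uniformly convex objective (quadratic plus a quartic term).
f = lambda x: 0.5 * x @ x + 0.25 * x[0] ** 4
grad = lambda x: x + np.array([x[0] ** 3, 0.0])
print(bfgs(f, grad, np.array([2.0, -3.0])))    # -> approx [0, 0]
```
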
This paper introduces an algorithm for convex minimization which includes quasi-Newton updates within a proximal point algorithm that depends on a preconditioned bundle subalgorithm. The method uses the Hessian of a certain outer function which depends on the Jacobian of a proximal point mapping which, in turn, depends on the preconditioner matrix and on a Lagrangian Hessian relative to a certain tangent space. Convergence is proved under boundedness assumptions on the preconditioner sequence.
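
As a rough illustration of the outer iteration only, the following sketch runs a bare proximal point method, omitting the paper's quasi-Newton updates and preconditioned bundle subalgorithm; the inner solver and the test function are placeholders.

```python
import numpy as np
from scipy.optimize import minimize

def proximal_point(f, x0, lam=1.0, iters=30):
    """Bare proximal point iteration: x_{k+1} = argmin_y f(y) + ||y - x_k||^2 / (2*lam)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        moreau = lambda y, x=x: f(y) + np.sum((y - x) ** 2) / (2.0 * lam)
        x = minimize(moreau, x, method="Nelder-Mead").x   # inner subproblem
    return x

# Example: a nonsmooth convex function; the prox iterates still converge.
f = lambda y: abs(y[0] - 1.0) + 0.5 * y[1] ** 2
print(proximal_point(f, [5.0, 5.0]))   # -> approx [1, 0]
```
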
In this paper we discuss the convergence of the Broyden algorithms with a revised search direction. We prove that the algorithms are globally convergent for continuously differentiable functions and that their rate of convergence is one-step superlinear and n-step second-order for uniformly convex objective functions.

Many iterative algorithms for optimization calculations form positive definite second derivative approximations, B say, automatically, but B is not stored explicitly because of the need to solve equations of the form Bd = -g. We consider working with matrices Z, whose columns satisfy the conjugacy conditions Z^T BZ = I. Particular attention is given to updating Z in a way that corresponds to revising B by the BFGS formula. A procedure is proposed that seems to be much more stable than the direct use of a product formula [1]. An extension to this procedure provides some automatic rescaling of the columns of Z, which avoids some inefficiencies due to a poor choice of the initial second derivative approximation. Our work is also relevant to active set methods for linear inequality constraints, to updating the Cholesky factorization of B, and to explaining some properties of the BFGS algorithm.
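
The identity behind storing Z instead of B can be checked numerically: if Z^T BZ = I then B^{-1} = ZZ^T, so Bd = -g is solved by two matrix-vector products. The sketch below builds one such Z from an inverse Cholesky factor, which is merely a convenient construction, not the update procedure proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
B = A @ A.T + 5 * np.eye(5)            # a positive definite "Hessian" B
g = rng.standard_normal(5)

L = np.linalg.cholesky(B)              # B = L L^T
Z = np.linalg.inv(L).T                 # then Z^T B Z = I
print(np.allclose(Z.T @ B @ Z, np.eye(5)))                 # conjugacy holds
print(np.allclose(Z @ (Z.T @ g), np.linalg.solve(B, g)))   # Z Z^T = B^{-1}
```
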
This is a method for numerically determining local minima of differentiable functions of several variables. In the process of locating each minimum, a matrix which characterizes the behavior of the function about the minimum is determined. For a region in which the function depends quadratically on the variables, no more than N iterations are required, where N is the number of variables. By suitable choice of starting values, and without modification of the procedure, linear constraints can be imposed upon the variables.
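
The quadratic-termination property can be illustrated with a DFP-style variable-metric iteration (one realization of this kind of method, not necessarily the paper's exact procedure): with exact line searches on a quadratic in N variables, the minimum is reached in at most N iterations and the accumulated matrix converges to the inverse Hessian.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 6
M = rng.standard_normal((N, N))
A = M @ M.T + N * np.eye(N)            # f(x) = 0.5 x^T A x - b^T x, Hessian A
b = rng.standard_normal(N)

x, H = np.zeros(N), np.eye(N)
for _ in range(N):
    g = A @ x - b                      # gradient of the quadratic
    if np.linalg.norm(g) < 1e-12:
        break
    d = -H @ g
    alpha = -(g @ d) / (d @ A @ d)     # exact line search along d
    s = alpha * d
    y = A @ s                          # gradient difference for a quadratic
    Hy = H @ y
    H += np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)  # DFP update
    x = x + s

print(np.linalg.norm(A @ x - b))         # ~0: terminated within N iterations
print(np.allclose(H, np.linalg.inv(A)))  # H has become the inverse Hessian
```
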

Many iterative algorithms for optimization calculations use a second derivative approximation, B say, in order to calculate the search direction d = -B^{-1}∇f(x). In order to avoid inverting B we work with matrices Z, whose columns satisfy the conjugacy relations Z^T BZ = I. We present an update of Z that is compatible with members of the Broyden family that generate positive definite second derivative approximations. The algorithm requires only 3n^2 + O(n) flops for the update of Z and the calculation of d. The columns of the resultant Z matrices have interesting conjugacy and orthogonality properties with respect to previous second derivative approximations and function gradients, respectively. The update also provides a simple proof of Dixon's theorem. For the BFGS method we adapt the algorithm in order to obtain a null space method for linearly constrained calculations.
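
A small sketch of the cost argument only (not the 3n^2-flop update itself): once a Z with Z^T BZ = I is at hand, the direction d = -Z(Z^T ∇f(x)) costs roughly 2n^2 multiplications, whereas solving Bd = -g from scratch costs O(n^3). The construction of Z here is again illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
A = rng.standard_normal((n, n))
B = A @ A.T + n * np.eye(n)            # positive definite B
L = np.linalg.cholesky(B)
Z = np.linalg.inv(L).T                 # any Z with Z^T B Z = I serves here
g = rng.standard_normal(n)

d_fast = -Z @ (Z.T @ g)                # two matrix-vector products, ~2n^2 flops
d_ref = np.linalg.solve(B, -g)         # dense factorization, O(n^3) flops
print(np.allclose(d_fast, d_ref))
```
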
This paper reviews some of the most successful methods for unconstrained, constrained and nondifferentiable optimization calculations. Particular attention is given to the contribution that theoretical analysis has made to the development of algorithms. It seems that practical considerations provide the main new ideas, and that subsequent theoretical studies give improvements to algorithms, coherence to the subject, and better understanding.

Let B be a positive definite symmetric approximation to the second derivative matrix of the objective function of a minimization calculation. We study modifications of the BFGS method that apply a scaling technique to the columns of a conjugate direction matrix Z satisfying Z^T BZ = I. For a simple scaling technique similar to the ones considered by Powell (1987, 1989), we show that, due to a two-iteration cycle, linear convergence can occur when the method is applied to a quadratic function and Wolfe's line search is employed, although quadratic termination can be proved for exact line searches. We then suggest a different scaling technique that prevents certain columns, thought to contain important curvature information, from being scaled. For this algorithm we prove global and superlinear convergence and demonstrate the superiority of our method over the BFGS formula on a range of ill-conditioned optimization problems. Moreover, we present an implementation of our algorithm that requires only 3n^2 + O(n) multiplications per iteration.
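
The mechanism that column scaling exploits can be seen directly: since B^{-1} = ZZ^T is a sum of rank-one terms z_i z_i^T, multiplying column j of Z by a factor theta multiplies that term by theta^2, i.e. it rescales the curvature estimate along z_j. A hypothetical numerical check:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
B = A @ A.T + 4 * np.eye(4)
Z = np.linalg.inv(np.linalg.cholesky(B)).T   # Z^T B Z = I

theta, j = 2.0, 1
Zs = Z.copy()
Zs[:, j] *= theta                            # scale one column of Z
H, Hs = Z @ Z.T, Zs @ Zs.T
# The change in the implied inverse Hessian is exactly (theta^2 - 1) z_j z_j^T:
print(np.allclose(Hs - H, (theta ** 2 - 1) * np.outer(Z[:, j], Z[:, j])))
```
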
We consider a dynamical system approach to solve finite-dimensional smooth optimization problems with a compact and connected feasible set. In fact, by the well-known technique of converting inequality constraints into equality constraints using quadratic slack variables, we transform a general optimization problem into an associated problem without inequality constraints in a higher-dimensional space. We compute the projected gradient for the latter problem and consider its projection on the feasible set in the original, lower-dimensional space. In this way, we obtain an ordinary differential equation in the original variables, which is specially adapted to treat inequality constraints (for the idea, see Jongen and Stein, Frontiers in Global Optimization, pp. 223-236, Kluwer Academic, Dordrecht, 2003). The article shows that the derived ordinary differential equation possesses the basic properties which make it appropriate to solve the underlying optimization problem: the longtime behavior of its trajectories becomes stationary, all singularities are critical points, and the stable singularities are exactly the local minima. Finally, we sketch two numerical methods based on our approach.
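
A minimal sketch of the construction under simplifying assumptions (a single inequality constraint and forward Euler integration; not the paper's exact ODE or its numerical methods): the constraint 1 - ||x||^2 >= 0 becomes the equality h(x, s) = 1 - ||x||^2 - s^2 = 0 via a quadratic slack s, and the gradient flow is projected onto the tangent space of {h = 0}.

```python
import numpy as np

def grad_f(x):                          # f(x) = (x1 - 2)^2 + (x2 - 1)^2
    return 2.0 * (x - np.array([2.0, 1.0]))

z = np.array([0.0, 0.0, 1.0])           # z = (x1, x2, s), feasible start
eta = 0.005                             # Euler step size
for _ in range(4000):
    x, s = z[:2], z[2]
    J = np.array([-2.0 * x[0], -2.0 * x[1], -2.0 * s])   # gradient of h
    G = np.concatenate([grad_f(x), [0.0]])               # lifted gradient of f
    P_G = G - J * (J @ G) / (J @ J)     # projection onto tangent space of {h = 0}
    z = z - eta * P_G                   # Euler step of the projected flow
    z[2] = np.sqrt(max(1.0 - z[0] ** 2 - z[1] ** 2, 0.0))  # restore h = 0 (Euler drift)

print(z[:2])   # -> approx (2, 1)/sqrt(5), the minimum on the disk boundary
```
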