This paper presents a new method that combines an optimization approach with the kernel least mean square (KLMS) algorithm for solving ordinary differential equations (ODEs). Compared with existing methods, such as classical numerical schemes and neural-network-based approaches, the new approach offers simpler implementation, faster convergence, and smaller error. The predictive ability of KLMS is exploited by applying an optimization method to approximate the solution of the ODE. The basic idea is first to write a trial solution of the ODE using the KLMS structure, then to define an error function and minimize it with an optimization algorithm (here, the quasi-Newton BFGS method) so that the KLMS parameters are adjusted until the trial solution satisfies the differential equation. After the optimization step, the optimal KLMS parameters are substituted back into the trial solution. The accuracy of the method is illustrated by solving several problems.
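As a rough, hedged sketch of this pipeline rather than the authors' implementation, the snippet below writes a kernel-expansion trial solution for the simple test problem y' = -y, y(0) = 1, defines a collocation error for the ODE, and minimizes it with SciPy's BFGS; the Gaussian kernel, its width, the centers, and the collocation grid are all assumptions made only for this example.

```python
# Hedged sketch: kernel-expansion trial solution for y' = -y, y(0) = 1,
# fitted by minimizing a collocation residual with BFGS (not the paper's exact setup).
import numpy as np
from scipy.optimize import minimize

centers = np.linspace(0.0, 2.0, 15)      # kernel centers (assumed)
sigma = 0.3                               # Gaussian kernel width (assumed)
x_col = np.linspace(0.0, 2.0, 40)         # collocation points (assumed)

def kernel(x, c):
    return np.exp(-(x[:, None] - c[None, :]) ** 2 / (2 * sigma ** 2))

def kernel_dx(x, c):
    return -(x[:, None] - c[None, :]) / sigma ** 2 * kernel(x, c)

def trial(alpha, x):
    # Trial solution y(x) = y0 + x * sum_i alpha_i k(x, c_i) satisfies y(0) = 1 by construction.
    return 1.0 + x * (kernel(x, centers) @ alpha)

def trial_dx(alpha, x):
    return kernel(x, centers) @ alpha + x * (kernel_dx(x, centers) @ alpha)

def error(alpha):
    # Squared residual of the ODE y' + y = 0 at the collocation points.
    r = trial_dx(alpha, x_col) + trial(alpha, x_col)
    return np.sum(r ** 2)

res = minimize(error, np.zeros(len(centers)), method="BFGS")
y_hat = trial(res.x, x_col)
print("max abs error vs exp(-x):", np.max(np.abs(y_hat - np.exp(-x_col))))
```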
To overcome the low calculation precision and slow convergence of the traditional BP neural network algorithm, the BFGS method, a nonlinear optimization method for unconstrained extremum problems, is introduced into the BP neural network algorithm, and a BFGS-BP neural network model is developed. The model performs well in processing and forecasting tunnel deformation monitoring data, which are uncertain and nonlinear. Training and forecasting experiments with the BFGS-BP model were carried out on vault crown settlement observations from a tunnel construction project. The results show that the BFGS-BP model achieves higher calculation precision and a faster convergence rate than the traditional algorithm.
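A minimal sketch of the general idea, assuming nothing about the paper's actual network or data: a one-hidden-layer (BP-style) network is fitted to a synthetic settlement-like curve by handing the squared-error loss to BFGS, instead of training it with plain gradient-descent backpropagation.

```python
# Hedged sketch: train a small one-hidden-layer (BP-style) network with BFGS
# instead of plain gradient descent; data and architecture are illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 60)[:, None]                                  # normalized time
y = 1.0 - np.exp(-3.0 * t) + 0.01 * rng.standard_normal(t.shape)    # toy settlement curve

n_hidden = 8
sizes = [(n_hidden, 1), (n_hidden,), (1, n_hidden), (1,)]           # W1, b1, W2, b2

def unpack(theta):
    parts, i = [], 0
    for s in sizes:
        n = int(np.prod(s))
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts

def forward(theta, x):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(x @ W1.T + b1)       # hidden layer
    return h @ W2.T + b2             # linear output layer

def loss(theta):
    return 0.5 * np.mean((forward(theta, t) - y) ** 2)

theta0 = 0.1 * rng.standard_normal(sum(int(np.prod(s)) for s in sizes))
res = minimize(loss, theta0, method="BFGS")   # BFGS in place of plain backprop descent
print("final MSE:", 2 * res.fun)
```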
A modified BFGS algorithm is presented for unconstrained optimization of convex functions whose Hessian matrix is rank deficient at the minimum point. The main idea of the algorithm is first to add a modification term to the convex function to obtain an equivalent model, and then to simplify the model to derive the modified BFGS algorithm. The superlinear convergence of the algorithm is proved. Compared with the tensor algorithms of R. B. Schnabel (see [4], [5]), this method is more efficient for solving singular unconstrained optimization problems in terms of computational cost and complexity.
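The abstract does not state the modification term explicitly; one plausible reading, given here only as an assumption and not as the paper's definition, is a quadratic regularization of the convex function, whose effect on the singular Hessian is immediate:

$$
\tilde f(x) = f(x) + \tfrac{\mu}{2}\,\lVert x - x_k \rVert^2, \qquad \mu > 0,
\qquad
\nabla^2 \tilde f(x) = \nabla^2 f(x) + \mu I .
$$

Since $\nabla^2 f$ is positive semidefinite for a convex $f$, the shifted Hessian is positive definite even where $\nabla^2 f(x^*)$ has rank defects, which is the kind of property such a modification term is meant to restore.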
In this paper, a modification of the BFGS algorithm for unconstrained non-convex optimization is proposed. The idea is to modify the approximate Hessian matrix so as to obtain a descent direction and guarantee the validity of the new quasi-Newton iteration equation B_{k+1} s_k = y_k^*, where y_k^* = y_k + t_k ||g(x_k)|| s_k. The global convergence of the algorithm under a general line search rule is proved.
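Written in the usual quasi-Newton notation (the definitions of s_k and y_k below are the standard convention, which the abstract leaves implicit), the modified secant condition reads

$$
s_k = x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
$$
$$
B_{k+1} s_k = y_k^{*}, \qquad y_k^{*} = y_k + t_k\,\lVert \nabla f(x_k)\rVert\, s_k, \qquad t_k > 0 .
$$

Since $y_k^{*T} s_k = y_k^T s_k + t_k \lVert \nabla f(x_k)\rVert \lVert s_k\rVert^2$, a sufficiently large $t_k$ makes the curvature product positive even when $y_k^T s_k \le 0$ on a non-convex function, which is what keeps the BFGS update well defined.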
Using Arbitrary Lagrangian Eulerian coordinates and the least squares method, a two-dimensional steady fluid-structure interaction problem is transformed into an optimal control problem. A sensitivity analysis is presented. The BFGS algorithm gives satisfactory numerical results even when a reduced number of discrete controls is used. (c) 2005 Elsevier Ltd. All rights reserved.
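The abstract does not give the cost functional; as a hedged sketch of how such least-squares reformulations typically look (an assumption, not the paper's formulation), the control u parameterizes the interface data and the functional penalizes the mismatch between the fluid and structure fields on the interface:

$$
\min_{u \in \mathbb{R}^m} \; J(u) = \tfrac12\,\bigl\lVert\, \sigma_F(u) - \sigma_S(u) \,\bigr\rVert^2_{\Gamma},
$$

where $\sigma_F$ and $\sigma_S$ denote the fluid and structure tractions on the interface $\Gamma$, $m$ is the (small) number of discrete controls, and the gradient $\nabla J$ supplied to BFGS comes from the sensitivity analysis.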
This article studies a modified BFGS algorithm for solving smooth unconstrained strongly convex minimization problems. The modified BFGS method is based on the new quasi-Newton equation B_{k+1} s_k = y_k^*, where y_k^* = y_k + A_k s_k and A_k is a matrix. Wei, Li, and Qi [WLQ] have proven that the average performance of two such algorithms is better than that of the classical one. In this paper, we prove the global convergence of these algorithms under a general line search rule.
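For reference, substituting $y_k^*$ for $y_k$ in the standard BFGS formula gives the matrix recursion used by this family of methods; the formula below is simply the classical BFGS update written with the modified vector, and the specific choice of $A_k$ is left unspecified here, as in the abstract:

$$
B_{k+1} = B_k - \frac{B_k s_k s_k^{T} B_k}{s_k^{T} B_k s_k}
          + \frac{y_k^{*} y_k^{*T}}{y_k^{*T} s_k},
\qquad y_k^{*} = y_k + A_k s_k .
$$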
For unconstrained programming of non-convex functions, this article gives a modified BFGS algorithm associated with a general line search model. The idea of the algorithm is to modify the approximate Hessian matrix so as to obtain a descent direction and guarantee the validity of the new quasi-Newton iteration equation B_{k+1} s_k = y_k^*, where y_k^* = y_k + A_k s_k and A_k is some matrix. The global convergence of the algorithm under the general form of line search is proved.
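As a hedged, runnable sketch of this class of methods (not the paper's algorithm), the loop below runs BFGS with Armijo backtracking as one instance of a general line search and, when the curvature product y_k^T s_k is too small on a non-convex step, replaces y_k by y_k + t_k ||g_k|| s_k with t_k chosen so that the product becomes positive; the test function and the specific safeguard rule are assumptions.

```python
# Hedged sketch: modified BFGS with Armijo backtracking and a curvature safeguard
# y* = y + t*||g||*s when y^T s is too small; illustrative only.
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def modified_bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                 # approximate Hessian
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -np.linalg.solve(B, g)     # quasi-Newton direction (descent while B is PD)
        alpha, c1 = 1.0, 1e-4          # Armijo backtracking line search
        while f(x + alpha * p) > f(x) + c1 * alpha * (g @ p):
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Safeguard: enforce y*^T s >= eps*||s||^2 by adding t*||g||*s if needed.
        eps = 1e-6
        if y @ s < eps * (s @ s):
            t = (eps * (s @ s) - y @ s) / (np.linalg.norm(g) * (s @ s) + 1e-16)
            y = y + t * np.linalg.norm(g) * s
        # Standard BFGS update of B with the (possibly modified) y.
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
        x, g = x_new, g_new
    return x

print(modified_bfgs(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0])))
```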
This study proposes a new robust quasi-Newton algorithm for unconstrained optimization problems. The factorization of the approximating Hessian matrices is investigated to provide a series of positive bases for pattern search. Experiments on well-known optimization test problems show the efficiency and robustness of the proposed algorithm. The proposed algorithm is found to be competitive and outperforms some other derivative-free algorithms. (c) 2006 Elsevier Inc. All rights reserved.
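The abstract does not spell out how the factorization yields polling directions; one common construction, given here only as an assumption and not necessarily the paper's, takes a Cholesky factor L of the Hessian approximation B = L L^T and uses the columns of L together with their negatives, which always form a positive basis of R^n, as the poll directions of a pattern search.

```python
# Hedged sketch: derive a positive basis for pattern-search polling from the
# Cholesky factorization of a quasi-Newton matrix B; the poll step itself is
# a plain pattern search over those directions.
import numpy as np

def positive_basis_from_B(B):
    """Columns of L and -L, where B = L L^T, form a positive basis of R^n."""
    L = np.linalg.cholesky(B)          # requires B symmetric positive definite
    return np.hstack([L, -L])          # shape (n, 2n)

def poll_step(f, x, step, D):
    """Evaluate f along each direction in D; return the first improving point, if any."""
    fx = f(x)
    for d in D.T:
        trial = x + step * d
        if f(trial) < fx:
            return trial, True
    return x, False

# Tiny usage example on a quadratic bowl (illustrative only).
f = lambda x: x @ np.diag([1.0, 10.0]) @ x
x, step = np.array([2.0, -1.5]), 1.0
B = np.eye(2)                          # stand-in for a quasi-Newton Hessian approximation
for _ in range(50):
    D = positive_basis_from_B(B)
    x, improved = poll_step(f, x, step, D)
    if not improved:
        step *= 0.5                    # shrink the mesh on an unsuccessful poll
print("approximate minimizer:", x)
```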
For unconstrained programming of non-convex functions, this article gives a modified BFGS algorithm. The idea of the algorithm is to modify the approximate Hessian matrix so as to obtain a descent direction and guarantee the validity of the quasi-Newton iteration pattern. We prove the global convergence of the algorithm under the general form of line search, and prove its quadratic convergence rate under some conditions.
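The reason these modifications target the Hessian approximation is the standard descent criterion: with the search direction $p_k$ defined by $B_k p_k = -\nabla f(x_k)$, positive definiteness of $B_k$ is what guarantees descent, as the one-line computation below shows:

$$
\nabla f(x_k)^{T} p_k = -\nabla f(x_k)^{T} B_k^{-1} \nabla f(x_k) < 0
\quad \text{whenever } B_k \succ 0 \text{ and } \nabla f(x_k) \neq 0 ,
$$

so any modification that keeps $B_k$ positive definite (for instance, enforcing $y_k^{*T} s_k > 0$ in the update) keeps $p_k$ a descent direction even on non-convex functions.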