This thesis presents several contributions to the convergence analysis of noisy optimization algorithms. We analyze the convergence rates of two algorithms of different types: one of line-search type and the other of Randomized Search Heuristics type. We prove that the algorithm using an approximation of the Hessian matrix can reach the same rates as other optimal algorithms when the parameters are well adjusted. We also analyze the convergence order of Evolution Strategies for noisy optimization using reevaluation, obtaining theoretical and empirical results for log-log convergence, and we prove a lower bound on the convergence rate of Evolution Strategies. We extend this work by applying similar reevaluation schemes to a discrete case with noisy perturbations in the objective function. Finally, we analyze the performance measure itself by comparing two quality indicators over different algorithms, and we prove that the use of an inadequate indicator can lead to misleading results when comparing several noisy optimization algorithms.
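The reevaluation idea mentioned above can be illustrated with a minimal sketch: a (1+1)-Evolution Strategy on a noisy sphere function, where each candidate's fitness is averaged over `r` independent reevaluations to damp the noise. Everything here (the noisy sphere, the 1/5th-rule-style step-size adaptation, all parameter values) is an illustrative assumption, not the thesis's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_sphere(x, noise=0.1):
    # Sphere objective ||x||^2 corrupted by additive Gaussian noise.
    return float(np.sum(x**2) + noise * rng.standard_normal())

def evaluate(x, r):
    # Average r independent reevaluations to reduce the noise
    # standard deviation by a factor of sqrt(r).
    return float(np.mean([noisy_sphere(x) for _ in range(r)]))

def one_plus_one_es(dim=5, iterations=2000, sigma=0.5, r=10):
    # Hypothetical (1+1)-ES with reevaluation-based averaging.
    x = rng.standard_normal(dim)
    fx = evaluate(x, r)
    for _ in range(iterations):
        y = x + sigma * rng.standard_normal(dim)
        fy = evaluate(y, r)
        if fy <= fx:              # noisy comparison of averaged fitnesses
            x, fx = y, fy
            sigma *= 1.1          # expand step size on success
        else:
            sigma *= 0.99         # contract slightly on failure
    return x

x_final = one_plus_one_es()
print(np.sum(x_final**2))  # true (noise-free) objective at the final point
```

With averaging over `r` reevaluations, the comparison noise shrinks with sqrt(r), so the ES can keep making progress until the residual error reaches the remaining noise floor; this trade-off between evaluation budget and attainable precision is exactly what the log-log convergence analysis quantifies.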
It is known that the conjugate-gradient algorithm is at least as good as the steepest-descent algorithm for minimizing quadratic functions. It is shown here that the conjugate-gradient algorithm is actually superior to the steepest-descent algorithm in that, in the generic case, at each iteration it yields a lower cost than does the steepest-descent algorithm, when both start at the same point.
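This claim can be checked numerically with a small sketch: minimize the quadratic f(x) = ½ xᵀAx − bᵀx with exact line searches, comparing steepest descent and conjugate gradient from the same start point. The matrix `A`, vector `b`, and start point below are made-up illustrative data; the update formulas are the standard textbook ones.

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric positive definite
b = np.array([1.0, 2.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

def steepest_descent(x, steps):
    costs = [f(x)]
    for _ in range(steps):
        r = b - A @ x                    # negative gradient
        alpha = (r @ r) / (r @ A @ r)    # exact line search
        x = x + alpha * r
        costs.append(f(x))
    return costs

def conjugate_gradient(x, steps):
    costs = [f(x)]
    r = b - A @ x
    p = r.copy()                         # first direction = steepest descent
    for _ in range(steps):
        alpha = (r @ r) / (p @ A @ p)
        x = x + alpha * p
        r_new = r - alpha * (A @ p)
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p             # A-conjugate search direction
        r = r_new
        costs.append(f(x))
    return costs

x0 = np.array([5.0, -3.0])
sd = steepest_descent(x0, 2)
cg = conjugate_gradient(x0, 2)
```

The two methods take the same first step (conjugate gradient starts along the steepest-descent direction), so their costs coincide after one iteration; from the second iteration on, in the generic case, the conjugate-gradient cost is strictly lower, and in n dimensions it reaches the exact minimizer within n steps.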