Details
ISBN:
(Print) 9781450311779
Estimation-of-distribution algorithms (EDAs) build and use probabilistic models during optimization in order to automatically discover and exploit an optimization problem's structure. This is especially useful in black-box optimization, where no assumptions are made about the problem being solved, as is characteristic of many complex real-world problems. In this paper we consider multi-objective optimization problems with real-valued variables. Although the vast majority of advances in the EDA literature concern single-objective optimization, advances have also been made in multi-objective optimization. Here we bring together two recent advances: incremental Gaussian model building, which reduces the required population size, and a mixture-based multi-objective framework with specific methods to better facilitate model-building techniques that span multiple generations. Significantly faster convergence to the optimal Pareto front is achieved on 6 out of 7 artificial benchmark problems from the literature. Although results on two of these problems show that building models with higher-order interactions between variables is required, these problems are still artificial. We therefore also consider a more realistic optimization problem from image processing: deformable image registration. For this problem too, our results show the need for processing interactions between problem variables, stressing the importance of studying the use of such models. Furthermore, the number of problem variables in deformable image registration can be very large, and building models with higher-order interactions, especially mixture-based models, then requires very large population sizes. The use of incremental model building is therefore of great importance. This claim is supported by our results, which show a huge reduction in the number of required evaluations on this problem.
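To make the core EDA idea concrete, the following is a minimal sketch of a univariate Gaussian EDA on a toy sphere function. It illustrates only the generic sample-select-refit loop; the function names, parameters, and settings here are illustrative assumptions, not the authors' incremental, mixture-based multi-objective method.

```python
import numpy as np

def sphere(x):
    # simple benchmark: sum of squares, minimum 0 at the origin
    return np.sum(x ** 2, axis=1)

def gaussian_eda(f, dim=5, pop=100, frac=0.3, gens=60, seed=0):
    """Each generation: sample from a univariate Gaussian model, select
    the best fraction of the population, and refit the model to the
    selected solutions."""
    rng = np.random.default_rng(seed)
    mean, std = np.full(dim, 1.0), np.full(dim, 1.0)
    for _ in range(gens):
        X = rng.normal(mean, std, size=(pop, dim))
        elite = X[np.argsort(f(X))[: int(frac * pop)]]
        mean = elite.mean(axis=0)
        std = elite.std(axis=0) + 1e-12  # floor avoids a degenerate model
    return mean

best = gaussian_eda(sphere)
```

Refitting the model from scratch each generation, as above, is exactly what forces large population sizes; the incremental model building discussed in the abstract instead updates the model across generations.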
Details
ISBN:
(Print) 9781450305570
A key search mechanism in evolutionary algorithms is the mixing, or juxtaposing, of partial solutions present in the parent solutions. In this paper we examine the efficiency of mixing in genetic algorithms (GAs) and estimation-of-distribution algorithms (EDAs). We compute the mixing probabilities of two partial solutions and discuss the effect of covariance build-up in GAs and EDAs. Moreover, we propose two new evolutionary algorithms that maximize the juxtaposition of the partial solutions present in the parents: the Recombinative Optimal Mixing Evolutionary Algorithm (ROMEA) and the Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA).
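The following is a hedged sketch of the gene-pool optimal mixing idea on a OneMax toy problem: for every solution, material from random donors is copied in per linkage subset and kept only if fitness does not decrease. A univariate linkage model is assumed here for simplicity; GOMEA itself works with learned, richer linkage models, and the helper names are illustrative.

```python
import random

def onemax(bits):
    # toy fitness: number of ones in the string
    return sum(bits)

def gom_step(pop, f, linkage):
    """One pass of gene-pool optimal mixing: for each solution, try
    donor material subset by subset, accepting changes that do not
    lower fitness."""
    new_pop = []
    for parent in pop:
        child, fit = parent[:], f(parent)
        for subset in linkage:
            donor = random.choice(pop)
            trial = child[:]
            for i in subset:
                trial[i] = donor[i]
            trial_fit = f(trial)
            if trial_fit >= fit:  # greedy, never-worsening acceptance
                child, fit = trial, trial_fit
        new_pop.append(child)
    return new_pop

random.seed(1)
n, pop_size = 20, 30
pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
linkage = [[i] for i in range(n)]  # univariate model: one subset per gene
for _ in range(15):
    pop = gom_step(pop, onemax, linkage)
best = max(pop, key=onemax)
```

On OneMax this converges very quickly because a copied 1 is always accepted and a copied 0 never is; the interesting cases in the paper arise when partial solutions span several variables.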
Details
ISBN:
(Print) 9781450305570
A simple parameter-less local optimizer able to solve deterministic problems with building blocks of bounded order is proposed in this article. The algorithm learns and uses linkage information during the run. It is algorithmically simple and easy to implement and, with the exception of the termination condition, completely parameter-free: there is thus no need to tune the population size or other parameters to the problem at hand. An empirical comparison on 3 decomposable functions, each with uniformly scaled building blocks of sizes 5 and 8, was carried out. The algorithm exhibits quadratic scaling with the problem dimensionality, but the comparison with the extended compact genetic algorithm and the Bayesian optimization algorithm shows that it needs a lower or comparable number of fitness function evaluations on the majority of functions for the tested problem dimensionalities. The results also suggest that the efficiency of the local optimizer relative to both estimation-of-distribution algorithms should remain better for problem sizes of up to at least a few hundred bits.
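To illustrate why bounded-order building blocks matter for a local optimizer, here is a hypothetical sketch on a deceptive order-5 trap function: bit-wise hill climbing is misled toward all zeros, whereas optimizing whole blocks at once finds the optimum. The block boundaries are assumed known here, unlike in the paper, where the optimizer learns such linkage during the run.

```python
from itertools import product

def trap5(bits):
    """Deceptive trap of order 5: the optimum is all ones, but within
    each block the unitation gradient points toward all zeros."""
    total = 0
    for b in range(0, len(bits), 5):
        u = sum(bits[b:b + 5])
        total += 5 if u == 5 else 4 - u
    return total

def block_local_search(bits, f, order=5):
    """Exhaustively optimize each non-overlapping block of `order`
    variables in turn (2**order trials per block)."""
    bits = bits[:]
    for b in range(0, len(bits), order):
        best_cfg, best_fit = None, -1
        for cfg in product([0, 1], repeat=order):
            trial = bits[:b] + list(cfg) + bits[b + order:]
            fit = f(trial)
            if fit > best_fit:
                best_cfg, best_fit = list(cfg), fit
        bits[b:b + order] = best_cfg
    return bits

solved = block_local_search([0] * 20, trap5)
```

The exhaustive 2^k scan per block is why the order of the building blocks must be bounded for such an approach to stay tractable.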