Only a small number of function evaluations can be afforded in many real-world multiobjective optimization problems (MOPs) where the function evaluations are economically/computationally expensive. Such problems pose great challenges to most existing multiobjective evolutionary algorithms (EAs), which require a large number of function evaluations for optimization. Surrogate-assisted EAs (SAEAs) have been employed to solve expensive MOPs. Specifically, a certain number of expensive function evaluations are used to build computationally cheap surrogate models that assist the optimization process without conducting expensive function evaluations. The infill sampling criteria in most existing SAEAs take all requirements on convergence, diversity, and model uncertainty into account, which is, however, not the most efficient way to exploit the limited computational budget. Thus, this article proposes a Kriging-assisted two-archive EA for expensive many-objective optimization. The proposed algorithm uses an influential-point-insensitive model to approximate each objective function. Moreover, an adaptive infill criterion that identifies the most important requirement on convergence, diversity, or uncertainty is proposed to determine an appropriate sampling strategy for reevaluations using the expensive objective functions. The experimental results on a set of expensive multi/many-objective test problems demonstrate its superiority over five state-of-the-art SAEAs.
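The adaptive infill idea described in this abstract (spending each expensive evaluation on whichever single requirement matters most: convergence, diversity, or uncertainty) can be sketched as a toy selector. This is an illustration under assumptions, not the paper's implementation; the candidate fields and per-strategy rules below are hypothetical:

```python
def select_infill(candidates, archive, strategy):
    """Toy stand-in for an adaptive infill criterion: pick one candidate
    to spend a real (expensive) evaluation on, by a single criterion.
    Fields 'x' (decision vector), 'mean' (predicted objectives) and
    'std' (Kriging uncertainty per objective) are illustrative names."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    if strategy == "convergence":   # best predicted aggregate objective value
        return min(candidates, key=lambda c: sum(c["mean"]))
    if strategy == "uncertainty":   # largest model uncertainty
        return max(candidates, key=lambda c: max(c["std"]))
    # "diversity": farthest in decision space from already-evaluated points
    return max(candidates, key=lambda c: min(dist(c["x"], a) for a in archive))
```

In the actual algorithm the strategy is chosen adaptively from the state of the two archives; here the caller supplies it.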
Expensive multi-objective problems (MOPs) are extremely challenging because of the high evaluation cost of finding satisfactory solutions with adequate precision, especially in high-dimensional cases. However, most current EGO-based algorithms for expensive MOPs are limited to low decision dimensions because of the exponential difficulty of high-dimensional settings. This paper presents High-Dimensional expensive Multi-objective optimization with Additive structure (ADD-HDEMO) to solve high-dimensional expensive MOPs via an additive structural kernel, and identifies two key challenges in this endeavor. First, we integrate the multiple sub-objectives of a high-dimensional expensive MOP into a single objective with the decision space unchanged. Then, we infer the dependence between the decision and objective spaces of the augmented problem via an additive Gaussian process (GP) kernel structure, where Gibbs sampling is used to learn the latent additive structure. Furthermore, we parallelize the proposed algorithm by introducing a multi-point sampling mechanism when recommending infill points. The effectiveness of the proposed method is evaluated on the ZDT and DTLZ benchmarks against three other EGO-based multi-objective optimization approaches: ParEGO, SMS-EGO, and MOEA/D-EGO. Our analyses demonstrate that ADD-HDEMO is effective in solving high-dimensional expensive MOPs. (c) 2022 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://***/licenses/by-nc-nd/4.0/)
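The core modeling idea here, replacing one high-dimensional GP kernel with a sum of low-dimensional kernels over groups of decision variables, can be sketched minimally as follows. In the paper the grouping is latent and learned by Gibbs sampling; in this illustration the groups are supplied by hand, and the RBF kernel and lengthscale are assumptions:

```python
import math

def rbf(u, v, ls=1.0):
    """Squared-exponential (RBF) kernel on a sub-vector of variables."""
    return math.exp(-sum((a - b) ** 2 for a, b in zip(u, v)) / (2 * ls * ls))

def additive_kernel(x1, x2, groups, ls=1.0):
    """Additive GP kernel: a sum of low-dimensional RBF kernels, one per
    (disjoint) group of decision-variable indices. Each summand only sees
    its own group, which is what tames the curse of dimensionality."""
    return sum(rbf([x1[i] for i in g], [x2[i] for i in g], ls) for g in groups)
```

For example, with groups `[[0], [1, 2]]` a 3-dimensional problem is modeled as one 1-D kernel plus one 2-D kernel.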
ISBN:
(Print) 9781450367486
Expensive black-box problems are usually optimized by Bayesian optimization (BO), which reduces evaluation costs via cheaper surrogates. The most popular model used in Bayesian optimization is the Gaussian process (GP), whose posterior is based on a joint GP prior built from initial observations, so the posterior is also a Gaussian process. Observations are often not noise-free, so in most of these cases a noisy transformation of the objective space is observed. Many single-objective optimization algorithms have succeeded in extending efficient global optimization (EGO) to noisy settings, while ParEGO fails to consider noise. To deal with noisy expensive black-box problems, we extend ParEGO to noisy optimization by adding a Gaussian noise term when fitting the surrogate. We call it noisy-ParEGO, and S-metric results indicate that the algorithm works well in optimizing noisy expensive multiobjective black-box problems.
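Two ingredients of this approach are standard and can be sketched briefly: ParEGO's augmented Tchebycheff scalarization, which collapses the objectives into one function per weight vector, and a diagonal noise ("nugget") term added to the GP covariance matrix, which is the usual way a Gaussian noise error enters the surrogate. The RBF kernel and the hyperparameter values below are illustrative assumptions:

```python
import math

def parego_scalarize(objs, weights, rho=0.05):
    """ParEGO's augmented Tchebycheff scalarization:
    max_i(w_i * f_i) + rho * sum_i(w_i * f_i)."""
    terms = [w * f for w, f in zip(weights, objs)]
    return max(terms) + rho * sum(terms)

def rbf(u, v, ls=1.0):
    return math.exp(-sum((a - b) ** 2 for a, b in zip(u, v)) / (2 * ls * ls))

def gram_with_nugget(X, noise_var):
    """GP Gram matrix K + sigma_n^2 * I: the diagonal nugget models the
    Gaussian observation noise that a noisy-ParEGO-style surrogate adds."""
    n = len(X)
    return [[rbf(X[i], X[j]) + (noise_var if i == j else 0.0)
             for j in range(n)] for i in range(n)]
```

With `noise_var = 0` this reduces to the noise-free GP that standard ParEGO fits.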
ISBN:
(Print) 9781450367486
Many applications, such as hyper-parameter tuning in machine learning, can be cast as multiobjective black-box problems, and optimizing them is challenging. Bayesian optimization (BO) is an effective method for dealing with black-box functions. This paper mainly focuses on balancing exploration and exploitation in multiobjective black-box optimization problems through multiple samplings in BO. In each iteration, multiple recommendations are generated via two different trade-off strategies: the expected improvement (EI), and a multiobjective framework in which the mean and variance functions of the GP posterior form two conflicting objectives. We compare our algorithm with ParEGO on 12 test functions. Hypervolume (HV, also known as S-metric) results show that our algorithm balances exploration and exploitation well for multiobjective black-box optimization problems.
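The expected improvement criterion mentioned above has a well-known closed form for GP posteriors. A minimal sketch for minimization, assuming a scalar posterior mean `mu` and standard deviation `sigma` at the candidate point:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Closed-form EI for minimization:
    EI = (f_best - mu) * Phi(z) + sigma * phi(z), with z = (f_best - mu) / sigma,
    where phi/Phi are the standard normal pdf/cdf."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)       # no uncertainty left at this point
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_best - mu) * cdf + sigma * pdf
```

The two terms make the exploration-exploitation trade-off explicit: the first rewards a low predicted mean (exploitation), the second a high predictive variance (exploration).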