Jaya is a metaheuristic algorithm that uses a pair of random internal parameters to adjust its exploration and exploitation search behaviors. Such a random setting can negatively affect the search performance of the algorithm by causing inappropriate search behavior in some iterations. To tackle this issue, the present study develops a new fuzzy decision-making mechanism for dynamically adjusting the trade-off between the exploration and exploitation search behaviors of the Jaya method. The new algorithm is named the Fuzzy Reinforced Jaya (FRJ) method. The search capability of FRJ is evaluated on a suite of unconstrained mathematical benchmarks and on constrained mechanical and structural optimization problems with buckling and natural frequency constraints; the design variables are drawn from both continuous and discrete domains. To provide deeper insight into the effect of the auxiliary fuzzy module, the behavior of the algorithm is evaluated and discussed using a normalized diversity concept and behavioral diagrams. The significance of the outcomes is assessed with several statistical analyses (e.g., Q-Q diagrams and the Wilcoxon and Friedman tests), and the numerical results are compared with six other well-established techniques. The attained outcomes indicate that the proposed FRJ, as a self-adaptive and parameter-free method, provides superior and promising results in terms of stability, accuracy, and computational cost in solving mathematical and structural optimization problems.
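For context, the update rule in question is the standard Jaya move (Rao, 2016), whose two uniform random multipliers r1 and r2 are the internal parameters the abstract refers to. Below is a minimal NumPy sketch of one iteration; the FRJ fuzzy module itself is not specified in the abstract, so only the baseline rule is shown, and the function name and bound handling are illustrative.

```python
import numpy as np

def jaya_step(pop, fitness, lb, ub, rng):
    """One standard Jaya iteration (minimization): each candidate moves
    toward the current best design and away from the worst, scaled by
    the two uniform random parameters r1 and r2."""
    best = pop[np.argmin(fitness)]    # best design in the population
    worst = pop[np.argmax(fitness)]   # worst design in the population
    r1 = rng.random(pop.shape)
    r2 = rng.random(pop.shape)
    trial = pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))
    return np.clip(trial, lb, ub)     # keep trial designs inside bounds
```

In the standard method, a trial design replaces its parent only if it improves the objective (greedy selection); FRJ's contribution, per the abstract, is a fuzzy decision-making layer that dynamically steers this move toward exploration or exploitation instead of leaving the balance entirely to r1 and r2.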
The computational efficiency of metaheuristic optimization algorithms depends on an appropriate balance between exploration and exploitation. An important concern in metaheuristic optimization is that there is no guarantee that new trial designs will always improve on the current best record; likewise, no metaheuristic algorithm is inherently superior to all other methods. This study compares three advanced formulations of state-of-the-art metaheuristic optimization algorithms - Simulated Annealing (SA), Harmony Search (HS) and Big Bang-Big Crunch (BBBC) - that include enhanced approximate line search and computationally cheap gradient evaluation strategies. The rationale behind the new formulations is to generate high-quality trial designs lying on a properly chosen set of descent directions, and to do so throughout the optimization process. Besides hybridizing the metaheuristic search engines of HS/BBBC/SA with gradient information and approximate line search, HS and BBBC are also hybridized with an enhanced 1-D probabilistic search derived from SA. These enhancements allow the search to approach the region of design space hosting the global optimum more quickly. The new algorithms are tested on four weight minimization problems of skeletal structures and three mechanical/civil engineering design problems with up to 204 continuous/discrete variables and 20,070 nonlinear constraints; all test problems may contain multiple local minima. The optimization results and an extensive comparison with the literature clearly demonstrate the validity of the proposed approach, which significantly reduces the number of function evaluations/structural analyses with respect to the literature and improves the robustness of the metaheuristic search engines.
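The abstract does not detail the enhanced line search, so the sketch below only illustrates the general idea under stated assumptions: a forward-difference gradient supplies a descent direction at modest cost, and a short backtracking loop generates an improving trial design along it. The function names, step sizes, and acceptance rule are illustrative, not the paper's formulation.

```python
import numpy as np

def finite_diff_gradient(f, x, h=1e-6):
    """Cheap forward-difference gradient: n extra evaluations per design."""
    g = np.empty_like(x)
    fx = f(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

def backtracking_trial(f, x, alpha0=1.0, shrink=0.5, max_tries=5):
    """Generate a trial design on a descent direction via an
    approximate (backtracking) line search."""
    d = -finite_diff_gradient(f, x)
    d /= np.linalg.norm(d) + 1e-12     # unit descent direction
    fx, alpha = f(x), alpha0
    for _ in range(max_tries):
        trial = x + alpha * d
        if f(trial) < fx:              # accept the first improving step
            return trial
        alpha *= shrink                # otherwise shorten the step
    return x                           # fall back to the current design
```

Injected into HS/BBBC/SA search engines, trial designs of this kind complement the usual randomized moves with steps that, whenever accepted, are guaranteed to improve the current design.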
This article considers robust variants of Particle Swarm Optimization (PSO), the Gravity Search Algorithm (GSA), and the Gray Wolf Optimizer (GWO) to obtain fresh insight into solving different types of engineering problems without dependence on tuning parameters. The proposed algorithm hybridizes the Gravity Search Algorithm, Particle Swarm Optimization, and Gray Wolf Optimizer (HGPG) to deliver effective numerical optimization, and the theoretical findings are supported by solving several benchmark problems. Good results on mathematical test functions (CEC2005, CEC2021), compared with the GWO, GSA, and PSO methods and with several other well-known heuristic algorithms, show the enhanced performance of the introduced method. As a specific application, the method minimizes the weight of space truss structures, making the construction of such structures more economical. The results show that the new algorithm achieves a notable improvement in both exploration and exploitation; furthermore, a probabilistic metric analysis quantifies how much better the proposed algorithm performs.
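The abstract does not give HGPG's actual combination rule, so the following sketch, patterned on earlier PSO-GSA hybrids, only illustrates how gravitational accelerations and a social pull toward the global best can share one velocity update; the GWO component is omitted, minimization is assumed, and all function names and coefficient values are illustrative.

```python
import numpy as np

def gsa_acceleration(pop, fitness, G, rng):
    """Gravitational accelerations as in GSA: better (lower-fitness)
    agents receive larger normalized masses and attract the others."""
    worst, best = fitness.max(), fitness.min()
    m = (worst - fitness) / (worst - best + 1e-12)
    M = m / m.sum()                    # normalized masses
    acc = np.zeros_like(pop)
    for i in range(len(pop)):
        for j in range(len(pop)):
            if i == j:
                continue
            diff = pop[j] - pop[i]
            R = np.linalg.norm(diff) + 1e-12
            acc[i] += rng.random() * G * M[j] * diff / R
    return acc

def hybrid_velocity(v, pop, acc, gbest, w=0.7, c1=0.5, c2=1.5, rng=None):
    """PSO-GSA style update: inertia + gravitational acceleration +
    social pull toward the swarm's best-known position."""
    rng = np.random.default_rng() if rng is None else rng
    return (w * v
            + c1 * rng.random(pop.shape) * acc
            + c2 * rng.random(pop.shape) * (gbest - pop))
```

The design intent of such hybrids is that the GSA term supplies population-wide exploration pressure while the PSO term accelerates exploitation around the incumbent best.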
ISBN (print): 9781728183923
Population size is extremely important when executing a population-based algorithm: it governs how much effort the algorithm can devote to exploration and exploitation. An excessively large population favors exploration at the expense of exploitation, while a population below the ideal size may impair exploration and cause the algorithm to converge quickly to a local optimum. Unfortunately, the population size is often chosen empirically: the user tries different values, many times over, for different problems, without any well-defined criterion, often relying only on prior experience. This kind of approach can under-use the algorithm, wasting both computational effort and solution quality. In this work, we improve and study an approximation metamodel as a particle-reduction criterion for particle swarm algorithms. The metamodel assumes that if two particles are relatively close and have similar velocities, they will tend toward the same solution, so one of them can be eliminated. Five traditional benchmark problems from the engineering literature were solved and the results analyzed.
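As a concrete reading of that criterion, the sketch below, under assumed Euclidean closeness thresholds eps_x and eps_v (both illustrative, as is the function name), scans the swarm from best to worst and removes the worse member of any pair that is close in both position and velocity.

```python
import numpy as np

def reduce_swarm(pos, vel, fit, eps_x=1e-2, eps_v=1e-2):
    """Prune redundant particles: if two particles are close in both
    position and velocity, they tend toward the same solution, so the
    one with the worse fitness is removed (minimization assumed)."""
    keep = np.ones(len(pos), dtype=bool)
    order = np.argsort(fit)            # examine best particles first
    for a in range(len(order)):
        i = order[a]
        if not keep[i]:
            continue
        for b in range(a + 1, len(order)):
            j = order[b]
            if not keep[j]:
                continue
            if (np.linalg.norm(pos[i] - pos[j]) < eps_x
                    and np.linalg.norm(vel[i] - vel[j]) < eps_v):
                keep[j] = False        # drop the worse near-duplicate
    return pos[keep], vel[keep], fit[keep]
```

Eliminating such near-duplicates shrinks the population only where it is redundant, saving objective evaluations without discarding particles that are heading toward distinct regions of the search space.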