This paper presents the logical relationships of Aristotle's square of opposition over the four basic categorical propositions (i.e., contrary, contradictory, subcontrary, and subaltern) in Joint Opposite Selection (JOS). JOS achieves mutual reinforcement by jointly applying two opposition strategies, Dynamic Opposite (DO) and Selective Leading Opposition (SLO), which improve exploration and exploitation, respectively, in a given search space. We also propose an enhancement of Golden Jackal Optimization (GJO) with Joint Opposite Selection, named GJO-JOS. In the optimization process, SLO helps GJO close in on the prey swiftly, while DO gives GJO better chances of locating the fittest prey. With JOS, GJO elevates its performance. We evaluated GJO-JOS on the CEC 2017 benchmark functions, which include unimodal, multimodal, hybrid, and composition functions. GJO-JOS outperformed GJO combined with each of seven single opposition-based learning strategies (OBLs). We also compared GJO-JOS against eight nature-inspired algorithms, including the original GJO. GJO-JOS produced promising results relative to the seven single OBLs, the eight nature-inspired algorithms, and GJO itself. The experimental results confirmed that GJO-JOS effectively balances exploration and exploitation.
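For orientation only, the sketch below illustrates the generic opposition idea that JOS builds on: each candidate solution is paired with an opposite point, and the better of the two is kept. The dynamic-opposite formula, the weighting factor w, and the helper names are assumptions for illustration; the paper's exact JOS rules for DO and SLO are not reproduced here.

```python
import numpy as np

def dynamic_opposite(X, lb, ub, w=3.0, rng=None):
    """Illustrative dynamic-opposite candidate generation (not the paper's exact rule).

    For each solution x, the plain opposite is lb + ub - x; the dynamic variant
    moves x toward a randomly scaled opposite point and clips back into bounds.
    """
    rng = np.random.default_rng() if rng is None else rng
    opposite = lb + ub - X                      # static opposite point
    r1 = rng.random(X.shape)
    r2 = rng.random(X.shape)
    X_do = X + w * r1 * (r2 * opposite - X)     # dynamic opposite candidate
    return np.clip(X_do, lb, ub)

# Usage sketch: generate opposite candidates for a small random population and
# keep whichever of (original, opposite) scores better on a test function.
def sphere(X):
    return np.sum(X**2, axis=1)

lb, ub = -100.0, 100.0
X = np.random.default_rng(0).uniform(lb, ub, size=(10, 5))
X_do = dynamic_opposite(X, lb, ub)
keep_opposite = sphere(X_do) < sphere(X)
X[keep_opposite] = X_do[keep_opposite]
```

In a full GJO-JOS run, a step of this kind would be interleaved with the jackal hunting updates each iteration; the selection of which dimensions or individuals receive the opposition step is where SLO comes in.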
The conjugate gradient (CG) method is one of the important techniques for solving constrained and unconstrained optimization problems. It is particularly efficient for unconstrained optimization because it is easy to apply and requires little storage, yet it can still handle large-scale problems that would otherwise demand high storage capacity. In this work, the method is modified by combining the beta(k) parameter of the Polak-Ribiere (PR) method with the beta(k) parameter of the Hestenes-Stiefel (HS) method to solve these kinds of problems. The proposed method was compared with three well-known algorithms in numerical experiments implemented in Matlab, and the results showed that it performed best.
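For context, the sketch below shows a standard nonlinear CG iteration with the classical Polak-Ribiere and Hestenes-Stiefel beta parameters. The abstract does not state how the two parameters are combined, so the convex-combination line (with a hypothetical mixing weight theta) is purely illustrative and does not reproduce the paper's formula; backtracking_line_search is likewise an assumed helper rather than the paper's step-size rule.

```python
import numpy as np

def beta_pr(g_new, g_old):
    # Polak-Ribiere: g_{k+1}^T (g_{k+1} - g_k) / (g_k^T g_k)
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def beta_hs(g_new, g_old, d):
    # Hestenes-Stiefel: g_{k+1}^T (g_{k+1} - g_k) / (d_k^T (g_{k+1} - g_k))
    y = g_new - g_old
    return (g_new @ y) / (d @ y)

def backtracking_line_search(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4):
    # Simple Armijo backtracking; an assumed helper, not the paper's rule.
    while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
        alpha *= rho
    return alpha

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
    """Nonlinear CG with an illustrative PR/HS mixture (NOT the paper's beta)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking_line_search(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Placeholder combination of the two parameters.
        beta = theta * beta_pr(g_new, g) + (1 - theta) * beta_hs(g_new, g, d)
        beta = max(beta, 0.0)                   # safeguard against negative beta
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the simple quadratic f(x) = ||x||^2
x_star = hybrid_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(5))
```

The appeal of such hybrids is that the HS parameter adapts well near curvature changes while the PR parameter restarts naturally when progress stalls, so a suitable combination can inherit the robustness of both.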