In recent times, big data has become an essential concern with the rapid increase of digitalization. Problems that involve finding and evaluating the features of big data are called big data optimization problems. In this paper, data sets containing EEG signals have been studied. The goal is to detect actual EEG signals while eliminating additional brain activity patterns in the collected data, resulting in more accurate interpretation. In the study, a novel swarm intelligence-based technique is developed to handle big data optimization (big(Opt)) difficulties. An improved PSO-Q (IPSO-Q) was proposed by updating the random-walk phase of the quantum-behaved Particle Swarm Optimization (PSO-Q) for big(Opt) problems, thereby improving PSO-Q's local search capability. The success of PSO-Q and IPSO-Q has been thoroughly tested with various cycles (maximum iterations: 300, 400, 500, and 1000) and population sizes (10, 25, and 50) on six data sets. The outcomes of PSO-Q and IPSO-Q were statistically evaluated with the Wilcoxon Signed-Rank Test. PSO-Q and IPSO-Q have also been compared with recently developed swarm-based algorithms from the literature (BA, Jaya, AOA, etc.). The evaluation of the obtained results demonstrates the success of IPSO-Q and shows that IPSO-Q can be used as an alternative algorithm for big(Opt) problems.
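The abstract does not give the quantum-behaved update itself; for orientation, the sketch below shows the generic quantum-behaved PSO (QPSO) position update around a per-particle local attractor, which is the mechanism PSO-Q-style methods build on. It is a minimal illustration under assumed parameter names and values, not the paper's PSO-Q or IPSO-Q code.

```python
import numpy as np

def qpso_step(positions, pbest, gbest, beta=0.75, rng=np.random.default_rng()):
    """One generic quantum-behaved PSO position update (illustrative sketch).

    positions, pbest: arrays of shape (n_particles, n_dims); gbest: (n_dims,).
    beta is the contraction-expansion coefficient (a typical, assumed value).
    """
    n, d = positions.shape
    mbest = pbest.mean(axis=0)                      # mean of personal bests
    phi = rng.random((n, d))
    p = phi * pbest + (1.0 - phi) * gbest           # local attractor per particle
    u = rng.random((n, d))
    sign = np.where(rng.random((n, d)) < 0.5, -1.0, 1.0)
    # Delta-potential-well style sampling around the local attractor.
    return p + sign * beta * np.abs(mbest - positions) * np.log(1.0 / u)
```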
Biological systems in which order arises from disorder inspire many metaheuristic optimization techniques. Self-organization and evolution are behaviours common to chaos and optimization algorithms. Chaos can be defined as an ordered state of disorder that is hypersensitive to initial conditions; therefore, chaos can help create order out of disorder. In the scope of this work, the Golden Ratio Guided Local Search method was improved with inspiration from chaos and named Chaotic Golden Ratio Guided Local Search (CGRGLS). Chaos is used as a random number generator in the proposed method. The coefficient in the equation for determining the adaptive step size was derived from the Singer chaotic map. Performance evaluation of the proposed method was done by using CGRGLS in the local search part of the MLSHADE-SPA algorithm. The experimental studies were carried out with the electroencephalographic signal decomposition based optimization problems, known as the big data optimization problem (big-Opt), introduced at the Congress on Evolutionary Computation Big Data Competition (CEC 2015). Experimental results have shown that the local search method developed using chaotic maps increases the performance of the algorithm. (C) 2023 Karabuk University. Publishing services by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://***/licenses/by-nc-nd/4.0/).
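The Singer map referenced above has a standard quartic polynomial form; the sketch below shows how it could serve as a chaotic substitute for a uniform random number when scaling an adaptive step size. The step-size usage and the chosen seed/control values are illustrative assumptions, not the exact CGRGLS update rule.

```python
def singer_map(x, mu=1.07):
    """Singer chaotic map; maps x in (0, 1) to the next value in (0, 1)."""
    return mu * (7.86 * x - 23.31 * x**2 + 28.75 * x**3 - 13.302875 * x**4)

def chaotic_sequence(x0=0.7, length=5, mu=1.07):
    """Generate a short chaotic sequence to replace uniform random draws."""
    seq, x = [], x0
    for _ in range(length):
        x = singer_map(x, mu)
        seq.append(x)
    return seq

# Illustrative use: scale a base step size by a chaotic coefficient instead of
# a uniform random number (an assumption about how such a coefficient is used).
base_step = 0.1
steps = [base_step * c for c in chaotic_sequence()]
```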
Introduction: big data optimization (big-Opt) problems present unique challenges in effectively managing and optimizing the analytical properties inherent in large-scale datasets. The complexity and size of these prob...
The big data term and its formal definition have changed the properties of some computational problems. One class of problems whose fundamental properties change with the existence of big data is optimization problems. The artificial bee colony (ABC) algorithm, inspired by the intelligent source search, consumption and communication characteristics of real honey bees, has proven its efficiency in solving different numerical and combinatorial optimization problems. In this study, the standard ABC algorithm and its well-known variants, including the gbest-guided ABC algorithm, the differential evolution based ABC/best/1 and ABC/best/2 algorithms, the crossover ABC algorithm, the converge-onlookers ABC algorithm and the quick ABC algorithm, were assessed using the electroencephalographic signal decomposition based optimization problems introduced at the 2015 Congress on Evolutionary Computation Big Data Competition. The experimental studies on solving big data optimization problems showed that the phase-divided structure of the standard ABC algorithm still preserves its advantages when the candidate food sources, or solutions, are generated by referencing the global best solution in the onlooker bee phase.
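For readers unfamiliar with the ABC variants being compared, the sketch below contrasts the standard ABC candidate-generation rule with the gbest-guided term added in GABC-style onlooker phases. It is the textbook formulation, not the code used in the study, and the function and parameter names are assumptions.

```python
import numpy as np

rng = np.random.default_rng()

def abc_candidate(foods, i, j, gbest=None, c=1.5):
    """Generate a candidate food source for solution i on dimension j.

    Standard ABC:      v_ij = x_ij + phi * (x_ij - x_kj)
    gbest-guided term: ... + psi * (gbest_j - x_ij)   (GABC-style onlookers)
    foods: array of shape (n_sources, n_dims); gbest: (n_dims,) or None.
    """
    n = foods.shape[0]
    k = rng.choice([s for s in range(n) if s != i])   # random neighbour source
    phi = rng.uniform(-1.0, 1.0)
    v = foods[i].copy()
    v[j] = foods[i, j] + phi * (foods[i, j] - foods[k, j])
    if gbest is not None:                             # gbest-guided variant
        psi = rng.uniform(0.0, c)
        v[j] += psi * (gbest[j] - foods[i, j])
    return v
```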
In recent years, researchers have witnessed the changes or transformations driven by the existence of big data in the definitions, complexities and future directions of real-world optimization problems. Analyzing the capabilities of previously introduced techniques, determining their possible drawbacks and developing new methods that take into consideration the unique properties of big data are nowadays in urgent demand. The Artificial Bee Colony (ABC) algorithm, inspired by the clever foraging behaviors of real honey bees, is one of the most successful swarm intelligence based optimization algorithms. In this study, a novel ABC algorithm based big data optimization technique was proposed. To explore the solving abilities of the proposed technique, a set of experimental studies was carried out using different signal decomposition based big data optimization problems presented at the Congress on Evolutionary Computation (CEC) 2015 Big Data Optimization Competition. The results obtained from the experimental studies were first compared with the well-known variants of the standard ABC algorithm, namely gbest-guided ABC (GABC), ABC/best/1, ABC/best/2, crossover ABC (CABC), converge-onlookers ABC (COABC) and quick ABC (qABC). The results of the proposed ABC algorithm were also compared with the Differential Evolution (DE) algorithm, Genetic Algorithm (GA), Firefly Algorithm (FA), Fireworks Algorithm (FW), Phase Based Optimization (PBO) algorithm, Particle Swarm Optimization (PSO) algorithm and Dragonfly Algorithm (DA) based big data optimization techniques. From the experimental studies, it was understood that the newly introduced ABC algorithm based technique is capable of producing better, or at least promising, results compared to the mentioned big data optimization techniques for all of the benchmark instances. (C) 2019 Elsevier B.V. All rights reserved.
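The benchmark instances come from EEG signal decomposition; the toy fitness below conveys only the general flavour of such problems (splitting a recorded matrix into a clean component and a residual artifact component that should be uncorrelated). It is an assumption made purely for illustration, not the CEC 2015 competition's exact objective.

```python
import numpy as np

def decomposition_fitness(x, X):
    """Toy single-objective fitness for EEG signal decomposition (illustrative).

    Assumes the decision vector x encodes a candidate 'clean' component S1 of
    the recorded matrix X, with the artifact component taken as S2 = X - S1.
    The score penalizes correlation between the two components; constant rows
    are not handled, since this is only a sketch.
    """
    S1 = x.reshape(X.shape)
    S2 = X - S1
    corrs = [abs(np.corrcoef(S1[r], S2[r])[0, 1]) for r in range(X.shape[0])]
    return float(np.mean(corrs))  # minimize mean absolute Pearson correlation
```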
Executing complex and time-sensitive operations has become difficult due to the increased acceptance of Internet of Things (IoT) devices and IoT-generated big data, which can result in problems with power consumption ...
In recent years, the digital age has added a new term, big data, to the literature of information and computer sciences. Because of the individual properties of the newly introduced term, the definitions of data-intensive problems, including optimization problems, have substantially changed, and investigating the solving capabilities of existing techniques and then developing their specialized variants for big data optimization have become important research topics. The Artificial Bee Colony (ABC) algorithm, inspired by the clever foraging characteristics of real honey bees, is one of the most successful swarm intelligence-based metaheuristics. In this study, a new ABC algorithm-based technique named source-linked ABC (slinkABC) was proposed by considering the properties of optimization problems related to big data. The slinkABC algorithm was tested on the big data optimization problems presented at the Congress on Evolutionary Computation (CEC) 2015 Big Data Optimization Competition. The results obtained from the experimental studies were compared with different variants of the ABC algorithm, including the gbest-guided ABC (GABC), ABC/best/1, ABC/best/2, crossover ABC (CABC), converge-onlookers ABC (COABC), quick ABC (qABC) and modified gbest-guided ABC (MGABC) algorithms. In addition, the results of the proposed ABC algorithm were also compared with the results of Differential Evolution (DE) algorithm, Genetic Algorithm (GA), Firefly Algorithm (FA), Phase-Based Optimization (PBO) algorithm and Particle Swarm Optimization (PSO) algorithm based approaches. From the experimental studies, it was understood that the ABC algorithm modified by considering the unique properties of the big data optimization problems, as in slinkABC, produces better solutions for most of the tested instances compared to the mentioned optimization techniques.
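The abstract does not describe slinkABC's internal rules, so the sketch below only shows the baseline onlooker-phase selection probabilities of the standard ABC, which variants such as slinkABC modify or build on; it is generic background rather than the proposed method.

```python
import numpy as np

def onlooker_probabilities(objective_values):
    """Selection probabilities used in the onlooker phase of the standard ABC.

    Converts raw (minimization) objective values into ABC's fitness measure
    and normalizes them into a probability distribution over food sources.
    """
    f = np.asarray(objective_values, dtype=float)
    fitness = np.where(f >= 0.0, 1.0 / (1.0 + f), 1.0 + np.abs(f))
    return fitness / fitness.sum()

# Example: onlookers favour the source with the smallest objective value.
print(onlooker_probabilities([0.2, 1.5, 3.0]))
```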
Big data optimization (big-Opt) refers to optimization problems that require managing the properties of big data analytics. In the present paper, the Search Manager (SM), a recently proposed framework for hybridizing metaheuristics to improve the performance of optimization algorithms, is extended to multi-objective problems (MOSM), and five configurations of it, formed by combining different search strategies, are proposed to solve the EEG signal analysis problem, which is a member of the big data optimization problem class. Experimental results demonstrate that the proposed configurations of MOSM are efficient for this kind of problem. The configurations are also compared with NSGA-III with uniform crossover and adaptive mutation operators (NSGA-III UCAM), a recently proposed method for big-Opt problems. (C) 2019 Elsevier B.V. All rights reserved.
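Since MOSM works on a multi-objective formulation, the small sketch below recalls the Pareto-dominance test and a naive non-dominated filter that underlie methods such as MOSM and NSGA-III UCAM; it is generic background, not the paper's implementation.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Example with two objectives (both minimized): (3.0, 3.5) is filtered out.
front = non_dominated([(1.0, 4.0), (2.0, 3.0), (3.0, 3.5), (2.5, 2.5)])
```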
In this thesis, we discuss and develop randomized algorithms for big data problems. In particular, we study finite-sum optimization with newly emerged variance-reduction optimization methods (Chapter 2), explore the efficiency of second-order information applied to both convex and non-convex finite-sum objectives (Chapter 3) and employ fast first-order methods in power system problems (Chapter 4). In Chapter 2, we propose two variance-reduced gradient algorithms, mS2GD and SARAH. mS2GD incorporates a mini-batching scheme for improving the theoretical complexity and practical performance of SVRG/S2GD, aiming to minimize a strongly convex function represented as the sum of an average of a large number of smooth convex functions and a simple non-smooth convex regularizer. SARAH, short for StochAstic Recursive grAdient algoritHm, uses a stochastic recursive gradient and targets minimizing the average of a large number of smooth functions for both convex and non-convex cases. Both methods fall into the category of variance-reduction optimization and obtain a total complexity of O((n + κ) log(1/ε)) to achieve an ε-accuracy solution for strongly convex objectives, while SARAH also maintains sub-linear convergence for non-convex problems. Meanwhile, SARAH has a practical variant, SARAH+, owing to the linear convergence of the expected stochastic gradients in its inner loops. In Chapter 3, we show that randomized batches can be applied with second-order information to improve convergence in both theory and practice, within an L-BFGS framework, as a novel approach to finite-sum optimization problems. We provide theoretical analyses for both convex and non-convex objectives. Meanwhile, we propose LBFGS-F as a variant in which the Fisher information matrix is used instead of Hessian information, and prove it applicable to a distributed environment within the popular applications of least-squares and cross-entropy losses. In Chapter 4, we develop fast rand...
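As a pointer to the recursive estimator mentioned for SARAH, the sketch below writes out one outer iteration with the update v_t = ∇f_{i_t}(w_t) − ∇f_{i_t}(w_{t−1}) + v_{t−1}. The step size, inner-loop length, and function names are illustrative assumptions, not the thesis code.

```python
import numpy as np

def sarah_outer_iteration(w0, grad_i, n, m, eta, rng=np.random.default_rng()):
    """One outer iteration of SARAH (sketch under assumed interfaces).

    grad_i(w, i) returns the gradient of the i-th component function at w;
    n is the number of components, m the inner-loop length, eta the step size.
    """
    # Full gradient at the snapshot point starts the recursion.
    v = np.mean([grad_i(w0, i) for i in range(n)], axis=0)
    w_prev, w = w0, w0 - eta * v
    for _ in range(m):
        i = rng.integers(n)
        v = grad_i(w, i) - grad_i(w_prev, i) + v    # recursive gradient update
        w_prev, w = w, w - eta * v
    return w
```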
One of the major challenges of solving big data optimization problems via traditional multi-objective evolutionary algorithms (MOEAs) is their high computational cost. This issue has been efficiently tackled by the non-dominated sorting genetic algorithm, third version (NSGA-III). On the other hand, a concern about the NSGA-III algorithm is that it uses a fixed rate for the mutation operator. To cope with this issue, this study introduces an adaptive mutation operator to enhance the performance of the standard NSGA-III algorithm. The proposed adaptive mutation operator strategy is evaluated using three crossover operators of NSGA-III, including simulated binary crossover (SBX), uniform crossover (UC) and single point crossover (SI). Subsequently, three improved NSGA-III algorithms (NSGA-III SBXAM, NSGA-III SIAM, and NSGA-III UCAM) are developed. These enhanced algorithms are then implemented to solve a number of big data optimization problems. Experimental results indicate that NSGA-III with UC and the adaptive mutation operator outperforms the other NSGA-III algorithms. (C) 2018 Elsevier B.V. All rights reserved.
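The abstract does not state the adaptation rule itself; the sketch below pairs a uniform crossover (UC) with one common way to make the mutation rate adaptive (a linear decay over generations), purely as an illustration of the operators being combined, not the paper's NSGA-III UCAM implementation.

```python
import numpy as np

rng = np.random.default_rng()

def uniform_crossover(p1, p2):
    """Uniform crossover (UC): each gene is taken from either parent with p = 0.5."""
    mask = rng.random(p1.shape) < 0.5
    return np.where(mask, p1, p2)

def adaptive_mutation(child, gen, max_gen, low, high, pm_start=0.2, pm_end=0.01):
    """Mutation whose per-gene rate decays linearly over the run.

    A linearly decreasing rate is only one common way to make the operator
    adaptive; the paper's exact rule is not given in the abstract.
    """
    pm = pm_start + (pm_end - pm_start) * gen / max_gen
    mask = rng.random(child.shape) < pm
    noise = rng.uniform(low, high, size=child.shape)
    return np.where(mask, noise, child)
```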