Genetic algorithms are now being used in new fields such as machine learning and job scheduling. Among these advancements, the most important is the building of applications employing problem-specific representations and operators, including heuristics. Several experiments reported in this paper were performed using GenET, a genetic algorithm system implementing different variants of the basic model. The system was designed to speed up the development of problem-specific genetic algorithm applications.
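To make the notion of a problem-specific representation concrete, the following minimal Python sketch evolves a job schedule using a permutation genome, a heuristic-seeded initial population, and a validity-preserving swap operator. GenET's actual interface is not reproduced here; the durations, due dates, and tardiness objective are illustrative assumptions.

```python
import random

DURATIONS = [7, 2, 9, 4, 6]      # hypothetical processing times
DUE_DATES = [10, 5, 30, 12, 20]  # hypothetical due dates

def total_tardiness(perm):
    """Fitness: summed lateness of jobs processed in the given order."""
    t, tardiness = 0, 0
    for job in perm:
        t += DURATIONS[job]
        tardiness += max(0, t - DUE_DATES[job])
    return tardiness

def heuristic_individual():
    """Heuristic seeding: shortest-processing-time order, lightly shuffled."""
    perm = sorted(range(len(DURATIONS)), key=lambda j: DURATIONS[j])
    i, j = random.sample(range(len(perm)), 2)
    perm[i], perm[j] = perm[j], perm[i]
    return perm

def swap_mutation(perm):
    """Problem-specific operator: a swap can never produce an invalid schedule."""
    child = list(perm)
    i, j = random.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]
    return child

pop = [heuristic_individual() for _ in range(20)]
for _ in range(50):
    pop.sort(key=total_tardiness)
    pop = pop[:5] + [swap_mutation(random.choice(pop[:5])) for _ in range(15)]

best = min(pop, key=total_tardiness)
print(best, total_tardiness(best))
```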
Genetic algorithms are exploratory procedures that are often able to locate near-optimal solutions to complex problems. To do this, a genetic algorithm maintains a set of trial solutions and forces them to evolve towards an acceptable solution. First, a representation for possible solutions must be developed. Then, starting from an initial random population, employing survival of the fittest, and exploiting old knowledge in the gene pool, each generation's ability to solve the problem should improve. This is achieved through a four-step process involving evaluation, reproduction, recombination, and mutation. As an application, the author developed a genetic algorithm to train a product neural network for predicting the optimum transistor width in a CMOS switch, given the operating conditions and desired conductance.
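The four-step cycle described above maps directly onto a short program. The sketch below runs evaluation, reproduction (tournament selection), recombination (one-point crossover), and mutation on the classic OneMax toy problem; the bitstring task and all parameter values are illustrative, not the author's CMOS application.

```python
import random

N_BITS, POP_SIZE, GENERATIONS, P_MUT = 20, 30, 50, 0.02

def fitness(bits):                  # step 1: evaluation
    return sum(bits)

def select(pop):                    # step 2: reproduction via 2-way tournament
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):              # step 3: recombination (one-point)
    cut = random.randrange(1, N_BITS)
    return p1[:cut] + p2[cut:]

def mutate(bits):                   # step 4: mutation (per-bit flip)
    return [b ^ 1 if random.random() < P_MUT else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP_SIZE)]

print(max(map(fitness, pop)))       # approaches N_BITS within a few dozen generations
```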
Genetic algorithms, invented by J.H. Holland, emulate biological evolution in the computer and try to build programs that can adapt by themselves to perform a given function. In some sense, they are analogous to neur...
This paper presents a new ensemble ecosystem model (EEM) which predicts the impacts of species reintroductions and optimises potential future management interventions at the Knepp Estate rewilding project, UK. Compared to other EEMs, Knepp has a relatively high level of data availability that can be used to constrain the model, including time-series abundance data and expert knowledge. This could improve the realism of outputs and enable more nuanced and context-specific management intervention recommendations. Calibrating EEMs can be challenging, however, and as the number of constraints increases, so does the complexity of the model-fitting process. We use a new Genetic Algorithm-Approximate Bayesian Computation (GA-ABC) approach wherein GA outputs are used to inform the prior distributions for ABC. To reduce the parameter search space, we fixed twelve parameters (the consumer self-interaction strengths α_i,i and negative growth rates) based on theoretical assumptions. While the GA-ABC method proved effective at efficiently searching the parameter space and optimising multiple constraints, it was computationally intensive and struggled to identify a broad range of outputs. Ultimately, this led to an ensemble of models with similar trajectories. Several potential ways to address this are discussed. Our results reinforce the findings of previous studies that the EEM methodology has potential for guiding conservation management and decision-making. Outputs suggest that reintroducing large herbivores was key to maintaining a diverse grassland-scrubland-woodland ecosystem, and optimisation experiments informed the species characteristics and stocking densities needed to achieve specific goals. Ultimately, refining the EEM methodology to improve calibration and facilitate the integration of additional data will enhance its utility for ecosystem management, helping to achieve more effective and informed outcomes.
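A minimal sketch of the two-stage GA-ABC idea follows: a GA first searches the parameter space, and its elite solutions then centre the prior for ABC rejection sampling. The one-parameter logistic growth model, the observed value, and the tolerances are toy stand-ins; the real EEM involves many interacting species and constraints.

```python
import random, statistics

OBSERVED = 0.8                  # hypothetical observed abundance after 10 steps

def simulate(r):
    """Logistic growth for 10 steps from a small initial abundance."""
    x = 0.1
    for _ in range(10):
        x += r * x * (1 - x)
    return x

def discrepancy(r):
    return abs(simulate(r) - OBSERVED)

# Stage 1: a simple mutation-only GA locates promising growth rates r.
pop = [random.uniform(0.0, 2.0) for _ in range(40)]
for _ in range(30):
    pop.sort(key=discrepancy)
    elite = pop[:10]
    pop = elite + [max(0.0, e + random.gauss(0, 0.05)) for e in elite for _ in range(3)]

# Stage 2: ABC rejection sampling with a prior centred on the GA elite.
mu, sigma = statistics.mean(elite), statistics.stdev(elite) + 1e-3
posterior = [r for r in (random.gauss(mu, sigma) for _ in range(5000))
             if discrepancy(r) < 0.01]
print(len(posterior), statistics.mean(posterior) if posterior else "no acceptances")
```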
This paper proposes a hybrid approach to warehouse management, combining Genetic Algorithm (GA) optimization for picking routes with Deep Neural Network (DNN) classification for order batching. The GA-DNN method significantly reduces travel distances while lowering computational time. Experimental results demonstrate up to a 69.01% reduction in travel distance in large warehouses, outperforming traditional methods. This study underscores the effectiveness of GA-DNN solutions in addressing modern logistics challenges.
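The GA half of such a pipeline can be sketched as route optimization over pick locations, as below. The coordinates, population settings, and 2-opt-style mutation are illustrative assumptions; the paper's DNN batching stage and warehouse layout are not reproduced.

```python
import math, random

PICKS = [(0, 0), (5, 2), (1, 7), (8, 8), (3, 4), (9, 1)]  # hypothetical coordinates

def route_length(order):
    """Total walking distance along the picking sequence."""
    return sum(math.dist(PICKS[a], PICKS[b]) for a, b in zip(order, order[1:]))

def mutate(order):
    """Reverse a random segment (2-opt style); the permutation stays valid."""
    i, j = sorted(random.sample(range(len(order)), 2))
    return order[:i] + order[i:j + 1][::-1] + order[j + 1:]

pop = [random.sample(range(len(PICKS)), len(PICKS)) for _ in range(30)]
for _ in range(200):
    pop.sort(key=route_length)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

best = min(pop, key=route_length)
print(best, route_length(best))
```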
Photovoltaic systems are among the most popular renewable energy sources due to their ease of installation and low operating costs. However, they are characterized by low efficiency, non-linear electrical properties, and sensitivity to the radiant intensity on the panels. To address these limitations, researchers have focused on improving the efficiency of these systems. The most effective method for enhancing the maximum power point of a PV array is reconfiguration, which involves rearranging the connection structures of the panels. This study presents a method for determining the reconfiguration of panels based on their radiant intensity using a genetic algorithm (GA). The method matches the rows of the PV array to achieve similar radiant intensities, thereby increasing power efficiency. For a PV array divided into two sections, adaptive and fixed, an algorithm was developed that enables the adaptive panels to connect to any row of the fixed section and controls this connection structure. This GA-based algorithm uses short-circuit currents obtained from specific points of the PV array to identify the most suitable connection structure within the solution space and generates control signals for reconfiguration. Simulation results with various array structures and shading scenarios demonstrate that the proposed method increases array efficiency and achieves results within a practically applicable cycle time.
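A simplified version of the matching step might look like the following: a GA searches for an assignment of adaptive panels to fixed rows that minimizes the spread of per-row irradiance totals. All irradiance values and the array shape are made up, and the sketch optimizes irradiance directly rather than inferring it from short-circuit currents as the paper does.

```python
import random

FIXED_ROWS = [900, 650, 800, 700]  # hypothetical irradiance (W/m^2) per fixed row
ADAPTIVE = [300, 150, 250, 100]    # hypothetical adaptive-panel irradiances

def row_spread(assignment):
    """Max-min spread of per-row totals; 0 would mean perfectly matched rows."""
    totals = list(FIXED_ROWS)
    for panel, row in zip(ADAPTIVE, assignment):
        totals[row] += panel
    return max(totals) - min(totals)

def mutate(assignment):
    child = list(assignment)
    child[random.randrange(len(child))] = random.randrange(len(FIXED_ROWS))
    return child

pop = [[random.randrange(len(FIXED_ROWS)) for _ in ADAPTIVE] for _ in range(20)]
for _ in range(100):
    pop.sort(key=row_spread)
    pop = pop[:5] + [mutate(random.choice(pop[:5])) for _ in range(15)]

best = min(pop, key=row_spread)
print(best, row_spread(best))
```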
Missing data is a persistent challenge in wastewater analysis, often leading to biased results and reduced accuracy. This study introduces an innovative Automated Machine Learning (AutoML) framework that combines deep learning-based variational autoencoders (VAEs) and genetic algorithms (GAs) to address this issue. VAEs are employed to impute missing values by learning latent data representations, while GAs optimize the VAE architecture and hyperparameters, including the size of the latent space. The framework is specifically designed to handle the complex and nonlinear relationships in wastewater datasets. The framework was trained and validated using data from a full-scale water resource recovery facility. The imputed data from the optimized VAE, developed using the GA-based AutoML framework, is then used to train predictive models. Experimental evaluations demonstrate the effectiveness of the proposed approach over traditional imputation methods. The results reveal that the models can accurately predict key variables such as ammonia nitrogen (NH4-N), nitrate nitrogen (NO3-N), pH, and biogas flow rate, using imputed data. The scalability and adaptability of this framework make it valuable for real-time wastewater monitoring and predictive analytics.
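The outer AutoML loop, in isolation, reduces to a GA over hyperparameter tuples. In the hedged sketch below, `validation_loss` is a synthetic placeholder standing in for "train the VAE, impute, and score on held-out entries"; the hyperparameter grid and the assumed optimum are illustrative.

```python
import random

def validation_loss(latent, hidden, lr):
    """Placeholder landscape: pretend latent=8, hidden=64, lr=1e-3 is best.
    In the real framework this would train a VAE and score its imputations."""
    return abs(latent - 8) + abs(hidden - 64) / 16 + abs(lr - 1e-3) * 1e3

def random_genome():
    return [random.choice([2, 4, 8, 16, 32]),         # latent dimension
            random.choice([16, 32, 64, 128]),         # hidden width
            random.choice([1e-4, 3e-4, 1e-3, 3e-3])]  # learning rate

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]  # uniform crossover

pop = [random_genome() for _ in range(20)]
for _ in range(30):
    pop.sort(key=lambda g: validation_loss(*g))
    elite = pop[:5]
    pop = elite + [crossover(random.choice(elite), random_genome())
                   for _ in range(15)]

best = min(pop, key=lambda g: validation_loss(*g))
print(best, validation_loss(*best))
```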
Inspired by physics-informed neural networks (PINNs), which inherit both the interpretability of physical laws and the efficient integration capability of machine learning, we propose a framework based on stoichiometric ablation for LIBS spectral normalization, encoding physical constraints between LIBS intensities and shockwave characteristics (temperature T_shock and pressure P) into optimization algorithms with multiple independent objectives, named physics-informed genetic algorithms (PIGAs). The framework is characterized by its applicability to a wider laser energy range, covering laser-induced breakdown to significant plasma shielding and spectral lines undergoing self-absorption, and it outperforms the widely used physical linear or multivariate data-driven normalization methods. The in-house end-to-end LAP-RTE codes serve as the benchmark to validate the physical reciprocal-logarithmic transformation and its extensibility to self-absorbed spectral lines for PIGAs. Next, experimental spectral lines are used statistically to validate PIGAs' correction effects; the median RSDs of spectral intensities can be effectively reduced by 85% (corrected by P) and 88% (corrected by T_shock) for 108 Fe I lines, while for 33 Fe II lines they are reduced by 77% (corrected by P) and 86% (corrected by T_shock). Seventeen self-absorption lines are also corrected effectively, with RSDs reduced by 78% (corrected by P) and 89% (corrected by T_shock). Our proposed idea of combining optimization methods to quantify unknown parameters in normalization strategies can also be extended to excavating the correlation between parameters in other low-temperature plasma fields with similar processes.
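Stripped of the LIBS-specific physics, the closing idea (using a GA to quantify unknown parameters of a normalization) can be sketched as follows. The power-law intensity model, the pressure values, and the noise level are stand-ins; the actual reciprocal-logarithmic transformation and shockwave characteristics of PIGAs are not reproduced.

```python
import random, statistics

P = [1.0, 1.4, 2.1, 2.9, 3.6]   # hypothetical shock pressures across shots
TRUE_K = 1.7                    # hidden exponent the GA should recover
INTENSITY = [5.0 * p ** TRUE_K * random.gauss(1, 0.02) for p in P]

def rsd_after_normalization(k):
    """Relative standard deviation of intensities normalized by P**k."""
    norm = [i / p ** k for i, p in zip(INTENSITY, P)]
    return statistics.stdev(norm) / statistics.mean(norm)

pop = [random.uniform(0.0, 3.0) for _ in range(20)]
for _ in range(40):
    pop.sort(key=rsd_after_normalization)
    elite = pop[:5]
    pop = elite + [e + random.gauss(0, 0.05) for e in elite for _ in range(3)]

best = min(pop, key=rsd_after_normalization)
print(best, rsd_after_normalization(best))
```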
This study employs a genetic algorithm to optimize the parameters of the Third Order Exponential Smoothing model for prediction on the real-time traffic datasets of the Numenta Anomaly Benchmark (NAB). The genetic algorithm process was executed with different population sizes and gene sets. In addition, a parameter sensitivity analysis was conducted, through which the number of genes and population size providing the best results within the specified range were determined. Moreover, a novel approach incorporating meta-optimization techniques is proposed to enhance the efficiency of the genetic algorithm optimization process, aiming to achieve improved accuracy in anomaly detection. The proposed methodology has been tested on various traffic data scenarios across different datasets to detect deviations critical to traffic management systems. Performance comparisons using the NAB scoring system demonstrate that the method developed in this study outperforms the majority of existing NAB algorithms, as well as the contemporary approaches of Isolation Forest, Multi-Layer Perceptron Regressor (MLPRegressor), and hybrid K-Nearest Neighbors - Gaussian Mixture Models (KNN + GMM), and is competitive with leading algorithms. The proposed approach achieved scores of 54.41 for 'Standard', 53.95 for 'reward_low_FP_rate', and 69.61 for 'reward_low_FN_rate', representing improvements of 3.67%, 4.45%, and 2.63%, respectively, over the average scores of the NAB algorithms. The findings indicate that the proposed approach not only detects anomalies with high precision but also dynamically adapts to changing data characteristics without requiring manual recalibration. This study proposes a robust traffic anomaly detection method that ensures reliable monitoring and potentially facilitates effective traffic management. The results of the study can be extended to other areas requiring real-time data monitoring and anomaly detection, offering a scalable solution.
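The core optimization can be illustrated by a GA tuning the three smoothing constants of additive triple exponential smoothing against one-step-ahead squared error, as sketched below on a synthetic seasonal series. The NAB data, the gene/population sensitivity analysis, and the meta-optimization layer are not reproduced.

```python
import math, random

SEASON = 12
SERIES = [10 + 0.1 * i + 3 * math.sin(2 * math.pi * i / SEASON) +
          random.gauss(0, 0.3) for i in range(120)]   # synthetic seasonal series

def one_step_sse(alpha, beta, gamma):
    """Sum of squared one-step-ahead errors for additive Holt-Winters."""
    level, trend = SERIES[0], 0.0
    seasonal = [0.0] * SEASON
    sse = 0.0
    for i in range(1, len(SERIES)):
        forecast = level + trend + seasonal[i % SEASON]
        sse += (SERIES[i] - forecast) ** 2
        prev_level = level
        level = alpha * (SERIES[i] - seasonal[i % SEASON]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seasonal[i % SEASON] = gamma * (SERIES[i] - level) + (1 - gamma) * seasonal[i % SEASON]
    return sse

def clip(x):
    return min(0.999, max(0.001, x))

pop = [[random.random() for _ in range(3)] for _ in range(30)]  # genes: alpha, beta, gamma
for _ in range(40):
    pop.sort(key=lambda g: one_step_sse(*g))
    elite = pop[:8]
    pop = elite + [[clip(p + random.gauss(0, 0.05)) for p in random.choice(elite)]
                   for _ in range(22)]

best = min(pop, key=lambda g: one_step_sse(*g))
print(best, one_step_sse(*best))
```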
Internet of Things (IoT) networks generate significant traffic, requiring careful monitoring due to the large number of connected devices and their continuous data communication. Edge servers can provide effective monitoring. However, effectively managing this traffic remains a major challenge due to the diversity of devices, unpredictable data fluctuations, and uneven server utilization. This paper proposes an innovative method to optimize load balancing across servers and ensure uniform traffic monitoring. We divide the traffic load by intelligently grouping machines so that each server monitors the same amount of traffic. Our approach uses a deep learning technique to anticipate future traffic variations and a genetic algorithm to intelligently distribute the load between servers according to the predicted variations. Simulation results demonstrate the effectiveness of our approach for adaptive traffic management in IoT networks.
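The allocation step alone can be sketched as follows: given per-device traffic predictions (produced by the deep learning model in the paper), a GA assigns devices to servers to minimize load variance. The traffic figures, server count, and GA settings are illustrative assumptions.

```python
import random, statistics

PREDICTED = [12, 7, 25, 3, 18, 9, 14, 6, 21, 11]  # hypothetical Mbps per device
N_SERVERS = 3

def imbalance(assignment):
    """Variance of the per-server monitored load; lower is more uniform."""
    loads = [0.0] * N_SERVERS
    for traffic, server in zip(PREDICTED, assignment):
        loads[server] += traffic
    return statistics.pvariance(loads)

def mutate(assignment):
    child = list(assignment)
    child[random.randrange(len(child))] = random.randrange(N_SERVERS)
    return child

pop = [[random.randrange(N_SERVERS) for _ in PREDICTED] for _ in range(30)]
for _ in range(150):
    pop.sort(key=imbalance)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

best = min(pop, key=imbalance)
print(best, imbalance(best))
```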