This research investigates the application of a novel Cheetah optimization (CO) algorithm for optimal load allocation in a standalone solar photovoltaic (PV) system integrated with an electric vehicle (EV) battery. Unlike conventional MPPT techniques such as Perturb & Observe (P&O), Grey Wolf optimization (GWO), and Flying Squirrel optimization (FSO), the CO algorithm, inspired by the hunting strategies of cheetahs, effectively balances exploration and exploitation of the solution space, leading to improved convergence and optimal load allocation. Simulation studies, validated by hardware-in-the-loop (HIL) testing using Opal-RT, demonstrated the superior performance of CO. CO achieved a 15% higher peak PV power (reaching 1085 W) than the best-performing alternative, with a tracking efficiency exceeding 99.997%. Load power delivery was enhanced by 20%, while the CO algorithm optimized battery charging with a 5% faster charging rate and a higher peak State-of-Charge (SOC). Furthermore, CO exhibited improved system stability, with a voltage deviation of only 0.65% and battery voltage deviation maintained within the allowable 2%. The system demonstrated low Total Harmonic Distortion (THD): 2.12% for PV power, 3.2% for load power, and 1.96% for battery power, significantly lower than the other methods. Statistical analysis, including comparisons of mean values and standard deviations, further confirmed the superiority of CO in terms of energy utilization and system stability. These findings highlight the potential of the CO algorithm as a highly effective solution for optimizing load allocation in standalone solar PV systems with EV battery integration, enabling improved energy utilization, reduced reliance on grid power, and enhanced system reliability.
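The abstract does not give the CO update equations, so the following is only a minimal sketch of the exploration-to-exploitation schedule it describes, applied to an MPPT duty-cycle search on a hypothetical single-peak PV power curve. The 1085 W peak is taken from the reported result; the curve shape, population size, and step schedule are all assumptions for illustration.

```python
import random

def pv_power(d):
    """Hypothetical PV power curve (W) vs. duty cycle; peaks near d = 0.6."""
    return 1085.0 * max(0.0, 1.0 - ((d - 0.6) / 0.35) ** 2)

def search_duty(pop_size=20, iters=100, seed=1):
    """Generic exploration/exploitation loop standing in for CO:
    early iterations favor random exploration of the duty-cycle range,
    later ones exploit the best-known point with a shrinking step."""
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    best = max(pop, key=pv_power)
    for t in range(iters):
        w = t / iters                      # shifts exploration -> exploitation
        new = []
        for d in pop:
            if rng.random() > w:           # explore: random jump
                cand = rng.random()
            else:                          # exploit: perturb around the best
                cand = best + rng.gauss(0, 0.1 * (1 - w))
            cand = min(1.0, max(0.0, cand))
            new.append(cand if pv_power(cand) > pv_power(d) else d)
        pop = new
        best = max(pop + [best], key=pv_power)
    return best, pv_power(best)
```

On this synthetic curve the loop settles close to the 1085 W peak; a real MPPT controller would of course evaluate measured PV power rather than an analytic curve.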
Missing data remain a common issue in real-world environments, leading to deviations in data analysis and mining. Therefore, to lessen the consequences of missing data caused by human error, missing data imputation must be applied during data processing. Traditional imputation models fail to satisfy evaluation requirements due to their poor stability and low accuracy, and their imputation accuracy degrades as the amount of missing information increases. Hence, in this research, an optimized missing data imputation model is proposed using a Socio-hawk optimization Deep Neural Network (DNN). The DNN extracts the important features from the data, in which the missing data are estimated under an arbitrary missing pattern. When the hyperparameters are tuned properly, the DNN's performance improves; the key here is the efficient training of the DNN using the suggested Socio-hawk optimization, which improves the imputation model's accuracy. To determine how well the suggested imputation model imputes missing data, it is compared to other methods. The paper's primary contribution is thus to effectively train the DNN using the suggested Socio-hawk optimization, reducing the error rate of the imputation model. The experimental evaluation shows that the proposed missing data imputation model attains high performance at 90%, with MAE, MSE, and MAPE values of 1.0595, 1.9919, and 0.9421, respectively.
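The reported MAE, MSE, and MAPE follow their standard definitions and can be computed as below; this sketch is independent of the Socio-hawk-trained DNN itself and simply compares ground-truth entries against imputed ones.

```python
def imputation_errors(true_vals, imputed_vals):
    """MAE, MSE and MAPE (in percent) between ground-truth values
    and the values an imputation model filled in for them."""
    n = len(true_vals)
    mae = sum(abs(t - p) for t, p in zip(true_vals, imputed_vals)) / n
    mse = sum((t - p) ** 2 for t, p in zip(true_vals, imputed_vals)) / n
    mape = sum(abs((t - p) / t) for t, p in zip(true_vals, imputed_vals)) / n * 100
    return mae, mse, mape
```

In practice these metrics are evaluated only on the entries that were artificially masked, so that a ground truth exists for each imputed value.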
Dry-wet cycles can cause significant deterioration of compacted loess and thus affect the safety of fill slopes. The discrete element method (DEM) can take into account the non-homogeneous, discontinuous, and anisotropic nature of the geotechnical medium, which is more capable of reflecting the mechanism and process of instability in slope stability analysis. Therefore, this paper proposes to use the DEM to analyze the stability of compacted loess slopes under dry-wet cycles. Firstly, to solve the complex calibration problem between macro and mesoscopic parameters in DEM models, an efficient parameter optimization method was proposed by introducing the chaotic particle swarm optimization with sigmoid-based acceleration coefficients algorithm (CPSOS). Secondly, during the parameter calibration, a new indicator, the bonding ratio (BR), was proposed to characterize the development of pores and cracks in compacted loess during dry-wet cycles, to reflect the impact of dry-wet action on the degradation of bonding between loess aggregates. Finally, according to the results of parameter calibration, the stability analysis model of compacted loess slope under dry-wet cycling was established. The results show that the proposed optimization calibration method can accurately reflect the trend of the stress-strain curve and strength of the actual test results under dry-wet cycles, and the BR also reflects the degradation effect of dry-wet cycles on compacted loess. The slope stability analysis shows that the DEM reflects the negative effect of dry-wet cycles on the safety factor of compacted loess slopes, as well as the trend of gradual stabilization with dry-wet cycles. The comparison with the finite element analysis results verified the accuracy of the discrete element slope stability analysis.
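As a rough illustration of the CPSOS idea, logistic-map chaotic initialization plus a sigmoid sweep of the acceleration coefficients, the sketch below calibrates two hypothetical normalized mesoscopic parameters against made-up macroscopic targets. The actual DEM objective function, parameter ranges, and coefficient schedule are not given in the abstract and are assumptions here.

```python
import math
import random

def cpsos(f, dim, iters=80, swarm=20, seed=3):
    """Sketch of chaotic PSO with sigmoid-based acceleration coefficients:
    swarm positions are seeded with the logistic map, and c1/c2 follow a
    sigmoid so the search shifts from cognitive (exploration) to social
    (exploitation) behavior over the run."""
    rng = random.Random(seed)
    # chaotic initialization via the logistic map x_{k+1} = 4 x_k (1 - x_k)
    x = rng.random()
    pos = []
    for _ in range(swarm):
        p = []
        for _ in range(dim):
            x = 4 * x * (1 - x)
            p.append(x)                    # normalized positions in [0, 1]
        pos.append(p)
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pos, key=f)[:]
    for t in range(iters):
        s = 1 / (1 + math.exp(-10 * (t / iters - 0.5)))  # sigmoid schedule
        c1, c2 = 2.5 - 2 * s, 0.5 + 2 * s                # cognitive -> social
        w = 0.9 - 0.5 * t / iters                        # decaying inertia
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

# hypothetical calibration: recover meso parameters that reproduce a
# "measured" macro response of (0.42, 0.77) -- illustrative numbers only
target = (0.42, 0.77)
err = lambda p: (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2
best = cpsos(err, dim=2)
```

In the paper the objective would compare DEM-simulated stress-strain curves against laboratory results rather than this toy quadratic misfit.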
Concrete frameworks require strong structural integrity to ensure their durability and performance. However, they are prone to developing cracks, which can compromise their overall quality. This research presents an innovative crack diagnosis algorithm for concrete structures that utilizes an optimized Deep Neural Network (DNN) called the Ridgelet Neural Network (RNN). The RNN model was tuned with a new advanced version of the Human Evolutionary optimization (AHEO) algorithm introduced in this study; the AHEO combines human intelligence and evolutionary principles to optimize the RNN model. To train the model, an image dataset was used, consisting of labeled images categorized as either "cracks" or "no-cracks". The AHEO algorithm was employed to refine the network's weights, adjust the output layer for binary classification, and enhance the dataset through stochastic rotational augmentation. The RNN/AHEO model's performance was evaluated using metrics such as accuracy, precision, recall, and F1-score, and compared to existing methods including CNN, CrackUnet, R-CNN, DCNN, and U-Net, achieving an accuracy of 99.665% and an F1-score of 99.035%. The results demonstrated that the RNN/AHEO model outperformed the other approaches in detecting concrete cracks. This innovative solution provides a robust method for maintaining the structural integrity of concrete frameworks.
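The evaluation metrics named above have standard definitions for the binary crack/no-crack task; a minimal sketch, with crack encoded as the positive class (1):

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall and F1 for crack (1) / no-crack (0) labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    acc = (tp + tn) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return acc, prec, rec, f1
```

The F1-score, the harmonic mean of precision and recall, is the more informative figure when crack images are much rarer than no-crack images, since plain accuracy can be inflated by the majority class.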
Purpose: This paper aims to improve the selection process of heavy machinery in construction projects by developing an advanced optimization technique using the Improved Particle Swarm optimization algorithm (IPSOM). The main objectives of the study are to optimize the key parameters of time and cost while ensuring adherence to predefined quality benchmarks, thereby facilitating more informed and balanced decision-making in construction projects. Design/methodology/approach: A rigorous methodology was applied to identify the relevant optimization parameters, combining a comprehensive literature review with consultations with industry experts. This approach identified the most influential factors affecting machinery selection, ensuring the model's applicability and relevance across different project scales and complexities. Unique to this study, the model's novelty lies in its advanced application of the IPSOM tailored to the construction industry's specific needs, offering a systematic approach to balancing time, cost and quality objectives. Findings: The model was validated through a detailed case study, which provided empirical evidence of its effectiveness in a real-world application. The study introduces a groundbreaking approach to optimizing equipment selection in highway construction, with the dual aims of minimizing costs and project duration while maintaining high-quality standards. The proposed model saved 53% of time, reduced costs by 30.8% and increased quality by 26.3%, outperforming traditional equipment selection methods. Originality/value: The case study analysis demonstrated the model's adaptability and potential as a crucial tool for decision-making in construction projects.
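The selection principle described, minimizing a weighted time-cost score subject to a quality benchmark, can be sketched as below. The option data, weights, and normalization to [0, 1] are illustrative assumptions; they are not the IPSOM itself, which searches the alternative space with a particle swarm rather than enumerating it.

```python
def select_equipment(options, w_time=0.5, w_cost=0.5, min_quality=0.8):
    """Pick the alternative minimizing a weighted time+cost score among
    those meeting the quality benchmark (criteria normalized to [0, 1])."""
    feasible = [o for o in options if o["quality"] >= min_quality]
    return min(feasible, key=lambda o: w_time * o["time"] + w_cost * o["cost"])

# hypothetical machinery alternatives (illustrative numbers only)
options = [
    {"name": "A", "time": 0.4, "cost": 0.6, "quality": 0.9},
    {"name": "B", "time": 0.2, "cost": 0.3, "quality": 0.7},   # fails quality
    {"name": "C", "time": 0.5, "cost": 0.4, "quality": 0.85},
]
best_option = select_equipment(options)
```

Note that option B is cheapest and fastest but is excluded by the quality constraint, which is exactly the trade-off the model is meant to enforce.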
Gasoline blending scheduling optimization can bring significant economic and efficiency benefits to refineries. However, the optimization model is complex and difficult to build, being a typical mixed integer nonlinear programming (MINLP) problem. Given the large scale of the MINLP model, the mixed integer linear programming-nonlinear programming (MILP-NLP) strategy is used to improve the efficiency of the solution. This paper uses linear blending rules plus a blending-effect correction to build the gasoline blending model, and a relaxed MILP model is constructed on this basis. A particle swarm optimization algorithm with niche technology (NPSO) is proposed to optimize the solution, and a high-precision soft-sensor method is used to calculate the deviation of gasoline attributes; the blending effect is dynamically corrected to ensure the accuracy of the blending effect and optimization results, thus forming a prediction-verification-reprediction closed-loop scheduling optimization strategy suitable for engineering applications. The optimization result of the MILP model provides a good initial point. By fixing the integer variables to the MILP optimal values, an approximate MINLP optimal solution can be obtained through an NLP solution. The above solution strategy has been successfully applied to an actual gasoline production case at a refinery (3.5 million tons per year), and the results show that the strategy is effective and feasible. The optimization results based on the closed-loop scheduling optimization strategy have higher accuracy. Compared with the standard particle swarm optimization algorithm, the NPSO algorithm improves optimization ability and efficiency to a certain extent, effectively reducing the blending cost while ensuring convergence speed.
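The "linear blending rules plus blending-effect correction" can be sketched as a volume-weighted mean of component properties shifted by a correction term. In the paper the correction is produced dynamically by the soft-sensor model; here it is a plain constant for illustration, and the octane numbers are made up.

```python
def blend_property(volumes, props, correction=0.0):
    """Linear blending rule plus an additive blending-effect correction:
    the blend property is the volume-weighted mean of component properties,
    shifted by a correction term (a constant here; in practice it would come
    from a soft-sensor model of the nonlinear blending effect)."""
    total = sum(volumes)
    linear = sum(v * p for v, p in zip(volumes, props)) / total
    return linear + correction

# e.g. a 50/50 blend of two components with octane numbers 90 and 94
blend_ron = blend_property([50, 50], [90, 94])
```

Keeping the model linear in the MILP stage and pushing the nonlinearity into a correction term is what makes the relaxed MILP tractable while the closed loop restores accuracy.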
In many research works, topical priorities of unvisited hyperlinks are computed by linearly integrating topic-relevant similarities of various texts with corresponding weighted factors. However, these weighted factors are determined from personal experience, so the resulting values may directly introduce serious deviations into the topical priorities of unvisited hyperlinks. To solve this problem, this paper proposes a novel focused crawler applying the cell-like membrane computing optimization algorithm (CMCFC). The CMCFC regards the set of weighted factors, corresponding to the contribution degrees of the similarities of the various texts, as one object, and utilizes evolution rules and communication rules in membranes to reach the optimal object corresponding to the optimal weighted factors, which minimize the root mean square error (RMSE) of the hyperlink priorities. It then linearly integrates the optimal weighted factors with the corresponding topical similarities of the various texts, computed using a Vector Space Model (VSM), to obtain the priorities of unvisited hyperlinks. The CMCFC thus obtains more accurate priorities for unvisited URLs to guide crawlers toward higher-quality web pages. The experimental results indicate that the proposed method improves the performance of focused crawlers by intelligently determining the weighted factors. In conclusion, the approach is effective and significant for focused crawlers. (c) 2013 Elsevier B.V. All rights reserved.
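The priority computation described, VSM similarities linearly combined with weighted factors, can be sketched as follows. The term-frequency vectors and the weight values are illustrative; in the paper the weights are the quantities the CMCFC optimizes.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two term-frequency vectors (VSM),
    given as {term: frequency} dictionaries."""
    terms = set(a) | set(b)
    dot = sum(a.get(t, 0) * b.get(t, 0) for t in terms)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def link_priority(weights, sims):
    """Priority of an unvisited hyperlink as the weighted linear
    combination of the similarities of its texts (e.g. anchor text,
    URL words, surrounding page text)."""
    return sum(w * s for w, s in zip(weights, sims))
```

A focused crawler would compute one similarity per text source against the topic vector and feed them, with the learned weights, into `link_priority` to order its frontier queue.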
To address the limited real-time performance of traditional traffic signal optimization methods, this paper introduces the grey system theory (GST) method into the real-time application of intelligent traffic signal optimization (ITSO). The deep Q-network (DQN) algorithm was used to set signal lights dynamically according to real-time traffic conditions, improving the overall operating efficiency of the traffic system, and the Proximal Policy Optimization (PPO) algorithm was used to overcome the lack of real-time responsiveness of traditional methods. Comparing the traffic congestion index of S city before and after applying the GST method, the average over the week before application was 60.1%, dropping to 26.6% afterwards. In the average-speed comparison, speeds after applying the GST method were generally higher than before, with an overall increase of about 20 km/h. The paper also emphasizes the importance of evaluating the robustness of the GST method, particularly its ability to manage unexpected scenarios, assessing four critical indicators: outlier handling, noise tolerance, handling of missing data, and nonlinear coping ability.
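As a toy stand-in for the DQN controller (the real agent uses a neural network over a much richer traffic state), tabular Q-learning on a hypothetical two-phase intersection shows the learning loop: the agent learns to give green to the approach with the longer queue. All states, rewards, and hyperparameters here are assumptions for illustration.

```python
import random

def train_signal_agent(episodes=300, seed=7):
    """Tabular Q-learning sketch for a two-phase signal:
    state  = which approach currently has the longer queue (0: N-S, 1: E-W),
    action = which approach receives the green phase.
    Serving the longer queue yields positive reward, so the optimal policy
    is action == state."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0], [0.0, 0.0]]          # Q[state][action]
    alpha, gamma, eps = 0.3, 0.9, 0.2     # learning rate, discount, exploration
    for _ in range(episodes):
        state = rng.randint(0, 1)
        for _ in range(20):
            if rng.random() < eps:        # epsilon-greedy action choice
                action = rng.randint(0, 1)
            else:
                action = max((0, 1), key=lambda a: Q[state][a])
            reward = 1.0 if action == state else -1.0  # green to longer queue
            nxt = rng.randint(0, 1)       # next congestion state (random toy dynamics)
            Q[state][action] += alpha * (reward + gamma * max(Q[nxt]) - Q[state][action])
            state = nxt
    return Q
```

A DQN replaces the small Q-table with a network so the state can include queue lengths, waiting times, and phase timers rather than a single binary indicator.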