With the rapid development of artificial intelligence, intelligent algorithms for nonlinear parameter mapping offer a new design approach that can reduce the long-standing reliance on empirical design in tunnel engineering. This paper proposes an intelligent model for predicting support structure parameters from tunnel background information. After comparing the characteristics of machine learning and deep learning algorithms applied in the intelligent design model, the resulting model is validated against tunnel deformation indicators. The results show that the overall accuracies of the machine learning (CLS-PSO-SVM) and deep learning (HRNet) algorithms are 81.1% and 88.5%, respectively. Taking the maximum deformation as the only output indicator, the prediction accuracies for vault and haunch deformations are 85.2% and 82.8%, respectively, verifying the reliability of the intelligent model. The results can provide theoretical support for the intelligent design of tunnel engineering, and intelligent design models are expected to move towards finer-grained prediction parameters in the future.
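As a rough illustration of the machine-learning side of such a pipeline, the sketch below runs a plain particle swarm over the C and gamma hyper-parameters of an SVM classifier on synthetic data; the paper's actual CLS-PSO-SVM and HRNet models, features, and data are not reproduced, so every name and range here is an assumption.

```python
# Minimal sketch: PSO search over SVM hyper-parameters (C, gamma).
# Synthetic data stands in for the tunnel background features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=8, n_informative=5, random_state=0)

def fitness(pos):
    # pos = [log10(C), log10(gamma)]; fitness = mean 5-fold CV accuracy
    clf = SVC(C=10 ** pos[0], gamma=10 ** pos[1])
    return cross_val_score(clf, X, y, cv=5).mean()

n_particles, n_iter = 12, 20
lo, hi = np.array([-2.0, -4.0]), np.array([3.0, 1.0])   # assumed search ranges
pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([fitness(p) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("best (C, gamma):", 10 ** gbest, "CV accuracy:", pbest_f.max())
```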
A fractional-order PID (FOPID) controller generalizes the standard PID controller using fractional calculus: two additional tunable variables, the differential order and the integral order, are introduced. The FOPID controller is more flexible, yields better responses, and, when precisely tuned, has a larger closed-loop stability region than a classical PID controller, but its design and stability analysis are more complicated. The optimal setting of its parameters is therefore very important. The standard firefly algorithm tends to converge to local optima and has low accuracy. To address this flaw, an improved chaotic firefly algorithm is proposed for designing the FOPID controller. To evaluate the performance of the proposed controller, it is used to control a CSTR system with a variety of fitness functions. Simulations confirm the optimal performance of the proposed controller.
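A minimal sketch of the tuning loop described above: a basic firefly search over the five FOPID parameters (Kp, Ki, Kd, lambda, mu). The CSTR closed-loop fitness is replaced by a stand-in quadratic cost and the chaotic-map improvement is only indicated in a comment, so the numbers and bounds are illustrative assumptions rather than the paper's setup.

```python
# Minimal sketch of a firefly search tuning FOPID gains (Kp, Ki, Kd, lambda, mu).
import numpy as np

rng = np.random.default_rng(1)
dim, n_fireflies, n_iter = 5, 15, 50
lo = np.array([0.0, 0.0, 0.0, 0.1, 0.1])       # assumed lower bounds
hi = np.array([10.0, 10.0, 10.0, 1.5, 1.5])     # assumed upper bounds

def cost(theta):
    # Placeholder cost: distance to an assumed "good" gain vector.
    # In the paper this would be a closed-loop performance index of the CSTR.
    target = np.array([2.0, 1.0, 0.5, 0.9, 1.1])
    return np.sum((theta - target) ** 2)

x = rng.uniform(lo, hi, size=(n_fireflies, dim))
f = np.array([cost(xi) for xi in x])
beta0, gamma, alpha = 1.0, 1.0, 0.2             # attractiveness / step parameters

for _ in range(n_iter):
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if f[j] < f[i]:                      # j is brighter (lower cost): i moves toward j
                r2 = np.sum((x[i] - x[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                step = alpha * (rng.random(dim) - 0.5) * (hi - lo)
                # A chaotic sequence (e.g., logistic map) could replace rng here,
                # which is the improvement the abstract refers to.
                x[i] = np.clip(x[i] + beta * (x[j] - x[i]) + step, lo, hi)
                f[i] = cost(x[i])

best = x[f.argmin()]
print("tuned [Kp, Ki, Kd, lambda, mu]:", np.round(best, 3), "cost:", f.min())
```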
Resource scarcity has created strict requirements for efficient wood processing. Multi-spot pressure tensioning can effectively stabilize the sawblade and reduce cutting losses, but it is difficult to apply in practice due to the complexity of the process. In this study, an elastic-plastic solid model of multi-spot pressure tensioning was established, simplified to a thermal-expansion shell model, and the mapping relationship between the two models was determined. The feasibility of the mapping method was verified by experiments. A backpropagation neural network (BPNN) was trained on a database of 8160 working conditions, and its relative error was found to be less than 5%. Indenter displacement, the number of pressing points, and indenter radius were shown to change the degree of tensioning to varying extents. The pressing-point-distribution radius determines the direction in which sawblade performance develops, and increasing the number of pressing-point circles widens the adjustment range for sawblade performance. Both sawblade performance and tensioning energy consumption can be optimized by a genetic algorithm (GA), yielding optimal process parameters for different applications. The combination of the finite element method, the BPNN, and the GA can effectively optimize the multi-spot pressure tensioning process and improve sawblade performance.
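The sketch below illustrates the BPNN-surrogate-plus-GA idea under stated assumptions: a small MLP is fitted to synthetic data standing in for the 8160-case FEM database, and a toy genetic loop searches the three process parameters against that surrogate. Variable names, ranges, and the fitness trade-off are hypothetical.

```python
# Minimal sketch: BPNN surrogate of the FEM tensioning model plus a small GA
# searching the process parameters. Synthetic data replaces the FEM database.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
# Assumed features: indenter displacement (mm), pressing-point count, indenter radius (mm)
low, high = [0.1, 4, 5], [1.0, 24, 20]
X = rng.uniform(low, high, size=(2000, 3))
y = 0.6 * X[:, 0] + 0.02 * X[:, 1] + 0.01 * X[:, 2] + 0.05 * rng.standard_normal(2000)

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0).fit(X, y)

def fitness(p):
    # Maximise predicted tensioning degree while penalising "energy" ~ displacement.
    return surrogate.predict(p.reshape(1, -1))[0] - 0.3 * p[0]

# Toy GA: selection of the best 10, arithmetic crossover, Gaussian mutation.
# (The pressing-point count is treated as continuous for simplicity.)
pop = rng.uniform(low, high, size=(30, 3))
for _ in range(40):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]
    pa = parents[rng.integers(0, 10, 30)]
    pb = parents[rng.integers(0, 10, 30)]
    w = rng.random((30, 1))
    children = w * pa + (1 - w) * pb + 0.05 * rng.standard_normal((30, 3))
    pop = np.clip(children, low, high)

best = pop[np.argmax([fitness(p) for p in pop])]
print("suggested [displacement, points, radius]:", np.round(best, 2))
```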
Rock cuttability has a great influence on the rock excavation efficiency of a tunnel boring machine (TBM). In order to evaluate rock cuttability in real time, quickly, accurately and efficiently during TBM excavation, the relevant excavation parameters of the Zagros, Kerman and Bazideraz tunnels were first collected. Regression analyses between excavation parameters and rock cuttability were then carried out. The two-dimensional regression analyses examined the relationship between the operating parameters (thrust F and rotation speed RPM) and the characterization parameters (torque T and penetration rate PR). Three-dimensional regression analyses were used to build models of PR and specific energy SE from the operating parameters. The results show that the three-dimensional regression models have good prediction performance and outperform the two-dimensional models. Moreover, a prediction model for uniaxial compressive strength UCS and a classification model for rock cuttability were established based on SE. Rock cuttability is divided into three levels, namely easy (level 1), medium (level 2) and poor (level 3), whose corresponding SE ranges are 0 to 6, 6 to 10, and above 10 kWh·m⁻³, respectively. Finally, intelligent algorithms combined with excavation parameters were introduced to establish the UCS prediction model and the rock cuttability classification model, and good prediction performance was achieved. These studies provide references and ideas for the real-time, rapid, accurate and effective evaluation of rock cuttability from TBM excavation parameters and have guiding significance for engineering application.
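A small helper reflecting the SE-based classification reported above; the thresholds of 6 and 10 kWh·m⁻³ come from the abstract, while computing SE itself from machine data is outside this sketch.

```python
def cuttability_level(se_kwh_per_m3: float) -> str:
    """Map specific energy SE (kWh/m^3) to the three levels given in the study."""
    if se_kwh_per_m3 <= 6:
        return "level 1 (easy)"
    if se_kwh_per_m3 <= 10:
        return "level 2 (medium)"
    return "level 3 (poor)"

print(cuttability_level(7.3))  # -> level 2 (medium)
```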
Understanding the loss parameters of piezoelectric materials is crucial for designing effective piezoelectric sensors. Traditional elastic loss parameter measurement techniques rely mainly on three methods: 3 dB bandwidth, impedance fitting, and ultrasonic attenuation. However, the elastic losses obtained through these methods are constant and frequency-independent, which does not align with the actual vibration characteristics of piezoelectric materials. A fast, accurate, and frequency-dependent method for obtaining the elastic loss of piezoelectric materials is therefore needed. This paper introduces an approach that uses intelligent algorithms to fit impedance curves and calculate elastic loss parameters. A frequency-dependent second-order energy loss model for piezoelectric materials is established, and a genetic algorithm is introduced to obtain the optimal elastic loss parameters. The results demonstrate high consistency between theoretical and experimental impedances, with an error of less than 5%. The elastic loss parameters obtained through intelligent algorithm-based impedance curve fitting match stress experiment results well, with an error of less than 6%. This method provides a rapid, accurate, and cost-effective way to obtain frequency-dependent second-order elastic loss parameters for piezoelectric materials.
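A minimal sketch of evolutionary impedance-curve fitting, assuming the standard Butterworth-Van Dyke equivalent circuit as a stand-in for the paper's frequency-dependent second-order loss model and scipy's differential evolution in place of the genetic algorithm; the frequency band and circuit values are synthetic.

```python
# Minimal sketch: fit an impedance magnitude curve with an evolutionary search.
import numpy as np
from scipy.optimize import differential_evolution

f = np.linspace(0.9e6, 1.1e6, 400)                 # Hz, around an assumed resonance
w = 2 * np.pi * f

def bvd_impedance(p, w):
    # Butterworth-Van Dyke circuit: motional branch R1-L1-C1 in parallel with C0.
    R1, L1, C1, C0 = p
    z_motional = R1 + 1j * w * L1 + 1 / (1j * w * C1)
    return 1 / (1j * w * C0 + 1 / z_motional)

true_p = (50.0, 10e-3, 2.5e-12, 1e-9)              # synthetic "measurement" parameters
z_meas = np.abs(bvd_impedance(true_p, w))

def misfit(p):
    # Log-magnitude error between model and "measured" impedance curve.
    return np.mean((np.log10(np.abs(bvd_impedance(p, w))) - np.log10(z_meas)) ** 2)

bounds = [(1, 500), (1e-3, 50e-3), (0.5e-12, 10e-12), (0.1e-9, 5e-9)]
res = differential_evolution(misfit, bounds, seed=0, tol=1e-10)
print("fitted [R1, L1, C1, C0]:", res.x)
```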
The transmission capacity of the power grid has grown over the past few decades to meet increasing electrical load demand, which aggravates the severity of short-circuit current (SCC) exceeding the interruption capability of circuit breakers. To limit SCC, this study proposes a bi-objective optimisation model to determine the optimal transmission-line tripping scheme. The objectives are maximal effectiveness in limiting SCC and minimal adverse effect on voltage stability, which are evaluated by the sensitivities of SCC and of the voltage stability margin to the tripped lines. Intelligent algorithms have been applied to solve similar multi-objective problems, but such optimisation methods cannot meet real-time requirements because the full-dimensional set of decision variables covering all transmission lines is considered in the optimisation procedure. A coordinated optimisation algorithm is therefore adopted to reach the optimal solution accurately and quickly. Case studies on the IEEE 39-bus system and a real-world 500 kV network in Guangdong province, China verify the effectiveness and practicality of the proposed method.
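To illustrate the bi-objective trade-off in a minimal way, the sketch below keeps the Pareto-optimal tripping candidates given two precomputed per-line sensitivities (SCC reduction and voltage-margin loss); the sensitivity values are placeholders, not results from any network model.

```python
# Minimal sketch: Pareto filtering of candidate tripping lines.
# For each line: (SCC-reduction sensitivity: larger is better,
#                 voltage-margin loss:       smaller is better).
candidates = {
    "line_1": (0.8, 0.30),
    "line_2": (0.6, 0.10),
    "line_3": (0.9, 0.45),
    "line_4": (0.5, 0.05),
    "line_5": (0.7, 0.35),   # dominated by line_1
}

def dominated(a, b):
    # b dominates a if it reduces SCC at least as much with no larger margin loss.
    return b[0] >= a[0] and b[1] <= a[1] and b != a

pareto = [k for k, v in candidates.items()
          if not any(dominated(v, u) for u in candidates.values())]
print("Pareto-optimal tripping candidates:", pareto)
```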
ISBN (print): 9781424409723
The Traveling Salesman Problem (TSP) is a typical combinatorial optimization problem. Because of its complexity, exact computation cannot find a global optimal solution in reasonable time, or at all, for large instances. By analyzing the relationship between global solutions and the local optimal solutions computed by TSP algorithms, it is found that the union of the edge sets of multiple high-quality local optimal solutions can contain all edges of a global optimal solution. A method for constructing an initial edge set for the TSP is put forward based on this statistical principle. The new method greatly reduces the search space of the original problem: the size of the initial edge set is roughly twice the problem scale. Exact algorithms can find global optimal solutions for small-scale TSP instances based on the new edge sets, and the efficiency of stochastic search algorithms is greatly improved.
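A rough sketch of the edge-set idea under stated assumptions: several 2-opt local optima are generated from random starts on a random Euclidean instance, and the union of their edges forms the reduced candidate set; the instance, the 2-opt heuristic, and the number of restarts are all illustrative choices.

```python
# Minimal sketch: build a candidate edge set from the union of several 2-opt local optima.
import numpy as np

rng = np.random.default_rng(3)
n = 30
pts = rng.random((n, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

def two_opt(tour):
    # Repeatedly reverse segments while a crossing pair of edges can be improved.
    improved = True
    while improved:
        improved = False
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % n]
                if dist[a, c] + dist[b, d] < dist[a, b] + dist[c, d] - 1e-12:
                    tour[i:j + 1] = tour[i:j + 1][::-1]
                    improved = True
    return tour

edge_union = set()
for _ in range(10):                               # ten local optima from random starts
    tour = two_opt(list(rng.permutation(n)))
    edge_union |= {frozenset((tour[k], tour[(k + 1) % n])) for k in range(n)}

print(f"candidate edges: {len(edge_union)} of {n * (n - 1) // 2} "
      f"(~{len(edge_union) / n:.1f} x problem size)")
```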
ISBN (print): 9789811501999; 9789811501982
The accumulation of data and its inhomogeneous distribution lead to the problem of solving large, sparse systems in various fields such as industry and emergency management. Complex structure in the data error creates an additional risk for obtaining an adequate solution. To facilitate problem-solving, we describe a technique based on intelligent division of the data, followed by the application of a clustering algorithm and a modification of Gaussian elimination to the different portions of the data. In this paper, we present results of the developed technique applied to samples of synthetic and real data and compare them with the outcomes of other (intelligent and classical) algorithms using numerical estimates and graphical representations.
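A loose sketch of the general workflow (cluster the unknowns, permute the sparse system into near-block form, then factorise), assuming a synthetic diagonally dominant matrix and off-the-shelf k-means and sparse LU; the paper's modified elimination and error handling are not reproduced.

```python
# Minimal sketch: cluster-driven reordering of a sparse linear system before factorisation.
import numpy as np
from scipy.sparse import random as sparse_random, eye, csc_matrix
from scipy.sparse.linalg import splu
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
n = 500
A = (sparse_random(n, n, density=0.01, random_state=4) + 10 * eye(n)).tocsc()
b = rng.standard_normal(n)

# Cluster unknowns by their sparsity pattern, then group them cluster by cluster.
pattern = (np.abs(A.toarray()) > 0).astype(float)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pattern)
perm = np.argsort(labels)

A_perm = csc_matrix(A[perm][:, perm])             # symmetric permutation P A P^T
x_perm = splu(A_perm).solve(b[perm])
x = np.empty(n)
x[perm] = x_perm                                  # undo the permutation

print("residual norm:", np.linalg.norm(A @ x - b))
```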
In recent years, with the rapid development of China's construction industry, various safety accidents caused by it have occurred frequently, causing huge losses to people's lives and property. In order to red...
The application of machine learning (ML) is promising for overcoming the difficulty of predicting the adsorption of various organic pollutants on carbonaceous materials. This study highlights how ML advances adsorption research and emphasizes robust model construction for the various application scenarios of adsorption models. We introduce, for the first time, a systematic data-preparation workflow tailored to adsorption studies. Emphasis is given to key challenges in data preparation, including managing adsorption datasets, preventing data leakage, and choosing descriptors wisely. The algorithms used in 39 previous related studies were included in a statistical analysis, and the applications of emerging algorithms in adsorption are prospected. For data-driven models, importance analysis is beneficial for understanding adsorption mechanisms, turning black-box models into glass-box ones; it facilitates the identification of the primary features governing the adsorption of distinct emerging contaminants and the optimized design of efficient carbonaceous adsorbents. In addition, this review offers prospects for advanced ML applications in adsorption research, such as integration with reinforcement learning policies, and explores the potential of ML for handling the complexities of multi-component adsorption. In sum, this review offers unprecedented illumination of the opportunities and challenges posed by ML in aqueous adsorption processes.
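As a small illustration of the importance-analysis step mentioned above, the sketch below fits a tree ensemble to synthetic adsorbent/adsorbate descriptors and ranks them by permutation importance; the descriptor names, data, and model choice are assumptions, not the review's datasets.

```python
# Minimal sketch: "glass-box" feature ranking for an adsorption-capacity model.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 600
X = pd.DataFrame({
    "surface_area_m2_g": rng.uniform(200, 1500, n),   # hypothetical descriptors
    "logKow": rng.uniform(-1, 6, n),
    "pore_volume_cm3_g": rng.uniform(0.1, 1.2, n),
    "solution_pH": rng.uniform(3, 10, n),
})
# Synthetic adsorption capacity driven mainly by surface area and hydrophobicity.
y = 0.01 * X["surface_area_m2_g"] + 2.0 * X["logKow"] + rng.normal(0, 1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

imp = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, score in sorted(zip(X.columns, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name:>22s}: {score:.3f}")
```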