Reliable measurements of effluent quality are important for different operational tasks such as process monitoring, online simulation, and advanced control in the wastewater treatment process (WWTP). A kernel principal component analysis (KPCA) and extreme learning machine (ELM) based ensemble soft sensing model for effluent quality prediction is proposed. KPCA is used to extract nonlinear features from the input space to overcome high dimensionality and collinearity. The ELM algorithm is inserted into the ensemble frame as the component model, since ELM runs much faster and provides better generalization performance than other popular learning algorithms. The average output of all the ELM components in the ensemble is the final estimate of the effluent quality index. Simulation results using industrial process data show that the KPCA- and ELM-based ensemble soft sensor outperforms both the single ELM and the plain ELM ensemble model in reliability and accuracy. (C) 2011 Published by Elsevier Ltd. Selection and/or peer-review under responsibility of [CEIS 2011]
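A minimal sketch of the idea, assuming scikit-learn's KernelPCA for the feature-extraction step and a hand-rolled extreme learning machine (the paper's own ELM configuration is not given here); all function and variable names are illustrative:

```python
# Sketch of a KPCA + ELM ensemble soft sensor: KPCA compresses the
# correlated process inputs, and the averaged ELM members predict the
# effluent quality index.
import numpy as np
from sklearn.decomposition import KernelPCA

class ELM:
    """Single-hidden-layer network with random, untrained input weights
    and a least-squares output layer (the core ELM idea)."""
    def __init__(self, n_hidden=50, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng if rng is not None else np.random.default_rng()

    def fit(self, X, y):
        n_features = X.shape[1]
        # Input weights and biases are drawn at random and never trained.
        self.W = self.rng.standard_normal((n_features, self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)        # hidden-layer outputs
        # Output weights solved in one shot via the pseudo-inverse.
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

def fit_kpca_elm_ensemble(X, y, n_components=5, n_models=10):
    """Extract nonlinear features with KPCA, then train an ensemble of
    ELMs that differ only in their random hidden-layer weights."""
    kpca = KernelPCA(n_components=n_components, kernel="rbf").fit(X)
    Z = kpca.transform(X)
    models = [ELM(rng=np.random.default_rng(i)).fit(Z, y)
              for i in range(n_models)]
    return kpca, models

def predict_ensemble(kpca, models, X):
    """Ensemble output is the average of the members, as in the abstract."""
    Z = kpca.transform(X)
    return np.mean([m.predict(Z) for m in models], axis=0)
```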
The aim of this work is to design a fault-detection and identification system for an industrial-scale Ultrafiltration process and to propose the adopted methodology to other membrane applications. The model was create...
Detailed information about the objects, processes and phenomena of transport infrastructure is needed to exploit the opportunities inherent in modern methods, mathematical models, algorithms and control systems of urb...
ISBN:
(Print) 9798350382662; 9798350382655
The development of next-generation battery management systems needs models with enhanced performance to enable advanced control, diagnostic, and prognostic techniques for improving the safety and performance of lithium-ion battery systems. Specifically, battery models must deliver efficient and accurate predictions of physical internal states and output voltage, despite the inevitable presence of various system uncertainties. To facilitate this, we propose a lightweight hybrid modeling framework that couples a high-fidelity physics-based electrochemical battery model with a computationally-efficient Gaussian process regression (GPR) machine learning model to predict and compensate for errors in the electrochemical model output. This is the first time that GPR has been implemented to predict the output residual of an electrochemical battery model, which is significant for the following reasons. First, we demonstrate that GPR is capable of considerably improving output prediction accuracy, as evidenced by an observed average root-mean-square prediction error of 7.3 mV across six testing profiles, versus 119 mV for the standalone electrochemical model. Second, we employ a data sampling procedure to exhibit how GPR can use sparse training data to deliver accurate predictions at minimal computational expense. Our framework yielded a ratio of computation time to modeled time of 0.003, indicating ample suitability for online applications.
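A minimal sketch of the residual-compensation step, assuming scikit-learn's GaussianProcessRegressor; the physics model output and feature choice are hypothetical stand-ins for the electrochemical model and its inputs:

```python
# Sketch of hybrid residual learning: a GPR is trained on the error of a
# physics-based voltage model, then added back at prediction time.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def train_residual_gpr(features, v_measured, v_physics):
    """Fit a GPR to the physics-model error so it can be compensated.
    features: (n_samples, n_features) model inputs, e.g. current and SOC
    (illustrative choice, not the paper's)."""
    residual = v_measured - v_physics
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-4)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gpr.fit(features, residual)
    return gpr

def hybrid_voltage(gpr, features, v_physics):
    """Physics prediction plus the learned residual correction, with the
    GPR's predictive standard deviation as an uncertainty estimate."""
    correction, std = gpr.predict(features, return_std=True)
    return v_physics + correction, std
```

Because GPR training cost grows cubically with the number of points, subsampling the training set, in the spirit of the data sampling procedure mentioned above, is what keeps a scheme like this cheap enough for online use.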
The CLAMM model provides a simulation of the growth and movement of Microcystis, including temperature correction of the main process rates; loss of colonies; net chlorophyll-a production; changes in colony size; and wind-induced lake mixing. This paper presents the results of the sensitivity analysis and a trial application of the CLAMM model. Sensitivity analysis was undertaken using the "general sensitivity analysis" procedure. Results indicate a high degree of interaction between model parameters, with no single parameter appearing dominant. Application of the model to a lowland impoundment in Southern England for 1994 and 1995 produced results that compared reasonably well with field data.
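A minimal sketch of one common Monte Carlo formulation of general sensitivity analysis (behavioural/non-behavioural classification with a Kolmogorov-Smirnov separation score); `run_model` and the behavioural criterion are hypothetical stand-ins for CLAMM and its acceptance test:

```python
# Sketch of general sensitivity analysis: sample parameters, split runs
# into behavioural and non-behavioural sets, and rank parameters by how
# differently the two sets are distributed along each parameter axis.
import numpy as np
from scipy.stats import ks_2samp

def general_sensitivity(run_model, bounds, is_behavioural,
                        n_runs=1000, seed=0):
    """bounds: dict of name -> (low, high) uniform parameter ranges.
    Returns a dict of name -> KS statistic; larger means more sensitive."""
    rng = np.random.default_rng(seed)
    names = list(bounds)
    lo = np.array([bounds[k][0] for k in names])
    hi = np.array([bounds[k][1] for k in names])
    samples = rng.uniform(lo, hi, size=(n_runs, len(names)))
    flags = np.array([is_behavioural(run_model(dict(zip(names, s))))
                      for s in samples])
    scores = {}
    for j, name in enumerate(names):
        # A large KS distance means behavioural runs occupy a distinct
        # region of this parameter's range, i.e. the parameter matters.
        stat, _ = ks_2samp(samples[flags, j], samples[~flags, j])
        scores[name] = stat
    return scores
```

Strong parameter interactions of the kind reported above show up here as uniformly small KS statistics: no single marginal distribution separates the behavioural runs.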
The relevance of the research is due to the need to stabilize the composition of the melting products of copper-nickel sulfide raw materials in the Vanyukov furnace. The goal of this research is to identify the most suitable methods for aggregating real-time data when developing a mathematical model for control of the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace. Statistical methods for analyzing historical data from the real technological object, together with correlation analysis of the process parameters, are described. The factors that exert the greatest influence on the main output parameter (copper content in matte) and govern the physical-chemical transformations are revealed. An approach to processing the real-time data for the development of a control model of the melting process is proposed, and the stages of processing the real-time information are considered. The adopted methodology for aggregating data suitable for developing such a control model allows the obtained results to be interpreted for further practical application.
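A minimal sketch of one aggregation-and-correlation step of the kind described, assuming pandas and a time-indexed table of process tags; the column names and smoothing window are hypothetical:

```python
# Sketch: resample noisy real-time tags to a common interval, then rank
# process parameters by correlation with the main output parameter.
import pandas as pd

def rank_influences(df: pd.DataFrame, target="cu_in_matte",
                    window="10min"):
    """df: DataFrame indexed by timestamp, one column per process tag.
    Returns tags sorted by absolute Pearson correlation with `target`."""
    # Aggregation step: average within fixed windows, fill short gaps.
    smoothed = df.resample(window).mean().interpolate()
    corr = smoothed.corr()[target].drop(target)
    return corr.abs().sort_values(ascending=False)
```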
ISBN:
(Print) 9781467345392; 9781467345378
The analysis of several algorithms and data structures can be reduced to the analysis of the following greedy "peeling" process: start with a random hypergraph; find a vertex of degree at most k, and remove it and all of its adjacent hyperedges from the graph; repeat until there is no suitable vertex. This specific process finds the k-core of a hypergraph, and variations on this theme have proven useful in analyzing, for example, decoding of low-density parity-check codes, several hash-based data structures such as cuckoo hashing, and algorithms for satisfiability of random formulae. This approach can be analyzed in several ways, two common approaches being via a corresponding branching process or a fluid-limit family of differential equations. In this paper, we make note of an interesting aspect of these types of processes: the results are generally the same when the randomness is structured in the manner of double hashing. This phenomenon allows us to use less randomness and simplify the implementation of several hash-based data structures and algorithms. We explore this approach from both an empirical and theoretical perspective, examining theoretical justifications as well as simulation results for specific problems.
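A minimal sketch of the peeling process, with each key's hyperedge generated by double hashing (positions h1(x) + i·h2(x) mod m); the hash construction here is an illustrative stand-in, not the paper's:

```python
# Sketch: build hyperedges via double hashing, then greedily peel
# vertices of degree <= k; whatever survives is the k-core.
import hashlib
from collections import defaultdict

def _hashes(x, m):
    """Two hash values per key, derived from one digest (illustrative)."""
    d = hashlib.sha256(str(x).encode()).digest()
    h1 = int.from_bytes(d[:8], "big") % m
    h2 = int.from_bytes(d[8:16], "big") % m or 1   # step must be nonzero
    return h1, h2

def peel(keys, m, edge_size=3, k=1):
    """Returns the hyperedges remaining after peeling; an empty dict
    means the peeling succeeded completely (empty k-core)."""
    edges = {}
    for x in keys:
        h1, h2 = _hashes(x, m)
        # Double hashing: the whole edge comes from just two hash values.
        edges[x] = [(h1 + i * h2) % m for i in range(edge_size)]
    incident = defaultdict(set)
    for x, vs in edges.items():
        for v in vs:
            incident[v].add(x)
    queue = [v for v in incident if len(incident[v]) <= k]
    while queue:
        v = queue.pop()
        for x in list(incident[v]):
            for u in edges.pop(x):             # remove edge everywhere
                incident[u].discard(x)
                if 0 < len(incident[u]) <= k:
                    queue.append(u)
    return edges
```

The point of the abstract is that replacing the fully random edge above with this two-value construction generally leaves the peeling threshold unchanged.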
ISBN:
(Print) 9783033039629
With the evolution of sensing technologies and the increasing use of advanced process control techniques, terabytes of data are recorded today during the manufacturing process of semiconductor devices. These large amounts of data are then operated on by Fault Detection and Classification (FDC) systems to assess the overall condition of production equipment. However, specific characteristics of semiconductor manufacturing, such as highly correlated parameters, time-varying behaviors, or the large number of operating conditions, tend to limit the efficiency of current indicators in detecting and diagnosing a failure occurrence. There is therefore a significant requirement for the development and application of new methodologies that improve detection efficiency while reducing the complexity of condition monitoring, without losing the detailed insight needed for efficient failure analysis. In this paper, we use data pretreatment algorithms from signal processing and time series analysis, together with Multiway Principal Component Analysis (MPCA), to accurately represent equipment behavior and process dynamics and thus overcome issues inherent to the semiconductor manufacturing context. A real-case application on a plasma etcher from the STMicroelectronics Rousset 8-inch fab is presented to highlight the benefits of these methods.
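A minimal sketch of the MPCA step, assuming batch-wise unfolding of (runs × variables × time) traces and scikit-learn's PCA; the pretreatment algorithms and monitoring thresholds described above are omitted:

```python
# Sketch of multiway PCA for equipment monitoring: unfold each run's
# sensor traces into one row, fit PCA on healthy runs, then score new
# runs with the squared prediction error (SPE).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def fit_mpca(traces, n_components=3):
    """traces: array of shape (n_runs, n_vars, n_time) from healthy runs."""
    n_runs = traces.shape[0]
    X = traces.reshape(n_runs, -1)            # batch-wise unfolding
    scaler = StandardScaler().fit(X)
    pca = PCA(n_components=n_components).fit(scaler.transform(X))
    return scaler, pca

def spe(scaler, pca, traces):
    """Squared prediction error per run; large values flag runs the
    healthy-behavior model cannot reconstruct, i.e. candidate faults."""
    X = scaler.transform(traces.reshape(traces.shape[0], -1))
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)
```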
An undergraduate laboratory has recently been developed to expose junior mechanical engineering students to the concept, components and operation of data acquisition boards, and to PC-based data acquisition, analysis and control. The laboratory, which has been taught during the last two fall semesters, consists of the compression testing of golf ball cores using a Riehle compression tester, and analysis of the collected data using digital signal processing techniques and statistical process control (SPC). Student feedback to date suggests the labs have had a positive impact on student learning and underlines the importance of including the topic of PC-based data acquisition and analysis in an undergraduate engineering curriculum.
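For illustration, a minimal sketch of the kind of SPC computation such a lab might involve: Shewhart X-bar control limits from subgrouped compression readings. The subgroup layout and the A2 constant (tabulated value for subgroups of five) are assumptions, not details taken from the lab itself:

```python
# Sketch: X-bar chart center line and 3-sigma control limits from the
# average subgroup range, the standard Shewhart X-bar/R construction.
import numpy as np

def xbar_limits(subgroups, A2=0.577):
    """subgroups: (n_subgroups, 5) array of peak-load readings.
    Returns (center line, lower control limit, upper control limit)."""
    xbar = subgroups.mean(axis=1)              # subgroup means
    rbar = np.ptp(subgroups, axis=1).mean()    # average subgroup range
    cl = xbar.mean()                           # grand mean
    return cl, cl - A2 * rbar, cl + A2 * rbar
```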
ISBN:
(Print) 9781728112138
The current tendencies in CAD, CAM and AM significantly influence the product design process, assigning the model a completely new importance. The modeling and design of systems require inter- and multi-disciplinary knowledge and capabilities beyond the simple usage of analysis and simulation methods. Although the expectations of the subjects involved differ, and they approach the problems from different points of view, practice-oriented and harmonized methods for streamlining concurrent development are required. These methods must rest on theoretical backgrounds, but even more on knowledge about the applicability of a methodology in a particular situation, in order to use the modeling tools appropriately, assure effectiveness and competitiveness, and support adequate strategic decisions. The methodology used in software engineering is analyzed in terms of its feasibility and applicability for the modeling and design of mechatronic systems. This paper comprises these ideas and observes the task of modeling in a mechatronic project across different abstraction levels, bearing in mind the model as part of the virtual engineering space.