A new method for polarization dependence analysis of periodic nanostructured SERS substrates was proposed. The method combines the Fourier transform and the wave vector matching model of surface plasmon polariton (SPP...
Sensor-fault detection has proven easier through data-driven methods that rely on historical data collected from sensors placed at various locations in a process plant. Since the distribution of industrial process variables is random and non-Gaussian, the independent component analysis (ICA) method is better suited for fault detection (FD) problems. Whenever the data carry any level of noise, useful information is difficult to separate, which degrades the monitoring quality of an FD strategy. In this paper, the robustness of FD strategies is assessed for different noise realizations of sensor data using stochastic simulations. The main objective of this work is to demonstrate that ICA-based FD strategies are more robust across noise levels than principal component analysis (PCA). The ICA modeling algorithm is improved to avoid random initialization of the de-mixing orthogonal matrix during computation of the independent components. Two case studies are considered for evaluating the robustness of FD strategies: a simulated quadruple-tank process and a simulated distillation column. Comparisons have been carried out between ICA, dynamic ICA, modified ICA, and PCA strategies for different sensor noise levels. The simulation results reveal that ICA-based FD strategies outperform the PCA FD strategy in monitoring sensor faults across noise levels.
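The ICA-based monitoring scheme described above can be sketched with scikit-learn's FastICA on synthetic non-Gaussian data. This is an illustrative sketch, not the paper's method: the mixing matrix, Laplacian sources, sensor-bias fault, and 99th-percentile control limit are all assumptions made for the example. Fixing `random_state` plays a role loosely analogous to the paper's removal of random de-mixing-matrix initialization.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic fault-free "normal operation" data: mixed non-Gaussian sources.
S = rng.laplace(size=(500, 3))            # independent Laplacian sources
A = rng.normal(size=(3, 3))               # unknown mixing matrix (assumed)
X_train = S @ A.T

# Fit ICA on fault-free data; fixed random_state gives a reproducible
# de-mixing matrix instead of a randomly initialized one.
ica = FastICA(n_components=3, whiten="unit-variance", random_state=0)
ica.fit(X_train)

def i2_statistic(X):
    """I^2 monitoring statistic: squared norm of the independent components."""
    s = ica.transform(X)
    return np.sum(s**2, axis=1)

# Control limit from the empirical 99th percentile on training data.
limit = np.percentile(i2_statistic(X_train), 99)

# A sensor bias fault shifts one measured variable by a constant offset.
X_fault = X_train[:50] + np.array([4.0, 0.0, 0.0])
alarms = i2_statistic(X_fault) > limit
print(f"control limit: {limit:.2f}, alarm rate under fault: {alarms.mean():.2f}")
```

A PCA-based strategy would be structured the same way, with the T² statistic of the retained principal components in place of I²; the paper's comparison varies the sensor noise level and repeats this detection experiment for each strategy.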
To date, computer vision and neural network data analysis have been actively developing and finding more and more applications in industrial control engineering. These technologies are now used to deal with i...
Within the National Weather Service's Unified Forecast System (UFS), snow depth and snow cover observations are assimilated once daily using a rule-based method designed to correct for gross errors. While this approach improved the forecasts over its predecessors, it is now quite outdated and likely results in a suboptimal analysis. We have therefore implemented and evaluated snow data assimilation using the 2D optimal interpolation (OI) method, which accounts for model and observation errors and their spatial correlations as a function of the distances between observations and model grid cells. The performance of the OI was evaluated by assimilating daily snow depth observations from the Global Historical Climatology Network (GHCN) and Interactive Multisensor Snow and Ice Mapping System (IMS) snow cover data into the UFS, from October 2019 to March 2020. Compared to the control analysis, which is very similar to the method currently in operational use, the OI improves the forecast snow depth and snow cover. For instance, the unbiased snow depth root-mean-square error (ubRMSE) was reduced by 45 mm and the snow cover hit rate increased by 4%. This leads to modest improvements in globally averaged near-surface temperature (an average reduction of 0.23 K in temperature bias), with significant local improvements in some regions (much of Asia, the central United States). The reduction in near-surface temperature error was primarily caused by the improved snow cover fraction from the data assimilation. Based on these results, the OI DA is currently being transitioned into operational use for the UFS.

Significance Statement: Weather and climate forecasting systems rely on accurate modeling of the evolution of atmospheric, oceanic, and land processes. In addition, model forecasts are substantially improved by the continuous incorporation of observations into models, through a process called data assimilation. In this work, we upgraded the snow data assimilation used in the U.S. Na...
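The OI update described above follows the standard form xa = xb + BHᵀ(HBHᵀ + R)⁻¹(y − Hxb), with background-error covariances decaying with the distance between grid cells and observations. The sketch below is a minimal toy version, not the UFS implementation: the error variances, the exponential correlation shape, the length scale, and the co-location of observations with grid cells are all assumptions for illustration.

```python
import numpy as np

def oi_update(xb, grid_xy, y, obs_idx, sigma_b=30.0, sigma_o=40.0, L=250.0):
    """Optimal interpolation: xa = xb + B H^T (H B H^T + R)^-1 (y - H xb).

    xb       background snow depth at grid cells (mm)
    grid_xy  (n, 2) grid-cell coordinates (km)
    y        observed snow depths (mm), co-located with grid cells obs_idx
    Background errors are correlated as exp(-d / L) with distance d (km).
    """
    d = np.linalg.norm(grid_xy[:, None, :] - grid_xy[None, :, :], axis=2)
    B = sigma_b**2 * np.exp(-d / L)             # distance-based covariance
    H = np.zeros((len(obs_idx), len(xb)))
    H[np.arange(len(obs_idx)), obs_idx] = 1.0   # obs operator: select cells
    R = sigma_o**2 * np.eye(len(obs_idx))       # uncorrelated obs errors
    S = H @ B @ H.T + R                         # innovation covariance
    return xb + B @ H.T @ np.linalg.solve(S, y - H @ xb)

# Toy 1-D transect: six grid cells 100 km apart, uniform 100 mm background,
# one station reporting 160 mm at the third cell.
grid_xy = np.stack([np.arange(6) * 100.0, np.zeros(6)], axis=1)
xb = np.full(6, 100.0)
xa = oi_update(xb, grid_xy, y=np.array([160.0]), obs_idx=np.array([2]))
print(np.round(xa, 1))  # increment is largest at the observed cell and
                        # decays with distance from it
```

The spatial correlation in B is what distinguishes OI from the rule-based scheme: a single station observation spreads a smoothly decaying increment to neighboring grid cells rather than correcting only gross errors at one point.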
The environmental temperature adaptability of electric vehicles is a key pain point in their development. Both high- and low-temperature environments have a significant impact on vehicle endurance. Also, range degr...
The topic of this study is the selection of an appropriate battery size for a microgrid that employs renewable energy sources, such as solar photovoltaic (PV) systems and wind turbines. The time-series historical data...
ISBN (digital): 9798350382655
ISBN (print): 9798350382662
The development of next-generation battery management systems requires models with enhanced performance to enable advanced control, diagnostic, and prognostic techniques for improving the safety and performance of lithium-ion battery systems. Specifically, battery models must deliver efficient and accurate predictions of physical internal states and output voltage, despite the inevitable presence of various system uncertainties. To facilitate this, we propose a lightweight hybrid modeling framework that couples a high-fidelity physics-based electrochemical battery model with a computationally efficient Gaussian process regression (GPR) machine learning model to predict and compensate for errors in the electrochemical model output. This is the first time that GPR has been implemented to predict the output residual of an electrochemical battery model, which is significant for the following reasons. First, we demonstrate that GPR considerably improves output prediction accuracy, as evidenced by an observed average root-mean-square prediction error of 7.3 mV across six testing profiles, versus 119 mV for the standalone electrochemical model. Second, we employ a data sampling procedure to show how GPR can use sparse training data to deliver accurate predictions at minimal computational expense. Our framework yielded a ratio of computation time to modeled time of 0.003, indicating ample suitability for online applications.
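The residual-correction idea above — physics model plus a GPR trained on the model's voltage error — can be sketched with scikit-learn. Everything in this example is a stand-in, not the paper's electrochemical model: the linear OCV curve, the sinusoidal unmodeled dynamics, the noise level, and the 30-point sparse training set are assumptions chosen only to show the hybrid structure and the sparse-sampling step.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Stand-in "physics model": a coarse voltage-vs-state-of-charge (SOC) curve.
def physics_voltage(soc):
    return 3.0 + 1.2 * soc                             # simplified model

def true_voltage(soc):
    return 3.0 + 1.2 * soc + 0.05 * np.sin(6 * soc)    # unmodeled dynamics

soc = rng.uniform(0, 1, 400)
v_meas = true_voltage(soc) + rng.normal(0, 0.002, soc.size)  # noisy data
residual = v_meas - physics_voltage(soc)    # what the GPR learns to predict

# Sparse sampling: train the GPR on a small subset to keep inference cheap.
idx = rng.choice(soc.size, 30, replace=False)
gpr = GaussianProcessRegressor(
    kernel=RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-5),
    normalize_y=True,
)
gpr.fit(soc[idx, None], residual[idx])

# Hybrid prediction = physics-model output + learned residual correction.
soc_test = np.linspace(0, 1, 200)
v_hybrid = physics_voltage(soc_test) + gpr.predict(soc_test[:, None])
rmse_phys = np.sqrt(np.mean((true_voltage(soc_test) - physics_voltage(soc_test))**2))
rmse_hyb = np.sqrt(np.mean((true_voltage(soc_test) - v_hybrid)**2))
print(f"physics-only RMSE: {rmse_phys*1e3:.1f} mV, hybrid RMSE: {rmse_hyb*1e3:.1f} mV")
```

Because the GPR only has to capture the (typically smooth and small) residual rather than the full cell dynamics, a few dozen training points suffice, which is what keeps the computation-to-modeled-time ratio low in the paper's framework.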
Additive manufacturing (AM) for metals is rapidly transitioning to an accepted production technology, which has led to increasing demands for data analysis and software tools. The performance of laser-based powder bed fusion of metals (PBF-LB/M), a common metal AM process, depends on the accuracy of data analysis. Advances in data acquisition and analysis are being propelled by an increase in new types of in situ sensors and ex situ measurement devices. Measurements taken with these sensors and devices rapidly increase the volume, variety, and value of PBF-LB/M data while simultaneously decreasing its veracity. The number of new, data-driven software tools capable of analyzing, modeling, simulating, integrating, and managing that data is also increasing; however, the capabilities and accessibility of these tools vary greatly. Issues associated with these software tools are impacting the ability to manage and control PBF-LB/M processes and to qualify the resulting parts. This paper investigates and summarizes the available software tools and their capabilities. The findings are then used to derive a set of functional requirements for tools that are mapped to PBF-LB/M lifecycle activities. These activities include product design, design analysis, process planning, process monitoring, process modeling, process simulation, and production management. PBF-LB/M users can benefit from tools that implement these functional requirements by (1) shortening the lead time of developing these capabilities, (2) adopting emerging, state-of-the-art PBF-LB/M data and data analytics methods, and (3) enhancing the previously mentioned AM product lifecycle activities.
The development process of a diesel electric control system includes multiple steps: analysis and design, modeling and simulation, code generation, and real-time testing. With the development of modern technology, ...
ISBN (print): 9781510662452
The proceedings contain 48 papers. The topics discussed include: models for forecasting indicators based on neural networks, regression analysis, and big data; data mining for public channels and groups in the Telegram messenger; two-step intelligent approach for photo image fragment forgery detection and identification; computer-controlled active antenna with metamaterial; using virtual antenna array technology to analyze the electromagnetic environment; solving the handwriting recognition problem using convolutional neural networks; development of a model for analyzing the emotional sentiment of textual assessments of citizens of the digital city; software development of the surface vehicle traffic control system; modular architecture of an automated process control system in greenhouse complexes; and automation of diagnosis, stratification, and treatment of the paroxysmal sympathetic hyperactivity syndrome in the smart ward environment.