Construction has suffered from stagnant productivity for decades, with only 1% growth over the past twenty years compared to 3.6% growth in the manufacturing industry. One of the main causes of this problem is that the industry still relies on manual, subjective methods for modeling and managing its construction processes, which are error-prone and time-consuming. To tackle this challenge, process mining has proven to be a game-changer in managing processes effectively, providing automation capabilities for industries such as manufacturing, banking, and health care. These industries have adopted the eXtensible Event Stream (XES) and object-centric event log (OCEL) standards to semantically structure event logs that record timestamped human-machine interactions (HMI) associated with the real execution of business processes. However, one major factor preventing broader process mining adoption in the construction industry is the lack of a domain-specific framework to facilitate the data integration needed to generate event logs from generic data sources (i.e., relational databases). Thus, this work aims to enable process mining capabilities in construction organizations through an event log generation framework for construction processes. The main contributions of this research are (i) an extended event log architecture that facilitates the extraction, transformation, loading (ETL), and querying of construction data; and (ii) a process mining use case on the construction change management process. In conclusion, this study facilitates event log generation that enables process mining analysis to produce data-driven, actionable process performance insights, supporting construction companies in strategic decision-making across the project lifecycle.
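As a minimal sketch of the ETL "transform" step described above (the table, field, and activity names here are purely illustrative assumptions, not taken from the paper's framework), rows from a relational database can be grouped into time-ordered traces keyed by case identifier, which is the basic shape an XES event log expects:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical rows as they might come out of a project-management database
# (all names and values are illustrative, not from the paper).
change_order_rows = [
    {"case_id": "CO-101", "activity": "Submit Change Request", "timestamp": "2023-03-01T09:00:00", "resource": "Site Engineer"},
    {"case_id": "CO-101", "activity": "Approve Change Order",  "timestamp": "2023-03-04T11:15:00", "resource": "Owner Rep"},
    {"case_id": "CO-101", "activity": "Review Change Request", "timestamp": "2023-03-02T14:30:00", "resource": "Project Manager"},
    {"case_id": "CO-102", "activity": "Submit Change Request", "timestamp": "2023-03-03T08:45:00", "resource": "Site Engineer"},
]

def build_event_log(rows):
    """Group timestamped events into traces keyed by case id (the 'transform' step)."""
    log = defaultdict(list)
    for row in rows:
        event = dict(row, timestamp=datetime.fromisoformat(row["timestamp"]))
        log[row["case_id"]].append(event)
    # XES requires events within a trace to be time-ordered.
    for trace in log.values():
        trace.sort(key=lambda e: e["timestamp"])
    return dict(log)

log = build_event_log(change_order_rows)
print(len(log), len(log["CO-101"]))  # → 2 3
```

A real implementation would serialize these traces to XES or OCEL rather than keep them as dictionaries; the grouping-and-sorting step is the same either way.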
The aluminum foil mill is an important industrial production equipment. To reduce operation and maintenance costs and prevent breakdowns in the rolling mill, it is necessary to analyze and predict the data of differen...
ISBN: (Print) 9798350362442; 9798350362435
The Production Logistics system is generally a large-scale complex system with various operational phases and management levels that must integrate. In the specific context of soft and deformable food products, the core of the AGILEHAND European project, this complexity increases further due to challenges related to the handling and movement of such items. Efficient coordination of production and logistics phases becomes crucial to ensure product quality, prevent losses, and optimize the entire process. This article focuses on a data-driven framework for the automated generation of simulation models, serving as the foundation for digital twins in intelligent factories within the aforementioned sector. The proposed framework is a multi-layered, data-driven system designed for real-time/near-real-time simulation, planning, and synchronization of production and logistics systems during line reconfiguration. The digital model forms the basis for a digital twin with simulation and optimization capabilities, designed to facilitate decision-making at various management levels of the production and logistics process and to control activities such as changes, maintenance, quality, and safety. Exploiting information provided by the Enterprise Traceability system, the digital twin aims to establish a real-time/near-real-time information flow. This flow enables accurate capture of the dynamics occurring in the physical layer and effective assessment of their negative effects on the overall operational state of the system in the digital layer. In this context, the digital twin is intended to simplify and expedite the reconfiguration of production and logistics systems, achieved through the early detection of flaws in the system design or process sequence via cross-sectional simulation.
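The kind of simulation model such a framework generates can be illustrated with a minimal discrete-event sketch (the job arrivals, processing time, and station count below are invented parameters, not from the AGILEHAND project): batches queue for identical handling stations, and the model reports when each batch completes.

```python
import heapq

def simulate_line(arrivals, process_time, n_stations):
    """Minimal discrete-event sketch: jobs arrive at the given times and
    queue for identical stations; returns each job's completion time.
    All parameters are illustrative."""
    # Each station is free at time 0; a min-heap tracks when stations free up.
    free_at = [0.0] * n_stations
    heapq.heapify(free_at)
    completions = []
    for arrival in arrivals:
        # A job starts when it has arrived AND the earliest-free station is idle.
        start = max(arrival, heapq.heappop(free_at))
        done = start + process_time
        completions.append(done)
        heapq.heappush(free_at, done)
    return completions

# Four product batches arriving one time-unit apart, two handling stations.
print(simulate_line([0, 1, 2, 3], process_time=5, n_stations=2))  # → [5.0, 6.0, 10.0, 11.0]
```

A digital twin would replace the fixed arrivals and processing time with live traceability data, but the event-driven core is the same.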
ISBN: (Print) 9780791888216
Accurate modeling is the basis for analyzing the dynamic response characteristics of a model. However, due to the complexity and time-varying nature of the internal mechanisms of the reactor, it is inevitable that inaccurate model parameters will be used in the modeling process, leading to discrepancies between the modeled mechanisms and the actual reactor. In this paper, this difference is evaluated and reduced by means of neural network hybrid modeling. Based on the MATLAB/Simulink simulation platform, the paper first identifies, through sensitivity analysis, the parameters with the greatest influence on the linear model and takes them as the targets of neural network correction. It then obtains the data required for offline training of the neural network from the mechanism model, the linear model, and the deviation between the two at different operating levels, retaining the weights and thresholds obtained from offline training. Finally, a gradient descent algorithm updates the neural network weights and thresholds in real time to achieve online calibration of the linear model. The results show that the hybrid model effectively reduces the steady-state deviation between the two models, indicating that hybrid modeling can improve the accuracy of the established model and provide a solid foundation for the subsequent design of a control system based on the linear model.
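The online-calibration idea can be reduced to a toy example (a single-gain linear model and a stand-in "plant" function, both invented here; the paper uses a full neural network in Simulink): gradient descent shrinks the deviation between the linear model's output and the plant's response.

```python
# Illustrative sketch of online correction by gradient descent: a linear
# model y = k*u starts with the wrong gain k, and k is updated from the
# observed deviation between the plant and the model.

def plant(u):
    # Stand-in for the mechanism model / real reactor response (assumed gain 2.0).
    return 2.0 * u

def calibrate_gain(k, inputs, lr=0.05, epochs=200):
    """Gradient descent on the squared deviation e**2 where e = k*u - plant(u)."""
    for _ in range(epochs):
        for u in inputs:
            e = k * u - plant(u)
            k -= lr * 2 * e * u   # d(e**2)/dk = 2*e*u
    return k

k = calibrate_gain(k=1.0, inputs=[0.5, 1.0, 1.5])
print(round(k, 3))  # → 2.0 (converges to the plant's true gain)
```

The paper's scheme does the analogous update on the network's weights and thresholds at every sample time, so the correction tracks the time-varying reactor instead of a fixed gain.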
Simulation modeling of production activities improves the effectiveness of management decisions: it allows the quality of a decision to be evaluated on the model, and it also serves as a means of training employees. ...
Process control has been established as a core course in the training of chemical engineers. Very often, it is the only course dealing with the analysis of transient (time-dependent) phenomena and conditions. It relies on difficult concepts requiring intensive mathematical approaches and simulations based on differential equations and the Laplace transform. It is commonly criticized for its level of abstraction and mathematical involvement, in contrast to other courses in the career, and for its restricted applicability to industrial jobs. This criticism generally hurts student motivation. However, combining the course with hands-on experiments has been shown to enrich student learning and motivation, yet most colleges face severe restrictions on the investment, maintenance, and operation of process control labs and on the addition of new requirements to the curriculum. Some alternatives have explored simple modules for classroom demonstrations, theoretical simulations of equipment in the unit operations lab, and virtual-lab simulations. This paper describes the scope of technical training based on process modeling and the synthesis of PID controllers for six experimental set-ups with liquid level and temperature control, using lab equipment fully automated for data acquisition, handling of manipulated and disturbance variables, and selection of PID controller parameters. MATLAB codes and Simulink graphical simulations support the processing of data and analysis of results. In addition, the course develops a unique experience in team skills and performance, where every team is a combination of two sub-teams. The "office" sub-team oversees research on industrial applications, instrumentation characteristics, and computational modeling. The "lab" sub-team oversees elaborating and testing experimental plans, collecting data, and analyzing results. Every team is assigned two sequential projects: one for process modeling (open-loop) and one for co
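The PID synthesis the course centers on can be sketched in a few lines (the tank model, gains, and setpoint below are illustrative choices, not the paper's experimental values): a discrete PID regulates the level of a first-order tank toward its setpoint.

```python
def pid_step(error, state, kp, ki, kd, dt):
    """One discrete PID update; state carries (integral, previous error)."""
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

def simulate(setpoint=1.0, steps=500, dt=0.1):
    """First-order tank: level rises with inflow u and drains at 0.5*level."""
    level, state = 0.0, (0.0, 0.0)
    for _ in range(steps):
        u, state = pid_step(setpoint - level, state, kp=2.0, ki=0.5, kd=0.1, dt=dt)
        level += (u - 0.5 * level) * dt
    return level

print(round(simulate(), 3))  # → 1.0 (level settles at the setpoint)
```

The integral term is what removes the steady-state offset a proportional-only controller would leave, which is exactly the open-loop versus closed-loop contrast the two sequential student projects are built around.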
With the rapid increase in the complexity of electronics, the cost of the functional testing process used to ensure product functionality continues to rise. Optimization modeling based on reliability analysis is an ef...
ISBN: (Print) 9781510672178; 9781510672161
With the increasing complexity of 3-D semiconductor structures, optical critical dimension (OCD) metrology has become a popular solution due to its accuracy and fast inference time. Machine learning has been widely adopted in this field to further improve the efficiency and precision of OCD metrology. Especially for high-aspect-ratio structures such as DRAM and VNAND, where the computing power required for physical modeling increases exponentially, machine learning with reference data is crucial. However, one significant challenge for machine learning-based metrology under rapidly changing process conditions is the limited availability of labeled data, which causes overfitting and decreases recipe reliability in the manufacturing process as the cost of wafer consumption increases. To deploy machine learning algorithms in mass production, robust algorithms that can be optimized with few-shot data are required. In this paper, we propose a few-shot machine learning algorithm that includes (i) data augmentation based on wafer-level statistical information and (ii) anomaly detection to automatically remove data with measurement errors. The proposed algorithm shows superior accuracy, repeatability, and in-wafer uniformity compared to the benchmark algorithm in tests with manufacturing-phase data. Moreover, this robustness can be sustained with a minimal amount of metrology data, as only nine reference training samples on three design of experiment (DoE) wafers are used. The proposed solution is expected to contribute to reducing measurement costs and improving production yields for highly complicated 3-D semiconductor structures.
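The two ingredients, anomaly removal and statistics-based augmentation, can be sketched generically (the median/MAD outlier rule and Gaussian resampling below are common stand-ins chosen for illustration; the paper's actual algorithms are not specified here, and the CD values are invented):

```python
import random
import statistics

def remove_outliers(values, z_max=3.0):
    """Drop points far from the median, scaled by the median absolute
    deviation (a simple stand-in for the paper's anomaly-detection step)."""
    med = statistics.median(values)
    mad = statistics.median([abs(v - med) for v in values])
    cutoff = z_max * 1.4826 * mad   # 1.4826 rescales MAD to a normal sigma
    return [v for v in values if abs(v - med) <= cutoff]

def augment(values, n_new, seed=0):
    """Synthesize extra samples from wafer-level mean/std (illustrative only)."""
    rng = random.Random(seed)
    mu, sd = statistics.mean(values), statistics.stdev(values)
    return values + [rng.gauss(mu, sd) for _ in range(n_new)]

cd_nm = [50.1, 49.8, 50.3, 50.0, 49.9, 62.0]   # last point: a measurement error
clean = remove_outliers(cd_nm)
train = augment(clean, n_new=20)
print(len(clean), len(train))  # → 5 25
```

Removing the erroneous point before fitting is what keeps a model trained on only a handful of reference samples from being dominated by a single bad measurement.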
ISBN: (Print) 9781510684607; 9781510684614
For applications that require precision with a resolution of a few micrometers, determining the exact position of the workpiece relative to the processing laser beam in the XY-plane is a challenge currently solved only with measurement systems that are time-consuming, complex to integrate, and expensive. Here, we address the challenge of automatically determining the position and orientation of a workpiece in the ultrashort pulse laser machine with no movement of the workpiece and no impairment of the processing optic. Our approach evaluates spatially resolved secondary process emissions while the laser passes over the surface, either during processing or in a separate measurement step with fluence below the ablation threshold. The measurement does not affect the duration of processing, as it happens in parallel to it. We thoroughly examine the secondary emission-based position detection of a Philips Metamorpha project workpiece. We divide the process into two steps, a coarse and a fine position detection, and achieve a precision of up to 2.4 µm. Moreover, we develop an automation algorithm and implement it in the FPGA of our edge device, which enables analysis of the data while it is being collected with a temporal resolution of 10 µs.
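The coarse step of such a position detection can be illustrated with a one-dimensional cross-correlation (an assumed simplification of emission-map matching; the reference and signal profiles below are invented): the shift that maximizes the correlation is the coarse position offset.

```python
def estimate_shift(reference, signal):
    """Coarse offset estimate: the integer shift maximizing the correlation
    between an emission signal and a reference profile (1-D sketch)."""
    n = len(reference)
    best_shift, best_score = 0, float("-inf")
    for shift in range(-(n - 1), n):
        # Correlate signal[i] against reference[i - shift] over the overlap.
        score = sum(signal[i] * reference[i - shift]
                    for i in range(n) if 0 <= i - shift < n)
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

ref = [0, 0, 1, 5, 1, 0, 0, 0]   # reference emission peak at index 3
sig = [0, 0, 0, 0, 1, 5, 1, 0]   # same peak displaced by +2 samples
print(estimate_shift(ref, sig))  # → 2
```

A fine step would then interpolate the correlation peak to sub-sample precision; on an FPGA the correlation sums can be accumulated as samples arrive, matching the in-parallel measurement described above.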
A Petri net is a mathematical model for representing parallel, asynchronous, and distributed systems, and Petri nets can model parallel and synchronous activities in manufacturing systems at various levels of abstraction. In this study, we propose data-driven modeling and scheduling for cellular manufacturing systems using process mining with Petri nets. In the proposed method, event log data is extracted from a virtual plant, and a Petri net model capturing the movement of products and operators is then discovered from the log using process mining techniques. We also derive an approximate scheduling solution for the discovered Petri net model using a local search method built on a Petri net simulator. The proposed method includes analysis and modification of the model, and near-optimal schedules are derived using Petri net simulations. The validity of the proposed model is evaluated.
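The Petri net semantics underlying such a simulator is just the token game (the places, transitions, and markings below form a toy manufacturing cell invented for illustration): a transition is enabled when its input places hold enough tokens, and firing it moves tokens from inputs to outputs.

```python
def enabled(marking, pre):
    """Transitions whose input places all hold enough tokens."""
    return [t for t, needs in pre.items()
            if all(marking.get(p, 0) >= n for p, n in needs.items())]

def fire(marking, pre, post, t):
    """Fire transition t: consume tokens from inputs, produce on outputs."""
    m = dict(marking)
    for p, n in pre[t].items():
        m[p] -= n
    for p, n in post[t].items():
        m[p] = m.get(p, 0) + n
    return m

# Toy cell: a product waits in a queue, one machine processes it, it leaves.
pre  = {"start": {"queue": 1, "machine_free": 1}, "finish": {"busy": 1}}
post = {"start": {"busy": 1}, "finish": {"done": 1, "machine_free": 1}}
m0 = {"queue": 2, "machine_free": 1, "busy": 0, "done": 0}

m1 = fire(m0, pre, post, "start")
m2 = fire(m1, pre, post, "finish")
print(enabled(m2, pre), m2["done"])  # → ['start'] 1
```

A simulator for scheduling repeatedly picks among the enabled transitions; a local search then varies those choices and keeps the firing sequences that shorten the simulated makespan.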