Based on the analysis of the current situation and management requirements of geological data, the modeling process of spatial data is elaborated. Based on the characteristics of spatial data, the logical model of geo...
ISBN (print): 9798350344172
Taint analysis of value flows, as a static analysis technique, has gained widespread application in the fields of software security and vulnerability mining. However, when dealing with complex programs, it still faces challenges in terms of precision and performance. This research proposes P-data, a parallel framework implementing dependency-aware taint analysis. P-data employs dependency modeling to capture data and control dependencies, reducing false positives compared with tools such as Clang Static Analyzer and SVF. To accelerate the analysis, P-data leverages a task-level parallel framework that introduces Preemption of Computational Resources (PCR) and Asynchronous Taint Source Registration, leading to impressive scalability and efficiency. Evaluations demonstrate P-data's ability to significantly expedite taint analysis for large programs using multi-core resources, achieving over 25X speedup on 32 cores. P-data makes notable contributions by boosting the precision, efficiency, and scalability of security-critical program analysis through advanced dependency modeling and parallelization techniques. It provides an extensible, high-performance framework that benefits the advancement of static analysis.
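A minimal sketch of the general idea (not P-data's actual implementation): forward taint propagation over a value-flow graph, with one worklist task per taint source so independent sources can be analyzed on separate cores. The graph, source, and sink names below are hypothetical.

```python
# Sketch: per-source taint propagation over a hypothetical value-flow graph,
# parallelized at the task level (one task per taint source).
from concurrent.futures import ProcessPoolExecutor

# Hypothetical value-flow graph: node -> successors via data/control dependencies.
GRAPH = {
    "read_input": ["parse", "log"],
    "parse": ["query"],
    "log": [],
    "query": ["exec_sql"],
    "exec_sql": [],
}
SINKS = {"exec_sql"}

def propagate(source):
    """Forward taint propagation from a single source node (worklist algorithm)."""
    tainted, worklist = {source}, [source]
    while worklist:
        node = worklist.pop()
        for succ in GRAPH.get(node, []):
            if succ not in tainted:
                tainted.add(succ)
                worklist.append(succ)
    return source, sorted(tainted & SINKS)

if __name__ == "__main__":
    sources = ["read_input"]
    with ProcessPoolExecutor() as pool:  # independent sources run on separate cores
        for src, hit_sinks in pool.map(propagate, sources):
            print(f"taint from {src} reaches sinks: {hit_sinks}")
```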
ISBN (print): 9798350387780; 9798350387797
With the continuous development of the Internet industry, fraud schemes have become increasingly deceptive, and telecom fraud has become commonplace, as fraudsters exploit new technological means to carry out criminal activities. A model that can accurately identify fraudulent phone numbers is therefore in demand, since it can fundamentally avoid the risk of fraud. However, customer data are highly privacy-sensitive. To prevent the privacy of each telecom operator's customers from being leaked while modeling whether an unknown phone number is a telecom fraud number, vertical federated learning is adopted: the Secure Boosting Tree (SBT) algorithm is introduced to handle the phone number data, and the optimal model is obtained through extensive comparative experiments. Comparison with the results of the traditional centralized learning method verifies the effectiveness of federated learning.
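A minimal sketch of the split-finding step that SecureBoost-style trees such as SBT distribute across parties in the vertical setting: the label-holding party computes per-sample gradients, and the other party scores candidate splits on its own features using only aggregated gradient and Hessian sums. The data are synthetic, and the homomorphic encryption used in a real deployment is omitted.

```python
# Sketch of vertical-federated split finding (simplified: real SBT encrypts the
# gradients before they leave the label-holding party).
import numpy as np

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)            # labels held by party A only
xa = rng.normal(size=n)              # party A feature (e.g. call count)
xb = rng.normal(size=n) + y          # party B feature (e.g. call duration)

# Party A: per-sample gradients/Hessians of the logistic loss at the prior prediction.
p = np.full(n, y.mean())
grad, hess = p - y, p * (1 - p)

def best_split_gain(x, grad, hess, lam=1.0):
    """Scan thresholds on one feature using only gradient/Hessian sums;
    raw labels never leave party A."""
    order = np.argsort(x)
    g, h = grad[order], hess[order]
    G, H = g.sum(), h.sum()
    gl, hl = np.cumsum(g)[:-1], np.cumsum(h)[:-1]
    gain = gl**2 / (hl + lam) + (G - gl)**2 / (H - hl + lam) - G**2 / (H + lam)
    return gain.max()

print("best gain on party B's feature:", best_split_gain(xb, grad, hess))
print("best gain on party A's feature:", best_split_gain(xa, grad, hess))
```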
Recent research has increasingly focused on integrated statistical process control (SPC) and maintenance strategies in production systems, highlighting their critical role in quality control. Most integrated modeling ...
This paper proposes a novel robust control method for autonomous underwater vehicle (AUV) trajectory tracking, specifically addressing challenges arising from sudden changes in desired trajectories and significant mod...
ISBN (print): 9798350373981; 9798350373974
The key idea behind this study is to integrate a moving window dynamic PCA (MW-DPCA) methodology for fault detection within the Tennessee Eastman process (TEP) into a low-computational-power system, the Raspberry Pi 4 board, for real-time application. Indeed, the paramount importance of real-time fault detection (FD) in intricate industrial processes presents a critical challenge. Various data-driven techniques have been developed to ensure safety, maintain operational stability, and optimize productivity in such processes. Principal Component Analysis (PCA) is a fundamental data-driven technique that utilizes dimensionality reduction to extract the most informative features from high-dimensional data, simplifying analysis and potentially revealing underlying fault patterns. However, PCA primarily focuses on static relationships and may miss crucial temporal dynamics for fault identification. This is where dynamic PCA (DPCA) excels. By incorporating lagged values of variables, DPCA captures the temporal evolution of features, enabling a more comprehensive understanding of process behavior and improving the detection of faults involving dynamic changes. To address stochastic measurements, a moving average filter is also employed. The results obtained and the successful realization of this implementation demonstrate the adaptability of the approach and pave the way for its seamless integration into practical industrial applications.
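A minimal sketch of the MW-DPCA idea on synthetic data (the TEP dataset, the moving average filter, and the Raspberry Pi deployment are not reproduced here): each sample is augmented with lagged copies, a PCA model is refit on a sliding window, and the squared prediction error (SPE) of each new sample serves as the fault indicator.

```python
# Sketch: dynamic PCA via lag augmentation + moving-window refitting, with SPE monitoring.
import numpy as np

def lagged(X, lags=2):
    """Stack each row with its previous `lags` rows (dynamic PCA augmentation)."""
    return np.hstack([X[lags - k : len(X) - k] for k in range(lags + 1)])

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
X[300:, 0] += 4.0                      # injected sensor fault on variable 0

Xd = lagged(X, lags=2)
window, n_pc, spe = 100, 3, []
for t in range(window, len(Xd)):
    W = Xd[t - window : t]             # sliding window of recent data
    mu, sd = W.mean(0), W.std(0) + 1e-9
    Wn = (W - mu) / sd
    _, _, Vt = np.linalg.svd(Wn, full_matrices=False)
    P = Vt[:n_pc].T                    # retained principal directions
    x = (Xd[t] - mu) / sd
    residual = x - P @ (P.T @ x)
    spe.append(residual @ residual)    # squared prediction error (Q statistic)

# Fault enters the lagged data around monitoring index 198.
print("mean SPE before fault:", np.mean(spe[100:190]))
print("mean SPE after fault :", np.mean(spe[200:240]))
```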
This work presents a novel adaptable framework for multi-objective optimization (MOO) in metal additive manufacturing (AM). The framework offers significant advantages by departing from the traditional design of experiments (DoE) and embracing surrogate-based optimization techniques for enhanced efficiency. It accommodates a wide range of process variables such as laser power, scan speed, hatch distance, and optimization objectives like porosity and surface roughness (SR), leveraging Bayesian optimization for continuous improvement. High-fidelity surrogate models are ensured through the implementation of space-filling design and Gaussian process regression. Sensitivity analysis (SA) is employed to quantify the influence of input parameters, while an evolutionary algorithm drives the MOO process. The efficacy of the framework is demonstrated by applying it to optimize SR and porosity in a case study, achieving a significant reduction in SR and porosity levels using data from existing literature. The Gaussian process model achieves a commendable cross-validation R² score of 0.79, indicating a strong correlation between the predicted and actual values with minimal relative mean errors. Furthermore, the SA highlights the dominant role of hatch spacing in SR prediction and the balanced contribution of laser speed and power to porosity control. This adaptable framework offers significant potential to surpass existing optimization approaches by enabling a more comprehensive optimization, contributing to notable advancements in AM technology.
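A minimal sketch of the surrogate-based loop on made-up data: a Gaussian process is fit to observed (laser power, scan speed) → surface roughness points, and the next experiment is chosen by expected improvement. The full framework handles multiple objectives with sensitivity analysis and an evolutionary solver, which are omitted here; the parameter ranges and the toy roughness function are assumptions.

```python
# Sketch: Gaussian-process surrogate + expected-improvement acquisition for one objective.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(2)
# 12 space-filling samples: laser power [W] in [150, 400], scan speed [mm/s] in [400, 1200]
X = rng.uniform([150, 400], [400, 1200], size=(12, 2))
# Toy surface-roughness response with noise (assumption, not literature data)
y = 0.0001 * (X[:, 0] - 250) ** 2 + 0.002 * X[:, 1] + rng.normal(0, 0.2, 12)

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

cand = rng.uniform([150, 400], [400, 1200], size=(2000, 2))
mu, sd = gp.predict(cand, return_std=True)
sd = sd + 1e-9
best = y.min()
imp = best - mu                                   # improvement when minimizing roughness
ei = imp * norm.cdf(imp / sd) + sd * norm.pdf(imp / sd)
print("next (power, speed) to test:", cand[np.argmax(ei)])
```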
We propose to estimate the region of attraction (ROA) for the stability of nonlinear systems from only system measurement data and without knowledge of the system model. The key to our result is the use of Koopman operator theory to approximate the nonlinear dynamics in linear coordinates. This approximation is typically more accurate than the traditional Jacobian-based linearization method. We then employ the Extended Dynamic Mode Decomposition (EDMD) method to estimate the linear approximation of the system through data. This is then used to construct a Lyapunov function that helps estimate the ROA. However, this estimate is typically very conservative. The trajectory reversing method is then applied to the set of points that form this conservative estimate, to enlarge the ROA approximation. The output of EDMD is also utilized in the trajectory reversing method, keeping the entire analysis data-driven. Finally, an example is used to show the accuracy of this data-driven method, despite the system model being unknown. Copyright (c) 2024 The Authors.
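A minimal sketch, on a toy system, of the data-driven core of this pipeline: EDMD fits a Koopman-style linear map in lifted coordinates from snapshot pairs, and a discrete Lyapunov equation then yields a quadratic function whose sublevel set gives a (conservative) ROA estimate. The dictionary, the toy dynamics, and the omission of the trajectory-reversing enlargement are all simplifications.

```python
# Sketch: EDMD from snapshot pairs, then a quadratic Lyapunov certificate for the state block.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def step(x, dt=0.05):
    """Toy nonlinear system with a locally stable origin."""
    x1, x2 = x
    return np.array([x1 + dt * x2, x2 + dt * (-x1 - 0.5 * x2 + x1**3)])

def lift(x):
    """Dictionary of observables: the state plus simple monomials."""
    x1, x2 = x
    return np.array([x1, x2, x1**2, x1 * x2, x2**2])

rng = np.random.default_rng(3)
X = rng.uniform(-0.5, 0.5, size=(2000, 2))        # sampled states (measurement data)
Psi = np.array([lift(x) for x in X])
Psi_next = np.array([lift(step(x)) for x in X])

# EDMD: least-squares Koopman matrix K with Psi_next ≈ Psi @ K
K, *_ = np.linalg.lstsq(Psi, Psi_next, rcond=None)
A = K[:2, :2].T                                    # linear block acting on the state

# Quadratic Lyapunov function V(x) = x' P x from A' P A - P = -I
P = solve_discrete_lyapunov(A.T, np.eye(2))
x = np.array([0.3, -0.2])
for _ in range(5):                                 # V should decrease along the true dynamics
    print("V =", x @ P @ x)
    x = step(x)
```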
ISBN (print): 9798350338812; 9798350338829
In microelectronic packaging, the quality of wire bonding directly affects the quality and long-term reliability of microelectronic devices. To improve the reliability of wire bonding, this paper proposes a quality control method for thermal ultrasonic wire bonding based on process failure mode and effects analysis (PFMEA) and statistical process control (SPC). This method conducts PFMEA on thermal ultrasonic wire bonding and establishes failure mode evaluation indicators suited to the characteristics of wire bonding. By introducing SPC techniques into the analysis process, a quantitative analysis of the stability of bonding tension data is conducted, forming an effective method for controlling wire bonding quality. The application indicates that the weak links of the bonding process are effectively controlled, which improves the reliability of the bonding process and provides technical support for process quality control.
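A minimal sketch of the SPC step on made-up pull-test values: an individuals/moving-range (I-MR) control chart is built from a baseline phase of bond tension measurements, and new samples outside the 3-sigma limits are flagged. The PFMEA scoring part of the method is not shown.

```python
# Sketch: I-MR control chart limits for bond pull-tension data (values are illustrative).
import numpy as np

baseline = np.array([8.1, 8.3, 7.9, 8.2, 8.0, 8.4, 8.1, 8.2, 8.3, 8.0])  # grams-force
new = np.array([8.2, 7.9, 6.8, 8.1])            # monitored samples, one suspicious value

mr = np.abs(np.diff(baseline))                  # moving ranges of the baseline phase
center = baseline.mean()
sigma_hat = mr.mean() / 1.128                   # d2 constant for subgroup size 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

print(f"center={center:.2f} gf, UCL={ucl:.2f}, LCL={lcl:.2f}")
for i, t in enumerate(new):
    flag = "OUT OF CONTROL" if not (lcl <= t <= ucl) else "within limits"
    print(f"new sample {i}: {t:.1f} gf  {flag}")
```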
ISBN (print): 9783031564468; 9783031564444
The stability of production processes is critical for companies to deliver products to customers on time and to specification. In mass production, where a large number of machines and pieces of equipment are used, processes are multistage and product variety is high, the possibilities for effective analysis depend largely on the level of implementation of information techniques that can increase the availability of process data. In turn, increased data availability provides a more complete picture of production processes, of the changes that are taking place, and of their impact on quality and stability. With the increasing number of information techniques available, the amount of data generated by the manufacturing industry (process records, events, images, parameters) is expected to grow exponentially. However, access to data alone is not sufficient. To make the data useful for Root Cause Analysis (RCA), it must be analyzed, interpreted and visualized. This paper presents a comparison, evaluation and selection of the type of control charts used to monitor and control the quality performance of manufactured products. It was assumed that the values monitored on the control charts should be representable as fractions of non-conforming products in the whole population of manufactured products.
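A minimal sketch, with illustrative subgroup counts, of the kind of chart this assumption points to: a p-chart monitoring the fraction of non-conforming products per production lot, with out-of-limit lots flagged for RCA. The lot size and defect counts are made up.

```python
# Sketch: p-chart for the fraction of non-conforming products per lot.
import numpy as np

n = 500                                                # items inspected per lot
defects = np.array([12, 9, 14, 11, 10, 13, 27, 12])    # non-conforming items per lot
p = defects / n
p_bar = p.mean()
sigma = np.sqrt(p_bar * (1 - p_bar) / n)
ucl, lcl = p_bar + 3 * sigma, max(p_bar - 3 * sigma, 0.0)

print(f"p-bar = {p_bar:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")
for lot, frac in enumerate(p, start=1):
    status = "investigate (RCA)" if not (lcl <= frac <= ucl) else "stable"
    print(f"lot {lot}: p = {frac:.3f}  {status}")
```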