In addition to traditional clinical research, advances in information and communication technologies facilitate new medical research using Internet of Things devices and other cutting-edge technologies. Such medical research also simplifies the collection of data on research subjects in their daily lives across countries. In this context, medical research is increasingly required to comply with rules protecting patients' personal data. This study proposes a model that enables researchers and other stakeholders in such international medical research, including ethics committees, to easily verify whether the planned processing of patient data complies with the relevant legal and ethical rules. The proposed model consists of (1) how patient information is processed, (2) the rules relevant to that processing, and (3) the analysis of whether the processing complies with those rules. This study suggests that the model should describe the aspects of data processing that are subject to many rules, such as the location of the processing, the categories of data, the purposes of the processing, and the storage period. Using the information described in the model as a guide, stakeholders can determine which national and international legal and ethical rules apply to the planned processing. They can then use the model to verify and document whether the processing complies with those specific regulatory rules. The use of this model enables stakeholders in medical research to comply with the rules governing patient data more effectively than they could without it.
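To make the three-part structure concrete, the following is a minimal Python sketch of how a processing description, a set of rules, and a compliance check could fit together; the class names, fields, and the example rule are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the three-part structure described above: a description of
# how patient data is processed, a set of rules, and a compliance check.
# All class, field, and rule names here are illustrative, not the paper's model.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ProcessingActivity:
    location: str                 # where the processing takes place
    data_categories: List[str]    # e.g. ["health data", "contact data"]
    purposes: List[str]           # e.g. ["remote monitoring study"]
    storage_period_days: int      # planned retention period

@dataclass
class Rule:
    name: str                                          # human-readable rule label
    applies_to: Callable[[ProcessingActivity], bool]   # does this rule cover the activity?
    is_satisfied: Callable[[ProcessingActivity], bool] # does the activity comply?

def check_compliance(activity: ProcessingActivity, rules: List[Rule]) -> List[str]:
    """Return findings documenting which applicable rules are violated."""
    findings = []
    for rule in rules:
        if rule.applies_to(activity) and not rule.is_satisfied(activity):
            findings.append(f"Violation of {rule.name}")
    return findings

# Example: a hypothetical storage-limitation rule for EU-based processing.
storage_rule = Rule(
    name="storage limitation (illustrative)",
    applies_to=lambda a: a.location == "EU",
    is_satisfied=lambda a: a.storage_period_days <= 3650,
)
activity = ProcessingActivity("EU", ["health data"], ["remote monitoring study"], 5000)
print(check_compliance(activity, [storage_rule]))  # -> ['Violation of storage limitation (illustrative)']
```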
ISBN: (Print) 9781538637906
As the Internet of Vehicles (IoV) flourishes and sensor-generated data becomes ubiquitous, IoV applications with widely different performance requirements have emerged. Hence, different distributed data processing system (DDPS) clusters will coexist to meet these requirements, e.g., a stream processing cluster for real-time tasks and a batch processing cluster for statistics-based data mining tasks. However, maintaining several system clusters is neither economical nor convenient, as developers and administrators have to be familiar with all of these DDPSs, and deploying multiple DDPSs wastes resources compared with deploying a single one. Based on these observations, this paper proposes TDAG as a solution. TDAG allows users to adjust data processing from the streaming style to the batch style by encapsulating the input data with specific packing strategies. We have implemented TDAG in a prototype called TStream. Experimental tests show that TStream is both effective and efficient.
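The packing-strategy idea can be illustrated with a small sketch: the same record stream is either forwarded record by record (streaming style) or grouped into fixed-size batches (batch style), so the downstream operator is unchanged. The function names below are assumptions for illustration, not TStream's actual API.

```python
# Illustrative sketch of the "packing strategy" idea: the same record stream can
# be handed to a processor one record at a time (streaming style) or packed into
# fixed-size batches (batch style).
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def pack_by_count(records: Iterable[T], batch_size: int) -> Iterator[List[T]]:
    """Group an unbounded record stream into batches of at most batch_size records."""
    batch: List[T] = []
    for record in records:
        batch.append(record)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:                      # flush the trailing partial batch
        yield batch

def pack_streaming(records: Iterable[T]) -> Iterator[List[T]]:
    """Degenerate strategy: every record is its own 'batch', i.e. streaming style."""
    for record in records:
        yield [record]

# The downstream operator only ever sees batches, so switching processing styles
# is just a matter of choosing the packing strategy.
sensor_readings = range(10)
for batch in pack_by_count(sensor_readings, batch_size=4):
    print(batch)                   # [0, 1, 2, 3]  [4, 5, 6, 7]  [8, 9]
```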
Big data management systems are in demand today in almost all industries and also serve as a foundation for training artificial intelligence. The use of heterogeneous polystores in big data systems means that tools within the same system have different data granularity and different access control models. Harmonizing these components and implementing a common access policy are currently done by the security administrator by hand. This leads to a growing number of vulnerabilities, which in turn become frequent causes of data leaks. The current state of automation and analysis of access control in big data systems reveals a lack of automated solutions for polystore-based systems. This paper addresses the problem of automated access control analysis in big data management systems. We formulate and discuss the main contradiction between the requirement for scalable and flexible access control and the increased workload on the security administrator, aggravated by the use of different data and access control models in system components. To solve this problem, we propose a new automated method for analyzing security policies based on a graph model, which reduces the number of potential vulnerabilities caused by incorrect management of big data systems. The proposed method uses the data lifecycle model of the system, its current settings, and the required security policy. The use of two-pass analysis (from data sources to data receivers and back) allows us to solve two problems: analyzing the access control system for potential vulnerabilities and checking for business logic vulnerabilities. As an example, we describe the use of a prototype tool developed for security policy analysis in a big data management system.
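As a rough illustration of the two-pass idea, the sketch below runs a forward pass that propagates data sensitivity from sources through a data lifecycle graph and flags components whose clearance is too low, plus a backward-style check that every receiver is actually fed by some source. The graph representation, levels, and function names are assumptions for illustration, not the paper's model.

```python
# Very simplified sketch of a two-pass check over a data lifecycle graph.
# Nodes are system components, edges are data flows; each node has a clearance
# level and each source has a data sensitivity level.
from collections import deque
from typing import Dict, List, Set

Graph = Dict[str, List[str]]   # adjacency list: component -> downstream components

def forward_pass(graph: Graph, sources: Dict[str, int], clearance: Dict[str, int]) -> List[str]:
    """Propagate data sensitivity from sources; flag components with insufficient clearance."""
    findings, level = [], dict(sources)
    queue = deque(sources)
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            propagated = max(level.get(nxt, 0), level[node])
            if clearance.get(nxt, 0) < propagated:
                findings.append(f"{nxt}: clearance {clearance.get(nxt, 0)} < data level {propagated}")
            if propagated > level.get(nxt, 0):
                level[nxt] = propagated
                queue.append(nxt)
    return findings

def backward_pass(graph: Graph, sources: Dict[str, int], receivers: Set[str]) -> List[str]:
    """Business-logic check: every declared receiver must be reachable from some source."""
    reachable, queue = set(sources), deque(sources)
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in reachable:
                reachable.add(nxt)
                queue.append(nxt)
    return [f"{r}: receiver never fed by any source" for r in receivers - reachable]

graph = {"ingest": ["staging"], "staging": ["analytics"], "analytics": []}
print(forward_pass(graph, sources={"ingest": 3}, clearance={"staging": 3, "analytics": 1}))
print(backward_pass(graph, sources={"ingest": 3}, receivers={"analytics", "report"}))
```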
Ecological vulnerability refers to the degree of ecosystem disturbance, the extent of system damage, and the system's ability to recover. Although case-specific evaluations of ecological vulnerability are progressing rapidly, they have been carried out mainly in areas with intensive human activities or in harsh natural environments. Using the Web of Science™ core collection, this review paper summarized studies on ecological vulnerability published from 2000 to 2022 and analyzed the major case studies in depth. It was found that traditional ecological vulnerability research has addressed largely assessment models, data processing models, and the analysis of influencing factors; however, research on the process of vulnerability transformation has been lacking. Because vulnerability transformation in a hierarchical vulnerability index system is regulated by multiple factors in a heterogeneous region, it is urgent to understand how the ecological vulnerability of a region evolves from one level to another over time. Therefore, this paper puts forward a new research perspective, i.e., applying quantitative analysis to the identification of regulating factors and exploring the mechanisms of ecological vulnerability transformation. This new perspective could assist in monitoring the complex spatiotemporal changes in ecological vulnerability and in taking the measures necessary to prevent a decline in ecological stability.
ISBN: (Print) 9781467329644; 9781467329637
With the development of the Internet of Things (IoT), more and more devices have been connected to the network. These massive and varied devices have given rise to many kinds of applications, but they also bring new challenges for network maintenance and management. One significant difference between the Internet and the Internet of Things is that embedded devices are a major category of devices in the IoT; they have a single function, limited performance, and exist in huge quantities. Classified by function, they are all monitoring devices that are distributed across the network and collect data all the time. The large amount of data generated in real time should be transmitted to the collecting and computing devices as fast as possible. How to transmit these data efficiently and how to meet the needs of a variety of different applications are the new challenges that arise. In this paper, the data transmission characteristics and the requirements of different applications in the IoT are analyzed in detail. Furthermore, a new data processing model, a message-oriented middleware data processing model, is proposed. With the new model, data transmission and processing become more convenient, efficient, easy to discover, easy to share, and secure.
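A toy sketch of the message-oriented middleware idea follows: monitoring devices publish readings to topics, and applications with different requirements subscribe to the topics they need. The Broker class and its methods are illustrative assumptions, not the interface of the proposed middleware.

```python
# Toy sketch of a message-oriented middleware: devices publish readings to
# topics, and each subscribing application receives its own copy via a queue.
from collections import defaultdict
from queue import Queue
from typing import Dict, List

class Broker:
    def __init__(self) -> None:
        self.subscribers: Dict[str, List[Queue]] = defaultdict(list)

    def subscribe(self, topic: str) -> Queue:
        """Register a consumer on a topic and return its private message queue."""
        q: Queue = Queue()
        self.subscribers[topic].append(q)
        return q

    def publish(self, topic: str, message: dict) -> None:
        """Fan the message out to every queue subscribed to the topic."""
        for q in self.subscribers[topic]:
            q.put(message)

broker = Broker()
alerts = broker.subscribe("temperature")    # e.g. a real-time alerting application
archive = broker.subscribe("temperature")   # e.g. a long-term storage application

broker.publish("temperature", {"device": "sensor-42", "value": 21.5})
print(alerts.get())    # both consumers receive the same reading independently
print(archive.get())
```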
A new type of data processing model for the ball-end milling process, based on biharmonic spline interpolation (BSI), is presented for the first time. The model aims to solve the problem of determining the number of discrete points in the conventional machining topography simulation method. The kinematic model of ball-end milling is established with the tool vibration taken into account, and the sweeping point cloud of the cutting edge is generated and extracted based on the servo bounding box. Moreover, the BSI method is introduced to reconstruct the machined surface. In view of the scarcity of prior research on topographical prediction for free-form workpiece surfaces, this study evaluates the prediction model of surface topography for free-form surface milling and extends the application of BSI to point clouds. The computation error caused by the large-scale use of analytic nonlinear equations in other simulation processes is significantly reduced, and the complex optimization operators that ensure that each microelement of the cutting edge sweeps no more than one grid point within a unit time step are avoided. Correspondingly, this method is superior to conventional approximation-based processing of discrete data (such as with IMAGEWARE), thereby simplifying data manipulation, improving the calculation accuracy by 7.6%, and reducing the large computation time of the surface topography simulation by linearly expanding the valid data. The effectiveness and accuracy of the algorithm are demonstrated by comparing computer simulation results with machining results under different experimental conditions, and several topography evaluation parameters are also analyzed. The shape error (S-ba) and surface roughness (S-a) values of the simulated topographies are compared with those of the measured topographies (error within 15.9%).
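For readers unfamiliar with BSI, the sketch below shows the classical Green-function formulation of 2-D biharmonic spline interpolation (commonly attributed to Sandwell, 1987), which reconstructs a surface from scattered height samples. It illustrates only the interpolation step, not the paper's kinematic or point-cloud pipeline.

```python
# Hedged sketch of 2-D biharmonic spline interpolation: scattered heights are
# reproduced by a weighted sum of the biharmonic Green function
# g(r) = r^2 (ln r - 1), with weights found from a linear system.
import numpy as np

def _green(r: np.ndarray) -> np.ndarray:
    """Biharmonic Green function g(r) = r^2 (ln r - 1), with g(0) = 0."""
    g = np.zeros_like(r)
    mask = r > 0
    g[mask] = r[mask] ** 2 * (np.log(r[mask]) - 1.0)
    return g

def bsi_fit(points: np.ndarray, values: np.ndarray) -> np.ndarray:
    """Solve for the spline weights from scattered (x, y) points and heights."""
    r = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return np.linalg.solve(_green(r), values)

def bsi_eval(points: np.ndarray, weights: np.ndarray, query: np.ndarray) -> np.ndarray:
    """Evaluate the interpolated surface height at arbitrary query points."""
    r = np.linalg.norm(query[:, None, :] - points[None, :, :], axis=-1)
    return _green(r) @ weights

# Tiny example: interpolate a smooth height field from a few scattered samples.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(30, 2))
z = np.sin(2 * np.pi * pts[:, 0]) * np.cos(2 * np.pi * pts[:, 1])
w = bsi_fit(pts, z)
print(bsi_eval(pts, w, pts[:3]) - z[:3])   # residuals at data points are ~0
```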
ISBN: (Print) 9781509036202
With the continuous expansion of data stream applications, frequent pattern mining over data streams is becoming a hot research topic in the field of data mining, and scholars at home and abroad have put forward a large number of algorithms for mining frequent itemsets over data streams. This paper improves the related definitions of frequent itemsets and sliding windows, classifies sliding windows according to their data processing models, analyzes the use of sliding windows in typical frequent itemset mining algorithms, and summarizes the mining techniques and efficiency of typical frequent itemset mining algorithms.
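The sliding-window semantics discussed here can be sketched as follows: a count-based window retains the most recent W transactions, and frequent itemsets are recomputed as the window slides. Real algorithms maintain incremental synopses rather than recounting, so this is only an illustration of the window model; all names are illustrative.

```python
# Minimal sketch of mining frequent itemsets over a count-based sliding window:
# the window keeps the most recent W transactions, and support is recomputed as
# transactions slide in and out.
from collections import Counter, deque
from itertools import combinations
from typing import Deque, List

def frequent_itemsets(window: Deque[List[str]], min_support: int, max_size: int = 2):
    """Count all itemsets up to max_size in the window; keep those meeting min_support."""
    counts: Counter = Counter()
    for transaction in window:
        items = sorted(set(transaction))
        for k in range(1, max_size + 1):
            for itemset in combinations(items, k):
                counts[frozenset(itemset)] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

window: Deque[List[str]] = deque(maxlen=4)          # count-based window of size 4
stream = [["a", "b"], ["a", "c"], ["a", "b", "c"], ["b", "c"], ["a", "b"]]
for transaction in stream:
    window.append(transaction)                      # oldest transaction expires automatically
    print(frequent_itemsets(window, min_support=3))
```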
ISBN: (Print) 9781424473281
The characteristics of radio communication and the disturbances of the application environment inevitably lead to the unreliable phenomena of duplicated readings, false positive readings, and false negative readings when RFID is applied to a logistics tracking system, and numerous unreliable data are generated. To solve this problem, a series of evaluation indexes for RFID application reliability was put forward, and a layered data processing model was established to improve RFID application reliability, comprising a primitive event processing layer and a complex event processing layer. The former receives primitive reading events from the reader network and cleans redundant events, so that tidy logical reading events are obtained. The complex event processing layer then detects and corrects false positive and false negative readings by setting up application integrity constraints. Application results show that the model can guarantee the correctness and integrity of the data and improve RFID application reliability.
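As a simplified illustration of the primitive event processing layer, the sketch below collapses repeated reads of the same tag at the same reader within a short time window into one logical reading event. The window length and function names are assumptions for illustration, not the paper's parameters.

```python
# Rough sketch of primitive-event cleaning: within a short time window, repeated
# reads of the same tag at the same reader are collapsed into one logical
# reading event, removing duplicates.
from typing import Dict, List, Tuple

def clean_primitive_events(
    reads: List[Tuple[float, str, str]],   # (timestamp, tag_id, reader_id), time-ordered
    window: float = 2.0,                   # seconds within which repeats count as duplicates
) -> List[Tuple[float, str, str]]:
    last_seen: Dict[Tuple[str, str], float] = {}
    logical_events = []
    for ts, tag, reader in reads:
        key = (tag, reader)
        if key not in last_seen or ts - last_seen[key] > window:
            logical_events.append((ts, tag, reader))   # first read in a new window
        last_seen[key] = ts                            # extend the window on every read
    return logical_events

raw = [(0.0, "T1", "R1"), (0.5, "T1", "R1"), (1.2, "T1", "R1"),  # duplicated readings
       (0.8, "T2", "R1"), (5.0, "T1", "R1")]                     # T1 reappears later
print(clean_primitive_events(sorted(raw)))
# -> [(0.0, 'T1', 'R1'), (0.8, 'T2', 'R1'), (5.0, 'T1', 'R1')]
```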
In the era of big data, computer information processing technologies for handling large-scale data fall into two types of models: the stream processing model and the batch processing model. Computer information processing of big data involves three major tasks: data extraction, data analysis, and data interpretation. Taking Facebook's data processing as an example, this paper analyzes the application of computer information processing technology and summarizes the challenges that computer information processing technology faces in handling big data.
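A minimal sketch of the two processing models mentioned above, assuming a toy aggregation task: the batch version processes the complete dataset at once, while the stream version updates its result record by record as data arrives.

```python
# Illustrative contrast between batch and stream processing of the same data:
# both compute an average, but the stream version yields intermediate results.
from typing import Iterable, Iterator, Tuple

def batch_average(records: Iterable[float]) -> float:
    data = list(records)                 # batch model: materialize, then process
    return sum(data) / len(data)

def stream_average(records: Iterable[float]) -> Iterator[Tuple[int, float]]:
    total, count = 0.0, 0
    for value in records:                # stream model: update state per record
        total += value
        count += 1
        yield count, total / count       # result available after every record

readings = [3.0, 5.0, 4.0, 8.0]
print(batch_average(readings))           # 5.0, produced once at the end
for n, avg in stream_average(readings):
    print(n, avg)                        # running results as data arrives
```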
ISBN: (Print) 9783037858967
Nowadays, discrete manufacturing enterprises suffer from a bottleneck in capturing and collecting shop-floor production data. Bar code systems usually lead to inaccurate production data. In this situation, radio frequency identification (RFID) technology is introduced into discrete manufacturing enterprises to track the real-time production process. In this research, an RFID-based enterprise system architecture is presented. Under this architecture, and addressing the key issue in RFID application, a data processing model is proposed to manage RFID raw data and transform them into enterprise-level information. Moreover, a data filter based on multiple rules is constructed to support the data processing model. Experimental results show that the filter can effectively process and filter the huge amount of RFID data by using multiple filtering rules.
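To illustrate the multi-rule filtering idea, the sketch below chains simple rules so that the output of one rule feeds the next; the specific rules shown (deduplication and a tag whitelist) are examples, not the paper's rule set.

```python
# Sketch of a multi-rule RFID data filter: each rule is a small transform over
# an event stream, and raw tag events pass through the chain in order.
from typing import Callable, Iterable, Iterator, List

Event = dict
Rule = Callable[[Iterator[Event]], Iterator[Event]]

def deduplicate(events: Iterator[Event]) -> Iterator[Event]:
    """Drop repeated (tag, station) pairs, keeping only the first occurrence."""
    seen = set()
    for e in events:
        key = (e["tag"], e["station"])
        if key not in seen:
            seen.add(key)
            yield e

def whitelist(valid_tags: set) -> Rule:
    """Build a rule that keeps only events from known tags."""
    def rule(events: Iterator[Event]) -> Iterator[Event]:
        return (e for e in events if e["tag"] in valid_tags)
    return rule

def apply_rules(events: Iterable[Event], rules: List[Rule]) -> List[Event]:
    stream: Iterator[Event] = iter(events)
    for rule in rules:              # chain the rules: output of one feeds the next
        stream = rule(stream)
    return list(stream)

raw = [{"tag": "A1", "station": "S1"}, {"tag": "A1", "station": "S1"},
       {"tag": "ZZ", "station": "S1"}, {"tag": "B2", "station": "S2"}]
print(apply_rules(raw, [deduplicate, whitelist({"A1", "B2"})]))
# -> [{'tag': 'A1', 'station': 'S1'}, {'tag': 'B2', 'station': 'S2'}]
```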