ISBN (print): 9781728124766
The amount of data stored through online storage systems in cloud computing is growing very large, so problems may arise in processing such huge data. This study therefore presents a new method model in which the data are analyzed using an integrated analysis model, the accuracy of the big data is measured using a hierarchy of grid partition (HGP), and the design is specified with the Unified Modeling Language (UML). The big-data measurement model in this framework aims to meet the requirements of the measured object entity and to appraise the accuracy of the data. Further processing is needed to support the results of this method model in order to obtain accurate test results. The working pattern of this big-data measurement method model relies on the integration of data stored on a cloud server or a database management system server.
ISBN (digital): 9783907144022
ISBN (print): 9781728188133
This work presents an innovative application of the Markov Decision Process (MDP) to a medium-term mining logistics planning problem considering the mine-to-client supply chain. We implemented three distinct algorithms based on state-of-the-art approaches to solving large-scale problems, and compared their results. Furthermore, we combined all three variants in a single novel algorithm that attained fast convergence and may be an alternative to circumvent the curse of dimensionality underlying large-scale problems.
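The abstract does not give the authors' algorithms, but the MDP machinery it builds on can be illustrated with plain value iteration on a toy problem. Everything below (state and action counts, random transition probabilities `P`, rewards `R`, discount `gamma`) is a hypothetical stand-in, not the paper's mine-to-client model:

```python
import numpy as np

# Illustrative only: a tiny generic MDP solved by value iteration.
n_states, n_actions = 4, 2
rng = np.random.default_rng(0)
# P[s, a] is a probability distribution over next states s'
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))
R = rng.uniform(0, 1, size=(n_states, n_actions))  # immediate rewards
gamma = 0.95                                        # discount factor

V = np.zeros(n_states)
for _ in range(1000):
    Q = R + gamma * P @ V      # Q[s, a] = R[s, a] + gamma * sum_s' P[s, a, s'] * V[s']
    V_new = Q.max(axis=1)      # Bellman optimality backup
    if np.max(np.abs(V_new - V)) < 1e-8:
        break                  # converged
    V = V_new
policy = Q.argmax(axis=1)      # greedy policy w.r.t. the converged values
```

The "curse of dimensionality" the abstract mentions is visible here: the `P` array grows as states x actions x states, which is why large-scale problems need the approximate approaches the authors compare.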
The data mining process requires a data set that can be used to determine specific patterns and gain new knowledge. Large data sets (big data) require special methods to obtain effective results. This study uses big data related to the earthquakes in Lombok. Earthquake research, especially in Lombok, is needed because Lombok lies on three active plates in Indonesia, so the danger of earthquake damage can be minimized. Earthquake data obtained from the Geophysical Station (BMKG) of Mataram have different and complex characteristics, so an appropriate method is needed, namely the non-parametric Multivariate Adaptive Regression Spline (MARS). A backward stepwise algorithm within the Conic Quadratic Programming (CQP) framework of MARS, referred to as CMARS (Conic Multivariate Adaptive Regression Splines), is used to optimize the results. The conclusions of this study are: (1) a mathematical model with a total of 12 basis functions (BF) contributes to the prediction analysis of the PGA dependent variable; (2) the contributions of the independent variables to the PGA value are the epicenter distance (Repi) at 100% and the magnitude (Mw) at 31.08608%, while the temperature at the event location (SUHU) contributes 5.48525% and the depth (Depth) 3.52988%; (3) the areas with the highest earthquake hazard levels, in order of vulnerability, are Malaka, Genggelang, Tegal Maja, Senggigi and Mangsit.
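The basis functions (BF) that MARS/CMARS builds its model from are reflected hinge ("hockey-stick") functions. The following sketch shows what such a basis pair looks like and how a model is a linear combination of them; the knot location, coefficients, and data are hypothetical, and the real study selects knots and terms with a backward stepwise search under the CQP framework:

```python
import numpy as np

def hinge(x, t):
    """Reflected pair of MARS hinge basis functions: max(0, x - t), max(0, t - x)."""
    return np.maximum(0.0, x - t), np.maximum(0.0, t - x)

# Toy MARS-style model: y = b0 + b1*max(0, x - 2) + b2*max(0, 2 - x)
x = np.linspace(0, 4, 5)               # e.g. a predictor such as epicentral distance
bf_pos, bf_neg = hinge(x, 2.0)         # knot at t = 2 (hypothetical)
y = 1.0 + 0.5 * bf_pos - 0.3 * bf_neg  # linear combination of basis functions
```

A fitted CMARS model for PGA is simply a longer sum of such terms (12 basis functions in this study), possibly including products of hinges for interactions between predictors.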
ISBN (print): 9781728124766
Cumulonimbus clouds are one of the early formations that lead to a small-scale tornado known locally in Indonesia as a waterspout. Cumulonimbus clouds show some irregular and undefined phenomena called chaos. However, to make sure that a chaotic phenomenon will lead to the occurrence of tornadoes, some criteria and characteristics should be fulfilled. One such criterion is the extreme heat caused by the formation of cumulonimbus clouds. On the gray edges of cumulonimbus clouds there are conditions that characterize the image, such as the wavelength, frequency, and intensity of the colors in cumulonimbus cloud images, as well as different levels of brightness and darkness. In addition, the color of the image can also be used as a basis to define the parameters related to the onset conditions of a tornado. In this research, an edge detection algorithm is used to obtain the gray edge boundary, with the aim of extracting the intensification of irregular patterns in cumulonimbus clouds in order to automatically predict the occurrence of tornadoes. The results of this research show that the edge detection approach is a promising technique for predicting tornado occurrence.
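The abstract does not name a specific edge detector, so as an illustration of the kind of gray-edge extraction described, here is a minimal Sobel gradient-magnitude detector on a synthetic grayscale image (the kernels are the standard Sobel pair; the image is a stand-in, not cloud data):

```python
import numpy as np

def sobel_edges(img):
    """Gradient magnitude via Sobel kernels; strong responses mark intensity edges."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal gradient
    ky = kx.T                                                          # vertical gradient
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    return np.hypot(gx, gy)

# Synthetic image: dark left half, bright right half -> one vertical edge
img = np.zeros((8, 8))
img[:, 4:] = 255.0
mag = sobel_edges(img)   # magnitude peaks along the brightness boundary
```

On a real cumulonimbus photograph the same magnitude map would trace the gray cloud boundary, from which irregular-pattern features can then be extracted.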
ISBN (digital): 9781728108582
ISBN (print): 9781728108599
In spite of clinical notes in Electronic Health Records (EHR) providing abundant information about patient health, effective modeling of clinical notes remains in its infancy. A patient's clinical notes correspond to a sequence of free-form texts generated by health care professionals over time, with each note in turn containing a sequence of words. Additionally, notes are accompanied by external attributes at multiple layers, such as the time at which each note was created (note level) or the demographics of the patient (patient level). Thus, EHR notes correspond to a nested structure of text sequences augmented with external multi-layer attributes. To model this complex problem, we propose an Attributed Hierarchical Attention model, named HAC-RNN, that integrates multiple RNN layers encoding nested sequential notes with contextual and temporal attention layers conditioned on the external attributes. While the bottom layer of HAC-RNN is responsible for contextual summarization of the note content, the top layer combs through the entire timeline of notes to focus on those which are most relevant. These attention layers, each conditioned on layer-specific hierarchical attributes, allow personalized predictions. We evaluate HAC-RNN on three real-world medical tasks, detecting in-hospital acquired infections and predicting patient mortality, using the critical care database MIMIC-III. Our results demonstrate that our model significantly outperforms state-of-the-art techniques on all tasks.
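The key mechanism, attention over a timeline of note encodings conditioned on external attributes, can be sketched in a few lines. This is a toy illustration in the spirit of HAC-RNN's temporal attention layer, far simpler than the model itself; the dimensions, random weights, and the concatenation-based scoring form are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
T, d, a = 5, 8, 3                 # notes in the timeline, embedding dim, attribute dim
notes = rng.normal(size=(T, d))   # one encoded vector per note (bottom-layer RNN output)
attrs = rng.normal(size=(T, a))   # note-level attributes, e.g. creation time features
W = rng.normal(size=(d + a, 1))   # scoring weights over the joined [note; attribute] vector

# Attribute-conditioned relevance score per note, then softmax over the timeline
scores = np.concatenate([notes, attrs], axis=1) @ W   # shape (T, 1)
weights = np.exp(scores - scores.max())               # numerically stable softmax
weights /= weights.sum()
patient_repr = (weights * notes).sum(axis=0)          # attention-weighted patient summary
```

Because the attributes enter the score, two patients with identical note text but different timing or demographics get different attention weights, which is what makes the predictions personalized.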
This study developed an application for official letter management in Sukasada 1 Public High School. The application was developed to contribute to government programs for improving public services, especially correspondence. In the letter classification process, the researchers implemented the TF-IDF method, whose working procedure includes conversion, Optical Character Recognition (OCR), filtering, tokenizing, and classification. Conversion turns an official letter into digital form (jpg); OCR then extracts the text of the letter from that image, after which filtering and tokenizing are carried out. Classification groups a letter into its category by calculating the cosine similarity between the letter being tested and the letters in the system. A confusion matrix was used to test the accuracy of the classification results. The results of the study are web-based and mobile-based applications. Operators and admins use the web application to manage official letters at SMAN 1 Sukasada, while the mobile application helps the principal access mail data from a smartphone. The system was able to group letters with 78% accuracy (good), 77% precision, and 77% recall.
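The TF-IDF plus cosine-similarity classification step described above can be sketched as follows. The two category documents, their wording, and the query letter are hypothetical examples; in the real system the text arrives from the OCR, filtering, and tokenizing stages:

```python
import math
from collections import Counter

# Hypothetical reference letters, one per category
corpus = {
    "invitation": "meeting invitation school committee meeting",
    "assignment": "teacher assignment duty schedule",
}

def tfidf(tokens, doc_token_sets):
    """TF-IDF vector as a dict, with smoothed IDF over the reference documents."""
    n = len(doc_token_sets)
    tf = Counter(tokens)
    return {t: tf[t] * math.log((1 + n) / (1 + sum(t in d for d in doc_token_sets)))
            for t in tf}

def cosine(u, v):
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

doc_sets = [set(text.split()) for text in corpus.values()]
vectors = {label: tfidf(text.split(), doc_sets) for label, text in corpus.items()}

# Classify an incoming (tokenized) letter by highest cosine similarity
query = tfidf("school committee meeting invitation".split(), doc_sets)
best = max(vectors, key=lambda label: cosine(query, vectors[label]))
```

Predicted labels collected this way against known categories are exactly what fills the confusion matrix used for the 78% accuracy figure.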
Falsification and embezzlement of personal data were still found in several cases in the past year. The reason is that personal data in physical form are easily manipulated and difficult to distinguish from the original. The most detrimental impact occurs when a person's personal data is used for credit application fraud in the banking industry. Blockchain technologies, one of which is Ethereum, allow the use of contracts as rules that must be fulfilled by the parties involved. All stored transactions are perpetual (they cannot be deleted or changed), easy to audit, transparent, and distributed to each participating node. This study aimed to develop a smart contract for personal data transactions, with a case study of credit submission at a bank. The authors developed a trial application using the prototype method during assessment. Assessment was done with the black-box testing method in a lab setting, with 10 credit submission records to be transacted; all of them could be transacted and stored in the smart contract. Interviews yielded the opinion that blockchain technology can be used to store personal data and handle credit submissions at a bank. The analysis of the assessments and interviews concludes that blockchain technology can be used as a medium to store personal data and secure credit applications. For future research, transaction testing can be done on testnet networks with several blocks of data changed.
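The tamper-evidence property the abstract relies on, that stored transactions cannot be changed unnoticed, comes from hash-linking blocks. The following toy ledger illustrates the idea; it is a generic sketch, not Ethereum's actual data structures or the study's contract, and the applicant records are hypothetical:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over a canonical JSON encoding of the block."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def verify(chain):
    """True iff every block's prev_hash still matches its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, {"applicant": "A-001", "amount": 5000})   # hypothetical submission
append_block(chain, {"applicant": "A-002", "amount": 12000})
ok_before = verify(chain)           # links intact
chain[0]["data"]["amount"] = 999    # tamper with an earlier record
ok_after = verify(chain)            # the hash link no longer matches
```

In a real network, consensus across the participating nodes is what makes rewriting the chain after tampering infeasible rather than merely detectable.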
ISBN (print): 9781728134376
This research aims to evaluate a clinical management information system called E-CLINIC, which is integrated with the Primary Care (PCare) application provided by the Indonesian social health insurance organization BPJS Kesehatan to first-level health care facilities (FKTP). The evaluation is intended to measure the maturity level of the use of the two integrated applications using the COBIT 4.1 framework. The integration of the clinic's management information system with PCare is considered the main resource of strategic value for managing information effectively and efficiently to achieve organizational goals. The results show that operations run in accordance with the organization's business objectives; this can be seen from the maturity level, which reaches level 3 (Defined), a condition in which the organization has formal, written standard procedures that have been socialized to all levels of management and employees to be obeyed and followed in daily activities.
Foreground segmentation is one of the moving object detection techniques in computer vision applications. To date, modern moving object detection methods require complex background modeling and threshold tuning to confro...
E-Report is a web-based application developed to make it easier to prepare students' assessment reports in the form of rating reports, competency achievement reports, and ledgers. Even so, many high schools/vocational schools in Bali have not implemented E-Report. The implementation of Cognitive Walkthrough techniques, Heuristic Evaluation, and the User Experience Questionnaire in this research aimed to evaluate user experience in terms of effectiveness, efficiency, and user satisfaction, and to produce recommendations for improvement. Effectiveness and efficiency were calculated with the Cognitive Walkthrough (CW) technique and user satisfaction with the User Experience Questionnaire (UEQ), while recommendations for improvement were obtained from experts who examined usability aspects using the Heuristic Evaluation technique and the Cognitive Walkthrough data. Based on the results, from the users' perspective E-Report was ineffective and inefficient, while in terms of user satisfaction, respondents were satisfied with the E-Report application. This is because one task had a very high error and failure rate in one of the features, and poor clarity of results, which lowers user satisfaction, can affect processing time. Therefore, the evaluation of user experience based on usability aspects shows that E-Report has not met the criteria for a product with good usability. Improvements were made on the start page, the values page, and the value results page through a photo frame. Identifying the problems from these three sources can further optimize usability and improve the user experience of E-Report.