The impact of infectious disease on human populations is a function of many factors, including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must span the complete disease lifecycle: emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. It automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling to analyze disease dynamics and test interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing of social media, news feeds, and websites was used for information extraction, biosurveillance, and situational awareness. Classification algorithms (support vector machines, random forests, and boosting) were used for disease prediction.
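As a minimal illustration of one of the classification techniques this abstract names (boosting), the following sketch trains AdaBoost with one-dimensional decision stumps on a toy surveillance signal. The data, feature, and labels are entirely hypothetical, not from the paper; a production system would use a library implementation over many features.

```python
import math

def train_stumps(X, y, rounds=10):
    """AdaBoost with 1-D threshold stumps; X is a list of floats, y in {-1, +1}."""
    n = len(X)
    w = [1.0 / n] * n                      # uniform sample weights
    ensemble = []                          # (alpha, threshold, polarity)
    thresholds = sorted(set(X))
    for _ in range(rounds):
        best = None
        for t in thresholds:
            for pol in (1, -1):
                # stump predicts +pol if x >= t, else -pol
                err = sum(wi for wi, xi, yi in zip(w, X, y)
                          if (pol if xi >= t else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)    # weight of this stump
        ensemble.append((alpha, t, pol))
        # re-weight: misclassified samples gain weight for the next round
        w = [wi * math.exp(-alpha * yi * (pol if xi >= t else -pol))
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (p if x >= t else -p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1

# hypothetical weekly surveillance index: high values indicate outbreak weeks
X = [0.1, 0.3, 0.4, 0.9, 1.1, 1.4]
y = [-1, -1, -1, 1, 1, 1]
model = train_stumps(X, y)
print([predict(model, x) for x in X])
```

The same train/predict interface would apply to the other classifiers the abstract lists (SVMs, random forests); boosting is shown here only because it is the simplest to sketch self-contained.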
Project delays are among the most pressing challenges faced by the construction sector, owing to the sector's complexity and the interdependence of its inherent delay risk sources. Machine learning offers a set of techniques well suited to such complex systems; however, adoption of these techniques within the construction sector remains at an early stage. The goal of this study was to identify and develop machine learning models that facilitate accurate project delay risk analysis and prediction using objective data sources. Relevant delay risk sources and factors were first identified, and a multivariate data set of previous projects' time performance and delay-inducing risk sources was compiled. The complexity and interdependence of the system were then uncovered through exploratory data analysis. Accordingly, two suitable machine learning models, based on decision tree and naive Bayesian classification algorithms, were identified and trained on the data set to predict the extent of project delays. Finally, the predictive performance of both models was evaluated through cross-validation tests, and the models were further compared using machine-learning performance indices. The evaluation results indicated that the naive Bayesian model provides better predictive performance for the data set examined. Ultimately, the work presented herein harnesses the power of machine learning to facilitate evidence-based decision making while inherent risk factors are active, interdependent, and dynamic, thus empowering proactive project risk management strategies.
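To make the naive Bayesian approach concrete, here is a minimal Gaussian naive Bayes classifier built from scratch. The two features and the on-time/delayed labels are invented for illustration; the paper's actual risk sources and data set are not reproduced here.

```python
import math
from collections import defaultdict

def fit_gnb(rows, labels):
    """Gaussian naive Bayes: per-class prior plus per-feature mean/variance."""
    by_class = defaultdict(list)
    for row, lab in zip(rows, labels):
        by_class[lab].append(row)
    model, total = {}, len(rows)
    for lab, grp in by_class.items():
        stats = []
        for col in zip(*grp):                       # one column per feature
            m = sum(col) / len(col)
            v = sum((x - m) ** 2 for x in col) / len(col) + 1e-9  # smoothed variance
            stats.append((m, v))
        model[lab] = (len(grp) / total, stats)      # (prior, feature stats)
    return model

def predict_gnb(model, row):
    """Pick the class with the highest log-posterior under independence."""
    best, best_lp = None, None
    for lab, (prior, stats) in model.items():
        lp = math.log(prior)
        for x, (m, v) in zip(row, stats):
            lp += -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
        if best_lp is None or lp > best_lp:
            best, best_lp = lab, lp
    return best

# hypothetical per-project features: (number of scope changes, weather delay days)
rows = [(1, 2), (2, 1), (0, 3), (8, 9), (9, 7), (7, 10)]
labels = ["on-time", "on-time", "on-time", "delayed", "delayed", "delayed"]
model = fit_gnb(rows, labels)
print(predict_gnb(model, (8, 8)))
```

The independence assumption is notable here: the study emphasizes that delay risk sources are interdependent, so naive Bayes outperforming the decision tree on this data set is an empirical finding, not a structural guarantee.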
Hospitality venues traditionally use historical customer data in their customer relationship management systems, but they can now also collect real-time data and apply automated procedures to make dynamic decisions and predictions about customer behavior. Machine learning is one such automated process, creating insights into the cocreation of value through dynamic customer engagement. To demonstrate the merits of automation, machine learning was implemented at a major hospitality venue and compared with traditional methods for identifying what customers value in a loyalty program. The results show that machine learning processes are superior at identifying customers who find value in specific promotions. This research deepens practical and theoretical understanding of machine learning in the customer engagement-to-value loyalty chain and of the customer engagement construct within a dynamic customer engagement model.
An efficient and rapid workflow is presented to estimate the recovery performance of an existing vertical-well, pattern-based waterflood recovery design using knowledge management and reservoir engineering in a collaborative manner. The knowledge management tool is used to gather production data and calculate pattern-based recoveries and injection volumes by defining pattern boundaries and allocating annual well injection/production volumes in a systematic manner. Classical reservoir engineering forecasting methods, namely a combination of oil-cut-versus-cumulative-recovery performance curves and decline curve analyses, are applied to forecast the performance of the waterflood pattern of interest. Future performance is quantified by extrapolating the established oil-cut-versus-recovery trend for each pattern. Time is attached to the performance by introducing liquid-rate constraints. Forecasting with both constant and declining liquid rates differentiates the impact of deteriorating reservoir pressure and oil-cut trends on individual pattern oil-rate forecasts, thus defining the current efficiency of each pattern.
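The forecasting step described above can be sketched as follows: given a fitted oil-cut-versus-cumulative-recovery trend, step a pattern forward at a constant liquid rate until an economic oil-cut limit is reached. The semilog trend form, the fit constants, the rate, and the economic limit below are all hypothetical assumptions for illustration, not the paper's method or data.

```python
import math

def forecast_pattern(f0, b, liquid_rate, f_econ, dt=30.4):
    """
    Step a pattern forward at a constant liquid rate (bbl/day) using an
    assumed semilog oil-cut trend: oil cut f = f0 * exp(-b * Np), Np in Mbbl.
    Returns (months, incremental oil in Mbbl) until the economic cut f_econ.
    """
    Np, f, months = 0.0, f0, 0
    while f > f_econ:
        oil = liquid_rate * f * dt / 1000.0   # Mbbl of oil this month
        Np += oil
        f = f0 * math.exp(-b * Np)            # move down the fitted trend
        months += 1
    return months, Np

# hypothetical fit: 40% oil cut today, decline constant b from history,
# 500 bbl/day pattern liquid handling capacity, 2% economic limit
months, Np = forecast_pattern(f0=0.40, b=0.02, liquid_rate=500, f_econ=0.02)
print(months, round(Np, 1))
```

Running the same loop with a declining liquid-rate schedule instead of a constant one is what separates the pressure-deterioration effect from the oil-cut effect, as the abstract describes.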
ISBN: (Print) 9781509002870
Donor selection for hematopoietic stem cell transplant often requires physicians to manually select 3 to 5 donors from a list of hundreds of genetically compatible donors identified by HLA-based matching algorithms. The decision process is complicated by a lack of strict guidelines governing a "secondary" selection process based on non-HLA donor attributes. Our research aims to model this "secondary" decision process to help physicians choose the right donors, based on donor attributes and historical choice behavior. The proposed black-box models will help improve selection consistency.
ISBN: (Print) 9781479919598
There is growing interest in data-analytic modeling for prediction and/or detection of epileptic seizures from EEG recordings of brain activity [1-10]. Even though there is clear evidence that many patients show changes in the EEG signal prior to seizures, development of robust seizure prediction methods remains elusive [1]. We argue that the main obstacle to effective EEG-based predictive models is an apparent disconnect between clinical considerations and data-analytic modeling assumptions. We present an SVM-based system for seizure prediction in which design choices and performance metrics are clearly related to clinical objectives and constraints. This system achieves very accurate prediction of preictal and interictal EEG segments in dogs with naturally occurring epilepsy. However, our empirical results suggest that good prediction performance may be possible only if the training data set contains sufficiently many preictal segments, i.e., at least 6-7 seizure episodes.
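One way to tie performance metrics to clinical objectives, as this abstract advocates, is to score a classifier on segment-level sensitivity (fraction of preictal segments flagged) against time-in-warning (the alarm burden a patient would live under). The sketch below assumes generic classifier decision values; the labels and scores are invented, and this is not the paper's specific metric definition.

```python
def clinical_metrics(labels, scores, threshold):
    """
    Segment-level metrics for a preictal/interictal classifier.
    labels: 1 = preictal, 0 = interictal; scores: classifier outputs
    (e.g., SVM decision values); a segment is flagged when score >= threshold.
    Returns (sensitivity on preictal segments, time-in-warning fraction).
    """
    flagged = [s >= threshold for s in scores]
    preictal = [f for f, lab in zip(flagged, labels) if lab == 1]
    sensitivity = sum(preictal) / len(preictal)     # preictal segments caught
    time_in_warning = sum(flagged) / len(flagged)   # overall alarm burden
    return sensitivity, time_in_warning

# hypothetical decision values for 8 EEG segments (3 preictal, 5 interictal)
labels = [1, 1, 1, 0, 0, 0, 0, 0]
scores = [0.9, 0.7, 0.2, 0.6, 0.1, -0.3, 0.05, -0.1]
sens, tiw = clinical_metrics(labels, scores, threshold=0.5)
print(sens, tiw)
```

Sweeping the threshold trades one quantity against the other, which is exactly the kind of clinically grounded design choice the abstract argues should drive model development rather than raw classification accuracy.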
ISBN: (Print) 9780819499738
Integrated circuit (IC) technology is changing in multiple ways: 193i to EUV exposure, planar to non-planar device architectures, and single-exposure lithography to multiple-exposure and DSA patterning. Critical dimension (CD) control requirements are becoming stringent and more exhaustive: CD and process windows are shrinking, three-sigma CD control of < 2 nm is required in complex geometries, and metrology uncertainty of < 0.2 nm is required to achieve the target CD control for advanced IC nodes (e.g., the 14 nm, 10 nm, and 7 nm nodes). There are fundamental capability and accuracy limits in all metrology techniques that are detrimental to the success of advanced IC nodes. Reference or physical CD metrology is provided by CD-AFM and TEM, while workhorse metrology is provided by CD-SEM, scatterometry, and model-based infrared reflectometry (MBIR). Precision alone is not sufficient moving forward, and no single technique is sufficient to ensure the required patterning accuracy. The accuracy of CD-AFM is approximately 1 nm, and precision in TEM is poor due to limited statistics. CD-SEM, scatterometry, and MBIR need to be calibrated against reference measurements to ensure the accuracy of patterned CDs and patterning models. There is a dire need for measurements with < 0.5 nm accuracy, and the industry currently lacks that inline measurement capability. Aware of these capability gaps, we have employed data processing techniques and predictive data analytics, along with patterning simulation, metrology models, and data integration techniques, in selected applications, demonstrating the potential and practicality of such an approach for enhancing CD metrology accuracy. Data from multiple metrology techniques have been analyzed in multiple ways to extract information with associated uncertainties and integrated to extract more accurate CD and profile information for the structures. This paper presents the optimization…
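A simple baseline for the data-integration step described above is inverse-variance weighting: combine independent CD readings from different techniques, each with its own stated uncertainty, into a single estimate whose uncertainty is smaller than any individual input. The readings and uncertainties below are hypothetical placeholders, and the paper's actual integration scheme (which also involves simulation and metrology models) is more elaborate than this sketch.

```python
def fuse_measurements(values, sigmas):
    """
    Inverse-variance weighted combination of independent CD measurements,
    each with its own 1-sigma uncertainty in nm.
    Returns (fused CD, fused 1-sigma uncertainty).
    """
    weights = [1.0 / s ** 2 for s in sigmas]       # trust ~ 1 / variance
    fused = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5            # below every single input
    return fused, sigma

# hypothetical CD readings (nm) for one feature from three techniques:
# CD-AFM (reference), CD-SEM, and scatterometry, with assumed uncertainties
values = [14.8, 15.4, 15.1]
sigmas = [1.0, 0.5, 0.4]
cd, u = fuse_measurements(values, sigmas)
print(round(cd, 2), round(u, 2))
```

The weighting assumes the techniques are independent and unbiased after calibration; systematic offsets between techniques, which the abstract highlights as the real problem, must be removed against the reference metrology before fusion is meaningful.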