ISBN: (print) 9781728104041
The proceedings contain 28 papers. The topics discussed include: identification of the onset of dementia of older adults in the age of Internet of Things; applying Internet of Things and machine learning for personalized healthcare: issues and challenges; identification of illegal forum activities inside the dark net; kernel logistic regression: a robust weighting for imbalanced classes with noisy labels; video-based measurement of physiological parameters using peak-to-valley method for minimization of initial dead zone; domain knowledge driven FRBR and cataloguing for the future libraries; a review of strengths and weaknesses of spatiotemporal data analysis techniques; and using electronic health records and machine learning to make medical-related predictions from non-medical data.
The 17th International Symposium on Novel and Nano Materials (ISNNM) was held in Jeju, Korea, from 14 to 18 November 2022, and the proceedings for the session "Integrated Computer-Aided Process Engineering (ICAPE)" were published in February 2023 as a special issue of Materials Transactions (Vol. 64, No. 9). Following the first special issue, which covered the ICAPE session at the International Symposium on Innovation in Materials Processing (ISIMP), this second special issue again presents a variety of topics, including computational materials science, data-driven optimization, and experimental validation of optimized processes. This article offers a concise overview of several key topics presented in the second special issue, including macro-scale numerical analysis through finite element methods (FEM), microstructure simulations using phase-field modelling (PFM), and various optimization methods such as machine learning (ML), artificial intelligence (AI), and design of experiments (DoE).
ISBN: (print) 9781728104041
The proceedings contain 17 papers. The topics discussed include: Spark-based machine learning pipeline construction method; implementation of Chinese reader aid for visually-impaired by using neural network and text summarization technologies; a new percentage of sales method for forecasting additional funds needed; an artificially intelligent wearable device for dementia patients; development of IoT-based safety management method through an analysis of structural characteristics and risk factors for industrial valves; analysis of machine learning techniques for credit card fraud detection; social content mining in social networks; using knowledge discovery techniques to support tutoring in an open world intelligent game-based learning environment; and a clustering approach for outliers detection in a big point-of-sales database.
Authors: Meng, Xuran; Cao, Yuan; Zou, Difan
Affiliations: Umich, Dept Biostat, Ann Arbor, MI 48109, USA; HKU, Dept Stat & Actuarial Sci, Hong Kong 999077, Peoples R China; HKU, Inst Data Sci, Dept Comp Sci, Hong Kong 999077, Peoples R China
Gradient regularization, as described in Barrett and Dherin (in: International Conference on Learning Representations, 2021), is a highly effective technique for promoting flat minima during gradient descent. Empirical evidence suggests that this regularization technique can significantly enhance the robustness of deep learning models against noisy perturbations while also reducing test error. In this paper, we explore per-example gradient regularization (PEGR) and present a theoretical analysis that demonstrates its effectiveness in improving both test error and robustness against noise perturbations. Specifically, we adopt a signal-noise data model from Cao et al. (Adv Neural Inf Process Syst 35:25237-25250, 2022) and show that PEGR can learn signals effectively while suppressing noise memorization. In contrast, standard gradient descent struggles to distinguish the signal from the noise, leading to suboptimal generalization performance. Our analysis reveals that PEGR penalizes the variance of pattern learning, thus effectively suppressing the memorization of noise from the training data. These findings underscore the importance of variance control in deep learning training and offer useful insights for developing more effective training approaches.
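To illustrate the idea behind per-example gradient regularization, the penalty can be sketched for a simple linear model, where each example's loss gradient has a closed form. This is a minimal numerical sketch, not the paper's construction; the model, data, and hyperparameters (`lam`, `lr`) are illustrative assumptions.

```python
import numpy as np

def pegr_objective(w, X, y, lam=0.1):
    """Empirical loss plus the mean squared per-example gradient norm."""
    r = X @ w - y                              # per-example residuals
    per_example_loss = r ** 2                  # L_i = (w . x_i - y_i)^2
    # grad_w L_i = 2 r_i x_i, so ||grad_w L_i||^2 = 4 r_i^2 ||x_i||^2
    grad_norms_sq = 4.0 * r ** 2 * np.sum(X ** 2, axis=1)
    return per_example_loss.mean() + lam * grad_norms_sq.mean()

def pegr_step(w, X, y, lam=0.1, lr=0.05):
    """One gradient-descent step on the PEGR objective (closed-form gradient)."""
    r = X @ w - y
    grad_loss = 2.0 * (X * r[:, None]).mean(axis=0)
    grad_penalty = 8.0 * (X * (r * np.sum(X ** 2, axis=1))[:, None]).mean(axis=0)
    return w - lr * (grad_loss + lam * grad_penalty)

# synthetic signal-plus-noise regression data (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))
y = X @ np.array([1.0, -1.0, 0.5, 0.0]) + 0.1 * rng.normal(size=32)

w = np.zeros(4)
before = pegr_objective(w, X, y)
for _ in range(200):
    w = pegr_step(w, X, y)
after = pegr_objective(w, X, y)
```

Because the per-example gradient norms enter the objective directly, descent steps trade a small increase in average fit for lower variance across examples, which is the mechanism the paper analyzes.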
Machining is an important type of manufacturing process. Because machining utilizes a tool to cut raw materials, tool fatigue is the main cause of degradation in productivity and efficiency. Hence, tool wear and remaining useful life (RUL), which is related to tool fatigue, should be managed to optimize the machining process. In this study, a framework for tool wear and RUL prediction using monitoring data and machine learning methods is proposed. First, real-world machining data are collected from sensors in the machines through a data acquisition system. Next, feature engineering, including time series feature extraction and correlation coefficient-based feature selection, is employed to construct a concise set of important features from the raw sensor data. Subsequently, tool wear is predicted using machine learning-based regression methods. Finally, RUL is predicted using iterative piecewise linear regression as real-time forecasting, which allows adaptation to changes in tool wear progression patterns and decision thresholds that frequently occur in real-world machining processes. The experimental results reveal that (1) the proposed method outperformed the benchmark methods, (2) important features were selected using feature engineering, and (3) the proposed method, used without cutting force data, showed comparable results. Furthermore, we examined the flexibility of the proposed method with an example of a safe RUL prediction by manipulating the decision threshold to reduce the probability of tool failure.
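The feature-engineering and forecasting steps described above can be sketched roughly as follows. The feature set, correlation threshold, wear limit, and synthetic sensor data are hypothetical stand-ins, not the paper's actual choices.

```python
import numpy as np

def extract_features(window):
    """A few simple time-series features from one raw sensor window
    (a hypothetical stand-in for a fuller feature-extraction step)."""
    return np.array([
        window.mean(),
        window.std(),
        np.sqrt(np.mean(window ** 2)),    # RMS energy
        np.abs(np.diff(window)).mean(),   # mean absolute change
    ])

def select_by_correlation(F, wear, threshold=0.5):
    """Keep features whose |Pearson correlation| with measured wear exceeds the threshold."""
    corrs = np.array([np.corrcoef(F[:, j], wear)[0, 1] for j in range(F.shape[1])])
    return np.where(np.abs(corrs) > threshold)[0]

def rul_from_linear_fit(cuts, wear_est, wear_limit):
    """Extrapolate the latest linear wear trend to the decision threshold
    and return the remaining number of cuts (the RUL forecast)."""
    slope, _ = np.polyfit(cuts[-10:], wear_est[-10:], 1)
    return (wear_limit - wear_est[-1]) / slope

# synthetic data: wear grows over 50 cuts and vibration amplitude tracks it
rng = np.random.default_rng(1)
wear = np.linspace(0.0, 0.3, 50)
windows = [w * rng.normal(size=256) + 0.01 * rng.normal(size=256) for w in wear]
F = np.vstack([extract_features(s) for s in windows])

kept = select_by_correlation(F, wear)          # wear-correlated features survive
rul = rul_from_linear_fit(np.arange(50.0), wear, wear_limit=0.4)
```

Re-fitting the linear trend on a sliding window at each new cut is what makes the forecast adapt when the wear progression or the decision threshold changes.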
Machine learning (ML) and computer vision (CV) play a crucial role in precision agriculture (PA) by enabling data-driven solutions that enhance crop yields, resource efficiency, and sustainability. These technologies have transformed PA by addressing complex agricultural challenges through real-time monitoring and predictive analytics. This systematic review aims to investigate the recent advancements in ML and CV methodologies within PA, focusing on crop, harvesting, soil, and water management. Guided by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, the study analyzes 213 articles from the past five years, providing insights into publication trends, key application areas, and the distribution of research efforts across various crops and vision data types. The review highlights the predominant use of Convolutional Neural Networks (CNNs) in PA, particularly in crop detection, disease and pest detection, and weed detection. Emerging deep learning techniques, such as Vision Transformers (ViTs) and Generative Adversarial Networks (GANs), are also explored for their potential in visual data analysis. Findings show that maize is the most frequently studied crop, Red-Green-Blue (RGB) imagery is the primary vision data type, and self-collected datasets are preferred over public ones. Key challenges include limited access to high-quality and diverse datasets, and the need for adaptable solutions across diverse agricultural contexts. By summarizing these advancements and challenges, this review provides direction for future research and highlights opportunities to expand ML and CV applications in sustainable and efficient agricultural practices.
ISBN: (print) 9783031777301; 9783031777318
Universitat Politècnica de València (UPV) faces challenges in managing its Alfresco document repository, which contains 600,000 PDF files, of which only 100,000 are correctly categorised. Manual classification is laborious and error-prone, hindering information retrieval and advanced search capabilities. This project presents an automated pipeline that integrates optical character recognition (OCR) and machine learning to classify documents efficiently. Our approach distinguishes between scanned and digital documents, accurately extracts text, and categorises it into 51 predefined categories using models such as BERT and Random Forest (RF). By improving document organisation and accessibility, this work optimises UPV's document management and paves the way for advanced search technologies and real-time classification systems.
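A minimal sketch of the routing-and-classification idea: a text-length heuristic separates scanned from digital PDFs, and a toy nearest-centroid TF-IDF classifier stands in for the BERT/RF models (two invented categories instead of 51; all names and labels below are illustrative assumptions).

```python
import numpy as np

def is_scanned(embedded_text, min_chars=20):
    """Heuristic router: a PDF whose embedded text layer is nearly empty is
    treated as scanned and would be sent to OCR; otherwise its text is used."""
    return len(embedded_text.strip()) < min_chars

def tfidf(docs, vocab):
    """Bag-of-words TF-IDF vectors over a fixed vocabulary."""
    counts = np.array([[d.split().count(w) for w in vocab] for d in docs], float)
    df = (counts > 0).sum(axis=0)
    idf = np.log((1 + len(docs)) / (1 + df)) + 1.0
    tf = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
    return tf * idf

# two toy categories; nearest centroid stands in for the trained classifier
train = ["invoice payment total amount", "thesis chapter abstract research",
         "invoice tax amount due", "thesis university research degree"]
labels = np.array([0, 1, 0, 1])     # 0 = invoice, 1 = thesis (hypothetical)
vocab = sorted({w for d in train for w in d.split()})
X = tfidf(train, vocab)
centroids = np.vstack([X[labels == c].mean(axis=0) for c in (0, 1)])

query = "payment invoice amount"
q = tfidf([query], vocab)[0]
pred = int(np.argmin(np.linalg.norm(centroids - q, axis=1)))   # → category 0
```

In the real pipeline the scanned branch would pass page images through OCR before vectorisation, and the digital branch would use the extracted text layer directly.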
This study investigates the use of machine learning models to predict surface roughness (Ra) in milling multi-grade aluminum alloys without prior knowledge of optimal cutting parameters. A diverse milling dataset encompassing material properties and cutting parameters from various aluminum alloy grades was compiled from research articles. Four machine learning algorithms, Extreme Gradient Boosting (XGB), Random Forest (RFR), Categorical Gradient Boosting (CAT), and Gradient Boosting Regression (GBR), were employed to develop the predictive model. The dataset underwent cleaning, imputation, and outlier removal to ensure data quality. Feature engineering incorporated material properties and cutting parameters for model training. Performance metrics such as RMSE, MAPE, and R2 were used to assess the models' accuracy. The SHapley Additive exPlanations (SHAP) technique was employed to interpret the models and identify influential features. GBR achieved the highest prediction accuracy with an RMSE of 0.2507 μm, MAPE of 23.36%, and R2 of 0.8709. Thermal conductivity, feed rate, and cutting speed were consistently identified as the most influential factors, although their rankings differed slightly. This study successfully developed a GBR model for effective Ra prediction in aluminum alloy milling, supporting advancements in smart manufacturing by enabling accurate surface quality prediction and data-driven process optimization through machine learning.
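To illustrate the kind of model the study tunes, here is a from-scratch sketch of gradient-boosted regression with decision stumps on synthetic milling-style data. The features, target relation, and hyperparameters are invented for illustration and do not reproduce the paper's dataset or its tuned GBR.

```python
import numpy as np

def fit_stump(X, residual, n_thresholds=9):
    """Best single-split regression stump on the current residuals."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], np.linspace(0.1, 0.9, n_thresholds)):
            left = X[:, j] <= t
            if left.all() or not left.any():
                continue
            lv, rv = residual[left].mean(), residual[~left].mean()
            err = ((residual[left] - lv) ** 2).sum() + ((residual[~left] - rv) ** 2).sum()
            if err < best_err:
                best_err, best = err, (j, t, lv, rv)
    return best

def gbr_fit(X, y, n_rounds=50, lr=0.1):
    """Gradient boosting for squared loss: each stump fits the residuals."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_rounds):
        j, t, lv, rv = fit_stump(X, y - pred)
        pred += lr * np.where(X[:, j] <= t, lv, rv)
        stumps.append((j, t, lv, rv))
    return (y.mean(), lr, stumps)

def gbr_predict(model, X):
    base, lr, stumps = model
    pred = np.full(len(X), base)
    for j, t, lv, rv in stumps:
        pred += lr * np.where(X[:, j] <= t, lv, rv)
    return pred

# invented features standing in for feed rate, cutting speed, conductivity
rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, size=(60, 3))
Ra = 0.8 * X[:, 0] + 0.3 * X[:, 1] + 0.05 * rng.normal(size=60)

model = gbr_fit(X, Ra)
mse = np.mean((gbr_predict(model, X) - Ra) ** 2)
```

Production libraries (XGBoost, CatBoost, scikit-learn) use deeper trees, regularization, and shrinkage schedules, but the residual-fitting loop is the same mechanism.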
Quality of data and complexity of decision boundaries in high-dimensional data streams that are collected from cyber-physical power systems can greatly influence the process of learning from data and diagnosing faults in such critical systems. These systems generate massive amounts of data that overburden the system with excessive computational costs. Another issue is the presence of noise in recorded measurements that poses a challenge to the learning process, leading to a degradation in the performance of fault diagnosis. Furthermore, the diagnostic model is often provided with a mixture of redundant measurements that may divert it from learning the normal and fault distributions. This paper presents the effect of feature engineering on mitigating the aforementioned challenges in learning from data streams collected from cyber-physical systems. A data-driven fault diagnosis framework for a 118-bus power system is constructed by integrating feature selection, dimensionality reduction methods, and decision models. A comparative study is conducted accordingly to compare several advanced techniques in both domains. Dimensionality reduction and feature selection methods are compared both jointly and separately. Finally, the experiments are concluded, and a setting that enhances data quality for fault diagnosis is suggested.
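The joint use of feature selection and dimensionality reduction can be sketched as a redundancy filter followed by PCA. The correlation threshold and the synthetic measurement channels are assumptions for illustration, not the paper's setup.

```python
import numpy as np

def drop_redundant(X, threshold=0.95):
    """Greedy filter: drop any measurement channel almost perfectly
    correlated with one already kept (redundant sensor channels)."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    keep = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in keep):
            keep.append(j)
    return keep

def pca_reduce(X, k):
    """Project standardized data onto the top-k principal components."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    top = np.argsort(eigvals)[::-1][:k]
    return Z @ eigvecs[:, top]

# synthetic measurement stream: channel 1 duplicates channel 0 (scaled)
rng = np.random.default_rng(3)
f0 = rng.normal(size=200)
X = np.column_stack([f0, 2.0 * f0 + 0.01 * rng.normal(size=200),
                     rng.normal(size=200), rng.normal(size=200)])

keep = drop_redundant(X)            # the duplicated channel is filtered out
reduced = pca_reduce(X[:, keep], k=2)
```

Applying selection before reduction, as here, removes redundant channels cheaply so the decision model sees a compact, less noisy input; the paper's comparative study evaluates such combinations against using either step alone.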
ISBN: (print) 9783031803543; 9783031803550
Bariatric surgery has emerged as an effective treatment option for individuals with severe obesity, offering not only weight loss but also remarkable improvements in metabolic health and endocrine function. Efficient management of the patient's length of stay (LOS) in the hospital is critical to optimizing healthcare resources and ensuring patient well-being. The objective of this study was to analyze post-operative LOS following bariatric surgery using machine learning (ML) algorithms and determine their predictive performance. Data from 757 patients undergoing bariatric surgery from 2019 to 2022 in a single institution were collected and analyzed. The ML algorithms used included Decision Tree (DT), Random Forest (RF), and Gradient Boosted Trees (GBT). The results showed that RF and GBT had comparable accuracy (71.7% and 71.1% respectively) and outperformed DT (62.0%). RF showed better overall performance, while GBT showed higher precision for predicting shorter LOS (less than 5 days). The results highlight the potential of machine learning algorithms in predicting post-operative LOS, aiding in healthcare resource allocation and personalized patient care.
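The evaluation contrast reported above, overall accuracy versus per-class precision for short stays, can be illustrated with toy predictions. The labels and model outputs below are invented and do not come from the study's data.

```python
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of stays whose LOS class was predicted correctly."""
    return float(np.mean(y_true == y_pred))

def precision(y_true, y_pred, cls):
    """Of all stays predicted as `cls`, the fraction that really were."""
    hits = y_pred == cls
    return float(np.mean(y_true[hits] == cls)) if hits.any() else 0.0

# hypothetical labels: 0 = short stay (< 5 days), 1 = long stay
y_true   = np.array([0, 0, 0, 1, 1, 0, 1, 1, 0, 1])
rf_pred  = np.array([0, 0, 1, 1, 1, 0, 1, 0, 0, 1])   # invented RF output
gbt_pred = np.array([0, 1, 1, 1, 1, 0, 1, 1, 1, 1])   # invented GBT output

rf_acc, gbt_acc = accuracy(y_true, rf_pred), accuracy(y_true, gbt_pred)
rf_p0, gbt_p0 = precision(y_true, rf_pred, 0), precision(y_true, gbt_pred, 0)
# one model can win on overall accuracy while the other, predicting the
# short-stay class more conservatively, wins on short-stay precision
```

This is why the study reports both metrics: a model that rarely but reliably flags short stays can be more useful for discharge planning than one with slightly higher overall accuracy.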