ISBN (digital): 9798350349399
ISBN (print): 9798350349405
Explainable AI (XAI) has revolutionized the field of deep learning by empowering users to place more trust in neural network models. XAI allows users to probe the inner workings of these algorithms to elucidate their decision-making processes. The rise in popularity of XAI has led to the advent of different strategies for producing explanations, which only occasionally agree with one another. Thus, several objective evaluation metrics have been devised to decide which of these methods gives the best explanation for specific scenarios. The goal of this paper is twofold: (i) we employ the notions of necessity and sufficiency from the causal literature to develop a novel explanatory technique called SHifted Adversaries using Pixel Elimination (SHAPE), which satisfies all the theoretical and mathematical criteria of a valid explanation; (ii) we show that SHAPE is, in fact, an adversarial explanation that fools the causal metrics employed to measure the robustness and reliability of popular importance-based visual XAI methods. Our analysis shows that SHAPE outperforms popular explanatory techniques such as GradCAM and GradCAM++ in these tests and is comparable to RISE, raising questions about the sanity of these metrics and the need for human involvement for an overall better evaluation.
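For context on the causal metrics referenced above: deletion/insertion-style tests measure how a model's confidence changes as the pixels an explanation ranks most important are removed (necessity) or restored (sufficiency). The sketch below is a minimal deletion-style metric, not the paper's exact protocol; `model`, `image`, `saliency`, the step size, and the zero-pixel baseline are all placeholder assumptions.

```python
import numpy as np
import torch

def deletion_auc(model, image, saliency, target_class, steps=50):
    """Deletion-style causal metric (sketch): progressively zero out the pixels
    ranked most important by `saliency` and track the drop in the model's
    confidence for `target_class`. A lower area under the curve suggests the
    explanation highlighted pixels the prediction actually depends on.

    image:    torch tensor of shape (C, H, W), already preprocessed
    saliency: numpy array of shape (H, W), higher = more important
    """
    _, h, w = image.shape
    order = np.argsort(-saliency.ravel())            # most important pixels first
    per_step = max(1, (h * w) // steps)
    masked = image.clone()
    scores = []
    model.eval()
    with torch.no_grad():
        for i in range(steps + 1):
            prob = torch.softmax(model(masked.unsqueeze(0)), dim=1)[0, target_class]
            scores.append(prob.item())
            idx = order[i * per_step:(i + 1) * per_step]
            rows, cols = np.unravel_index(idx, (h, w))
            # Assumed baseline: removed pixels are set to zero.
            masked[:, torch.as_tensor(rows), torch.as_tensor(cols)] = 0.0
    return float(np.trapz(scores, dx=1.0 / steps))
```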
Background: Precise estimation of current and future comorbidities of patients with cardiovascular disease is an important factor in prioritizing continuous physiological monitoring and new [...]. Machine learning (ML) models have shown satisfactory performance in short-term mortality prediction in patients with heart disease, whereas their utility in long-term predictions is [...]. This study aimed to investigate the performance of tree-based ML models on long-term mortality prediction and the effect of two recently introduced biomarkers on long-term [...].
Methods: This study used publicly available data from the Collaboration Center of Health Information Application at the Ministry of Health and Welfare, Taiwan. The collected data were from patients admitted to the cardiac care unit for acute myocardial infarction (AMI) between November 2003 and September [...]; mortality data were collected and analyzed up to December [...]. Medical records were used to gather demographic and clinical data, including age, gender, body mass index, percutaneous coronary intervention status, and comorbidities such as hypertension, dyslipidemia, ST-segment elevation myocardial infarction, and non-ST-segment elevation myocardial infarction. Using the data collected from 139 patients with AMI, from medical and demographic records as well as the two recently introduced biomarkers brachial pre-ejection period (bPEP) and brachial ejection time (bET), we investigated the performance of advanced ensemble tree-based ML algorithms (random forest, AdaBoost, and XGBoost) in predicting all-cause mortality within 14 years. A nested cross-validation was performed to evaluate and compare the performance of the developed models with that of conventional logistic regression (LR) as the baseline [...].
Results: The developed ML models achieved significantly better performance than the baseline LR (C-statistic: 0.80 for random forest, 0.79 for AdaBoost, and 0.78 for XGBoost, vs. 0.77 for LR; P_RF < 0.001, P_AdaBoost < 0.001, [...]
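As a rough illustration of the evaluation setup described above, the sketch below runs a nested cross-validation comparing tree-based ensembles against a logistic-regression baseline on the C-statistic (ROC AUC). The data, hyperparameter grids, and fold counts are placeholders rather than the study's; XGBoost is omitted to keep the sketch dependency-light, but its scikit-learn wrapper would slot into the same dictionary.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score

# Placeholder data: rows = patients, columns = demographics, comorbidities, bPEP, bET.
rng = np.random.default_rng(0)
X = rng.normal(size=(139, 12))
y = rng.integers(0, 2, size=139)                      # 1 = died within follow-up

inner = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)   # tunes hyperparameters
outer = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)   # estimates performance

models = {
    "random forest": GridSearchCV(
        RandomForestClassifier(random_state=0),
        {"n_estimators": [100, 300], "max_depth": [3, 5, None]},
        scoring="roc_auc", cv=inner),
    "AdaBoost": GridSearchCV(
        AdaBoostClassifier(random_state=0),
        {"n_estimators": [50, 200], "learning_rate": [0.1, 1.0]},
        scoring="roc_auc", cv=inner),
    "logistic regression (baseline)": LogisticRegression(max_iter=1000),
}

# The outer loop gives a near-unbiased estimate of the C-statistic while the
# inner loop selects hyperparameters, mirroring a nested cross-validation.
for name, model in models.items():
    auc = cross_val_score(model, X, y, scoring="roc_auc", cv=outer)
    print(f"{name}: C-statistic = {auc.mean():.2f} +/- {auc.std():.2f}")
```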
In smart transportation, intelligent systems avoid potential collisions by predicting the intent of traffic agents, especially pedestrians. Pedestrian intent, defined as future action, e.g., start crossing, can be dep...
Authors: Benkert, Ryan; Prabhushankar, Mohit; AlRegib, Ghassan
OLIVES, The Center for Signal and Information Processing, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30332-0250, United States
This paper considers deep out-of-distribution active learning. In practice, fully trained neural networks interact randomly with out-of-distribution (OOD) inputs and map aberrant samples randomly within the model repr...
ISBN (digital): 9798331513733
ISBN (print): 9798331513740
The challenges posed by nonlinearities in industrial systems necessitate innovative techniques that overcome the limitations of traditional methods such as principal component analysis (PCA). While Kernel Principal Component Analysis (KPCA) offers a robust solution for handling nonlinear data, its computational requirements are a significant issue, especially for large datasets. In this work, we propose a novel technique, namely reduced kernel principal component analysis-based spectral clustering (RKPCA-SpC), to monitor and detect faults in the benchmark Tennessee Eastman process. The suggested approach addresses the complexity associated with KPCA by reducing the data size during the model training phase. This reduction involves retaining only the principal components, preserving informative features, and selecting pertinent samples without compromising the original data's content. The efficacy of the proposed method is evaluated through key performance metrics, including false alarm rate (FAR), missed detection rate (MDR), detection time delay (DTD), and computation time (CT). Additionally, gained execution time (GET), gained storage space (GSP), and loss function (LF) are considered, providing a comprehensive assessment of the developed paradigm's effectiveness. The results demonstrate the promising capabilities of the proposed scheme.
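The sketch below illustrates the general recipe rather than the paper's exact algorithm: spectral clustering picks a small set of representative training samples, KPCA is fitted on that reduced set, and a simple reconstruction-error statistic with a quantile control limit yields FAR and MDR. The data, cluster count, kernel settings, and the choice of reconstruction error as the monitoring statistic are all assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.decomposition import KernelPCA

def reduce_training_set(X_train, n_clusters=20, seed=0):
    """Keep one representative sample per spectral cluster (the one closest to
    the cluster mean), shrinking the kernel matrix that KPCA has to handle."""
    labels = SpectralClustering(n_clusters=n_clusters, random_state=seed,
                                assign_labels="kmeans").fit_predict(X_train)
    keep = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        center = X_train[members].mean(axis=0)
        keep.append(members[np.argmin(np.linalg.norm(X_train[members] - center, axis=1))])
    return X_train[np.array(keep)]

def spe(model, X):
    """Squared reconstruction error in input space, used here as a simple
    stand-in for the usual monitoring charts."""
    return np.sum((X - model.inverse_transform(model.transform(X))) ** 2, axis=1)

# Placeholder data; normal and faulty records would come from the process historian.
rng = np.random.default_rng(0)
X_normal = rng.normal(size=(500, 10))
X_fault = rng.normal(2.0, 1.0, size=(200, 10))

X_red = reduce_training_set(X_normal)
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True).fit(X_red)

threshold = np.quantile(spe(kpca, X_normal), 0.99)    # control limit from normal data
far = np.mean(spe(kpca, X_normal) > threshold)        # false alarm rate
mdr = np.mean(spe(kpca, X_fault) <= threshold)        # missed detection rate
print(f"FAR = {far:.3f}, MDR = {mdr:.3f}")
```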
Daily Activity Recordings for artificial intelligence (DARai, pronounced /Dahr-ree/), is a multimodal, hierarchically annotated dataset constructed to understand human activities in real-world settings. DARai consists...
ISBN (digital): 9798331513733
ISBN (print): 9798331513740
The modern energy landscape is undergoing a seismic shift away from traditional, finite energy sources toward cleaner, renewable alternatives. This change is driven by the restrictions facing traditional sources, which are not only finite but increasingly strained by burgeoning global energy demands from population growth and industrial expansion. Although promising, renewable energy poses complications, particularly its reliance on climatic conditions. An important aspect of addressing these difficulties is effective energy management within distribution systems, which includes forecasting and optimization phases. This research focuses on forecasting using an advanced machine learning (ML) approach. Accurately forecasting renewable energy generation over time is critical for improving energy management. The technique is evaluated using a variety of performance indicators, including Mean Error (ME), Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and the Coefficient of Determination (R²). Empirical studies support the method's usefulness, demonstrating noteworthy performance with low error rates.
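For reference, the sketch below computes the reported error metrics (ME, MAE, RMSE, R²) for a forecast; the synthetic generation series and forecast are placeholders, and the sign convention for ME is an assumption.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def forecast_scores(y_true, y_pred):
    """Point-forecast error metrics listed in the abstract."""
    return {
        "ME":   float(np.mean(y_pred - y_true)),               # signed bias (convention may differ)
        "MAE":  mean_absolute_error(y_true, y_pred),
        "RMSE": float(np.sqrt(mean_squared_error(y_true, y_pred))),
        "R2":   r2_score(y_true, y_pred),
    }

# Placeholder hourly PV generation series and a noisy forecast of it.
t = np.arange(48)
y_true = np.clip(np.sin((t % 24) * np.pi / 24), 0, None) * 100
y_pred = y_true + np.random.default_rng(0).normal(0, 5, size=t.size)
print(forecast_scores(y_true, y_pred))
```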
In this paper, we introduce the FOCAL (Ford-OLIVES Collaboration on Active Learning) dataset, which enables the study of the impact of annotation-cost within a video active learning setting. Annotation-cost refers to the time it takes an annotator to label and quality-assure a given video sequence. A practical motivation for active learning research is to minimize annotation-cost by selectively labeling informative samples that will maximize performance within a given budget constraint. However, previous work in video active learning lacks real-time annotation labels for accurately assessing cost minimization and instead operates under the assumption that annotation-cost scales linearly with the amount of data to annotate. This assumption does not take into account a variety of real-world confounding factors that contribute to a nonlinear cost, such as the effect of an assistive labeling tool and the variety of interactions within a scene, such as occluded objects, weather, and motion of objects. FOCAL addresses this discrepancy by providing real annotation-cost labels for 126 video sequences across 69 unique city scenes with a variety of weather, lighting, and seasonal conditions. These videos feature a wide range of interactions at the intersection of the infrastructure-assisted autonomy and autonomous vehicle communities. We show through a statistical analysis of the FOCAL dataset that cost is correlated with a variety of factors beyond just the length of a video sequence. We also introduce a set of conformal active learning algorithms that take advantage of the sequential structure of video data in order to achieve a better trade-off between annotation-cost and performance while also reducing floating point operation (FLOPs) overhead by at least 77.67%. We show how these approaches better reflect how annotations on videos are done in practice through a sequence selection framework. We further demonstrate the advantage of these approaches by introducing two [...]
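The cost-correlation claim can be probed with a simple rank-correlation analysis, sketched below on synthetic stand-in metadata; the column names (`frames`, `object_count`, `occlusion_ratio`, `annotation_cost_s`) are hypothetical and do not reflect FOCAL's actual schema.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

# Synthetic stand-in for per-sequence metadata; FOCAL's real fields may differ.
rng = np.random.default_rng(0)
n = 126
frames = rng.integers(200, 2000, size=n)              # sequence length proxy
objects = rng.integers(1, 80, size=n)                 # scene complexity proxy
occlusion = rng.random(n)                             # fraction of occluded objects
cost = 0.05 * frames + 4.0 * objects + 300 * occlusion + rng.normal(0, 50, n)

df = pd.DataFrame({"frames": frames, "object_count": objects,
                   "occlusion_ratio": occlusion, "annotation_cost_s": cost})

# Rank correlation of each candidate factor with annotation cost: if cost scaled
# purely with length, only "frames" would show a strong correlation.
for col in ["frames", "object_count", "occlusion_ratio"]:
    rho, p = spearmanr(df[col], df["annotation_cost_s"])
    print(f"{col:>16}: Spearman rho = {rho:+.2f} (p = {p:.1e})")
```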
The efficacy of photovoltaic systems is significantly impacted by electrical production losses attributed to faults. Ensuring the rapid and cost-effective restoration of system efficiency necessitates robust fault detection and diagnosis (FDD) procedures. This study introduces a novel interval-gated recurrent unit (I-GRU) based Bayesian optimization framework for FDD in grid-connected photovoltaic (GCPV) systems. An interval-valued representation is used to address uncertainties inherent in the systems; the GRU is employed for fault classification, while the Bayesian algorithm optimizes its hyperparameters. Addressing uncertainties through the proposed approach enhances monitoring capabilities, mitigating the computational and storage costs associated with sensor uncertainties. The effectiveness of the proposed approach for FDD in GCPV systems is demonstrated on an experimental application.
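A minimal sketch of the overall recipe follows, under several assumptions: intervals are formed as fixed lower/upper bands per sensor (the paper derives them from sensor uncertainty), a small Keras GRU does the fault classification, and Optuna's TPE sampler stands in for the Bayesian hyperparameter optimizer; the data, window length, class count, and search space are placeholders.

```python
import numpy as np
import optuna
import tensorflow as tf

# Placeholder GCPV measurements: (samples, timesteps, sensors) plus fault labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20, 6)).astype("float32")
y = rng.integers(0, 4, size=600)                      # 0 = healthy, 1..3 = fault classes

# Interval-valued representation: stack a lower and an upper bound per sensor
# (a fixed +/- band here; a real system would derive it from sensor uncertainty).
eps = 0.1
X_int = np.concatenate([X - eps, X + eps], axis=-1)   # shape (600, 20, 12)

def build_gru(units, dropout, lr):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=X_int.shape[1:]),
        tf.keras.layers.GRU(units, dropout=dropout),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
                  loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    return model

def objective(trial):
    # Hyperparameters proposed by the optimizer on each trial.
    model = build_gru(units=trial.suggest_int("units", 16, 128),
                      dropout=trial.suggest_float("dropout", 0.0, 0.5),
                      lr=trial.suggest_float("lr", 1e-4, 1e-2, log=True))
    hist = model.fit(X_int, y, validation_split=0.2, epochs=5, verbose=0)
    return max(hist.history["val_accuracy"])          # maximize validation accuracy

study = optuna.create_study(direction="maximize")     # TPE sampler by default
study.optimize(objective, n_trials=10)
print("best hyperparameters:", study.best_params)
```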