Cardiovascular disease (CVD) remains a leading global health challenge due to its high mortality rate and the complexity of early diagnosis, driven by risk factors such as hypertension, high cholesterol, and irregular pulse rate. Traditional diagnostic methods often struggle with the nuanced interplay of these risk factors, making early detection difficult. In this research, we propose a novel artificial intelligence-enabled (AI-enabled) framework for CVD risk prediction that integrates machine learning (ML) with eXplainable AI (XAI) to provide both high-accuracy predictions and transparent, interpretable results. In contrast to existing studies that typically focus on either optimizing ML performance or using XAI separately for local or global explanations, our approach uniquely combines both local and global interpretability using Local Interpretable Model-Agnostic Explanations (LIME) and SHapley Additive exPlanations (SHAP). This dual integration enhances the interpretability of the model and helps clinicians comprehensively understand not just what the model predicts but also why those predictions are made, by identifying the contribution of different risk factors, which is crucial for transparent and informed clinical decision-making. The framework uses ML techniques such as K-nearest neighbors (KNN), gradient boosting, random forest, and decision tree, trained on a cardiovascular dataset. Moreover, the integration of LIME and SHAP provides patient-specific insights alongside global trends, ensuring that clinicians receive comprehensive and actionable information. The experimental results achieve 98% accuracy with the random forest model, with precision, recall, and F1-scores of 97%, 98%, and 98%, respectively. This innovative combination of SHAP and LIME sets a new benchmark in CVD prediction by integrating advanced ML accuracy with robust interpretability, filling a critical gap in existing work. The framework paves the way for more explainable and transparent decision-making in healthcare.
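The SHAP side of the attribution idea can be made concrete with exact Shapley values on a toy linear risk score. The feature names, weights, and baseline below are illustrative placeholders, not the paper's trained model (which applies SHAP's approximations to a random forest):

```python
from itertools import combinations
from math import factorial

# Toy linear risk score over three illustrative CVD features
# (hypothetical names and weights, not the paper's dataset).
weights = {"age": 0.03, "systolic_bp": 0.02, "cholesterol": 0.01}
baseline = {"age": 50.0, "systolic_bp": 120.0, "cholesterol": 200.0}

def predict(x):
    return sum(weights[k] * x[k] for k in weights)

def shapley_values(x):
    """Exact Shapley attributions; 'absent' features are set to the baseline."""
    feats = list(weights)
    n = len(feats)
    phi = {}
    for i in feats:
        others = [f for f in feats if f != i]
        total = 0.0
        for r in range(n):
            for s in combinations(others, r):
                coef = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                with_i = {k: (x[k] if k in s or k == i else baseline[k]) for k in feats}
                without_i = {k: (x[k] if k in s else baseline[k]) for k in feats}
                total += coef * (predict(with_i) - predict(without_i))
        phi[i] = total
    return phi

patient = {"age": 62.0, "systolic_bp": 150.0, "cholesterol": 240.0}
phi = shapley_values(patient)
# For a linear model, phi_i reduces to w_i * (x_i - baseline_i),
# and the attributions sum to predict(patient) - predict(baseline).
```

The brute-force enumeration is exponential in the number of features; libraries such as SHAP exist precisely because tree and kernel approximations make this tractable for real models.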
This study investigates the combined berth allocation problem (BAP) and quay crane allocation problem (QCAP) while considering a multi-quay setting. First, a mixed integer linear programming mathematical model is deve...
In this paper, we analyze a hybrid Heterogeneous Cellular Network (HCNet) framework that deploys millimeter wave (mmWave) small cells coexisting with traditional sub-6 GHz macro cells to achieve improved coverage and high data rates. We consider randomly deployed macro base stations throughout the network, whereas mmWave small base stations (SBSs) are deployed in areas with high user equipment (UE) density. Such user-centric deployment of mmWave SBSs inevitably incurs correlation between UE and SBS locations. We consider a realistic scenario where the UEs are distributed according to a Poisson cluster process, and directional beamforming with line-of-sight and non-line-of-sight transmissions is adopted for mmWave communication. Using tools from stochastic geometry, we develop an analytical framework to analyze various performance metrics in the downlink hybrid HCNets under biased received power association. For UE clustering, we consider the Thomas cluster process and derive expressions for the association probability, coverage probability, area spectral efficiency, and energy efficiency. We also provide Monte Carlo simulation results to validate the accuracy of the derived expressions. Finally, we analyze the impact of mmWave operating frequency, antenna gain, small cell biasing, and BS density to obtain useful engineering insights into the performance of hybrid mmWave HCNets. Our results show that network performance is significantly improved by deploying mmWave SBSs instead of microwave BSs in hot spots.
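A Thomas cluster process of the kind used here for UE clustering is easy to simulate: parents form a Poisson point process, and each parent scatters a Poisson number of daughters with an isotropic Gaussian. The region size, densities, and spread below are arbitrary illustration values, not the paper's parameters:

```python
import random
from math import exp

def poisson_sample(lam, rng):
    """Knuth's method; adequate for small means."""
    L, k, p = exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def thomas_cluster_ues(area, parent_density, mean_daughters, sigma, rng):
    """Drop SBS 'parents' as a PPP on an area x area square, then scatter
    UEs around each parent with an isotropic Gaussian of spread sigma."""
    n_parents = poisson_sample(parent_density * area * area, rng)
    parents = [(rng.uniform(0, area), rng.uniform(0, area)) for _ in range(n_parents)]
    ues = []
    for px, py in parents:
        for _ in range(poisson_sample(mean_daughters, rng)):
            ues.append((rng.gauss(px, sigma), rng.gauss(py, sigma)))
    return parents, ues

rng = random.Random(7)
parents, ues = thomas_cluster_ues(
    area=1000.0, parent_density=1e-5, mean_daughters=5, sigma=30.0, rng=rng)
```

Averaging an SINR indicator over many such realizations is exactly the Monte Carlo validation step the abstract mentions; the correlation between UE and SBS locations arises because UEs are generated from the SBS positions themselves.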
The increasing use of cloud-based devices has brought cybersecurity to a critical point due to unwanted network traffic. Cloud environments pose significant challenges in maintaining privacy and security. Various approaches, such as IDS, have been developed to tackle these issues. However, most conventional Intrusion Detection System (IDS) models struggle with unseen cyberattacks and complex high-dimensional data. This paper therefore introduces a novel distributed, explainable, and heterogeneous transformer-based intrusion detection system, named INTRUMER, which offers balanced accuracy, reliability, and security in cloud settings through multiple modules working together. The traffic captured from cloud devices is first passed to the TC&TM module, in which the Falcon Optimization Algorithm optimizes the feature selection process and the Naïve Bayes algorithm performs the classification of traffic. The selected features are classified further and forwarded to the Heterogeneous Attention Transformer (HAT) module. In this module, the contextual interactions of the network traffic are taken into account to classify it as normal or malicious. The classified results are further analyzed by the Explainable Prevention Module (XPM) to ensure trustworthiness by providing interpretable explanations. Using the explanations from the classifier, emergency alarms are transmitted to nearby IDS modules, servers, and underlying cloud devices to enhance preventive measures. Extensive experiments on the benchmark IDS datasets CICIDS 2017, Honeypots, and NSL-KDD were conducted to demonstrate the efficiency of the INTRUMER model in detecting network traffic with high accuracy for different attack types. It outperforms state-of-the-art approaches, obtaining better performance metrics: 98.7% accuracy, 97.5% precision, 96.3% recall, and 97.8% F1-score. These results validate the robustness and effectiveness of INTRUMER in securing diverse cloud environments against sophisticated cyber threats.
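The Naïve Bayes classification step inside the TC&TM module can be sketched with a plain Gaussian Naive Bayes written from scratch. The two-feature traffic samples and class labels below are invented for illustration, and this generic classifier stands in for whatever configuration the authors used:

```python
from math import log, pi

def fit_gaussian_nb(X, y):
    """Per-class feature means/variances plus class priors."""
    stats = {}
    for cls in set(y):
        rows = [x for x, label in zip(X, y) if label == cls]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        vars_ = [max(sum((v - m) ** 2 for v in col) / len(rows), 1e-9)
                 for col, m in zip(zip(*rows), means)]
        stats[cls] = (means, vars_, len(rows) / len(X))
    return stats

def predict_nb(stats, x):
    """Pick the class with the highest Gaussian log-likelihood + log-prior."""
    best, best_lp = None, float("-inf")
    for cls, (means, vars_, prior) in stats.items():
        lp = log(prior)
        for v, m, var in zip(x, means, vars_):
            lp += -0.5 * log(2 * pi * var) - (v - m) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = cls, lp
    return best

# Hypothetical 2-feature traffic samples (e.g. flow rate, packet size)
X = [[1.0, 1.0], [1.2, 0.9], [5.0, 5.0], [5.1, 4.8]]
y = ["normal", "normal", "attack", "attack"]
stats = fit_gaussian_nb(X, y)
```

In the described pipeline this classifier would run on features already pruned by the Falcon Optimization Algorithm, before the transformer module refines the decision.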
The agriculture industry's production and food quality have been impacted by plant leaf diseases in recent years. Hence, it is vital to have a system that can automatically identify and diagnose diseases at an ini...
The proposed work concerns online social networking (OSN), a type of interactive computer-mediated technology that allows people to share information through virtual networks. The microblogging feature of Twitter makes it prominent in cyberspace (and it is also accessed via the dark web). The work built its datasets by scraping Twitter data (tweets) in Python using the SN-Scrape module and the Twitter4J API in Java to extract social data based on hashtags, which are used to select and access tweets for dataset design from Twitter profiles based on locations, keywords, and hashtags. The experiments involve two datasets. The first dataset has over 1700 tweets with location as the key point (hacking-for-fun data, cyber-violence data, and vulnerability-injector data), whereas the second dataset comprises only 370 tweets with reposting of tweet status as the key point. The method centers on a new system model for analysing Twitter data and detecting terrorist attacks. The weights of susceptible keywords are found using a ternary search, with the Aho-Corasick algorithm (ACA) conducting signature and pattern matching. The ACA performs signature matching to assign weights to the extracted words of a tweet. ML is used to evaluate the Twitter data, classify patterns, and determine behaviour in order to identify whether a person is a terrorist. The Support Vector Machine (SVM) proved to be a more accurate classifier for predicting terrorist attacks than the other classifiers (K-Nearest Neighbour, KNN, and Naïve Bayes, NB). The first dataset shows a KNN accuracy of 98.38% and an SVM accuracy of 98.85%, whereas the second dataset shows a KNN accuracy of 91.68% and an SVM accuracy of 93.97%. The proposed work concludes that the generated weights are classified (cyber-violence, vulnerability injector, and hacking-for-fun) for further feature classification. Machine learning (ML) [KNN and SVM] is used to predict the occurrence and
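The Aho-Corasick signature matching used for keyword weighting can be sketched compactly: build a trie over the keywords, add failure links by BFS, then scan the text once, accumulating the weight of every keyword that ends at each position. The keywords and weights below are invented examples, not the paper's susceptible-keyword list:

```python
from collections import deque

class AhoCorasick:
    def __init__(self, weighted_keywords):
        self.weights = weighted_keywords
        self.goto = [{}]   # trie transitions per node
        self.fail = [0]    # failure links
        self.out = [[]]    # keywords that end at each node
        for word in weighted_keywords:
            node = 0
            for ch in word:
                if ch not in self.goto[node]:
                    self.goto.append({}); self.fail.append(0); self.out.append([])
                    self.goto[node][ch] = len(self.goto) - 1
                node = self.goto[node][ch]
            self.out[node].append(word)
        # BFS to compute failure links and merge outputs
        q = deque(self.goto[0].values())
        while q:
            node = q.popleft()
            for ch, nxt in self.goto[node].items():
                q.append(nxt)
                f = self.fail[node]
                while f and ch not in self.goto[f]:
                    f = self.fail[f]
                self.fail[nxt] = self.goto[f].get(ch, 0)
                self.out[nxt] += self.out[self.fail[nxt]]

    def score(self, text):
        """Single pass over text; returns (total weight, matched keywords)."""
        node, total, hits = 0, 0.0, []
        for ch in text:
            while node and ch not in self.goto[node]:
                node = self.fail[node]
            node = self.goto[node].get(ch, 0)
            for word in self.out[node]:
                total += self.weights[word]
                hits.append(word)
        return total, hits

ac = AhoCorasick({"bomb": 3.0, "attack": 2.0, "hack": 1.0})
total, hits = ac.score("the hacker plans an attack")
```

Because the scan is linear in the tweet length regardless of how many keywords are loaded, this is a natural fit for weighting every extracted word of a tweet in one pass.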
Software-defined networking (SDN) is a trending networking paradigm that focuses on decoupling the control logic from the data plane. This decoupling brings programmability and flexibility to network management by introducing a centralized infrastructure. The complete control logic resides in the controller, which thus becomes the intellectual and most important entity of the SDN infrastructure. Despite these advantages, SDN faces several security issues across its layers that may prevent the growth and global adoption of this groundbreaking technology. Control plane exhaustion and switch buffer overflow are examples of such security issues. Distributed denial-of-service (DDoS) attacks are among the most severe, aiming to exhaust the controller's CPU and halt the functioning of the entire SDN network. Hence, it is necessary to design a quick as well as accurate detection scheme to detect attack traffic at an early stage. In this paper, we present a defense solution to detect and mitigate spoofed flooding DDoS attacks. The proposed defense solution is implemented in the SDN controller. The detection method is based on a statistical measure, the Interquartile Range (IQR). For mitigation, the existing SDN built-in capabilities are utilized. In this work, the experiments are performed considering the spoofed SYN flooding attack. The proposed solution is evaluated using different performance parameters, i.e., detection time, detection accuracy, packet_in messages, and CPU utilization. The experimental results reveal that the proposed defense solution detects and mitigates the attack effectively in different attack scenarios.
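The IQR-based detection idea reduces to a Tukey fence over a window of controller-side statistics: anything above Q3 + k·IQR is flagged. The sample packet_in rates below are made up, and the k = 1.5 multiplier is the conventional default rather than the paper's tuned threshold:

```python
from statistics import quantiles

def iqr_flag(rates, k=1.5):
    """Flag rate samples above the upper Tukey fence Q3 + k * IQR.
    A minimal stand-in for the controller-side detection step."""
    q1, _, q3 = quantiles(rates, n=4, method="inclusive")
    upper = q3 + k * (q3 - q1)
    return upper, [r for r in rates if r > upper]

# Hypothetical per-second packet_in message counts at the controller;
# the last sample mimics a SYN-flood burst.
packet_in_rates = [10, 11, 11, 12, 12, 12, 13, 14, 90]
upper, suspicious = iqr_flag(packet_in_rates)
```

The appeal of the IQR over a mean/standard-deviation threshold is robustness: the quartiles barely move when the attack samples themselves enter the window, so the fence stays anchored to the benign baseline.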
Drug-target interaction (DTI) prediction plays an important role in the process of drug discovery. Most computational methods treat it as a binary prediction problem, determining whether there are connections between drugs and targets while ignoring relation-type information. Considering that the positive or negative effects of DTIs will facilitate the study of the comprehensive mechanisms of multiple drugs on a common target, in this work we model DTIs on signed heterogeneous networks by categorizing the interaction patterns of DTIs and additionally extracting interactions within drug pairs and target protein pairs. We propose signed heterogeneous graph neural networks (SHGNNs) and further put forward an end-to-end framework for signed DTI prediction, called SHGNN-DTI, which not only adapts to signed bipartite networks but can also naturally incorporate auxiliary information from drug-drug interactions (DDIs) and protein-protein interactions (PPIs). For the framework, we solve the message passing and aggregation problem on signed DTI networks and consider different training modes on the whole network consisting of DTIs, DDIs, and PPIs. Experiments are conducted on two datasets extracted from DrugBank and related databases, under different settings of initial inputs, embedding dimensions, and training modes. The prediction results show excellent performance in terms of metric indicators, and the feasibility is further verified by a case study with two drugs on breast cancer.
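Message passing on a signed bipartite DTI network can be illustrated with a single, deliberately simplified propagation step: a node is attracted along positive edges and repelled along negative ones. This is a hand-rolled stand-in for the aggregation idea, not the SHGNN layer itself, and the node names and one-dimensional embeddings are invented:

```python
def signed_propagate(h, pos_nbrs, neg_nbrs):
    """One simplified signed propagation step: add the mean of positive
    neighbors' embeddings, subtract the mean of negative neighbors'."""
    new_h = {}
    for v, vec in h.items():
        agg = list(vec)
        pos = pos_nbrs.get(v, [])
        neg = neg_nbrs.get(v, [])
        for u in pos:
            agg = [a + b / len(pos) for a, b in zip(agg, h[u])]
        for u in neg:
            agg = [a - b / len(neg) for a, b in zip(agg, h[u])]
        new_h[v] = agg
    return new_h

# Toy signed bipartite graph: drug d1 activates target t1, inhibits t2.
h = {"d1": [1.0], "t1": [2.0], "t2": [4.0]}
pos_nbrs = {"d1": ["t1"]}
neg_nbrs = {"d1": ["t2"]}
new_h = signed_propagate(h, pos_nbrs, neg_nbrs)
```

A real layer would add learned weight matrices, nonlinearities, and separate parameter sets per edge type (DTI, DDI, PPI), which is where the heterogeneity of the proposed framework comes in.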
This paper presents a data-driven variable reduction approach to accelerate the computation of large-scale transmission-constrained unit commitment (TCUC). Lagrangian relaxation (LR) and mixed-integer linear programming (MILP) are popular approaches to solving TCUC. However, with many binary unit commitment variables, LR suffers from slow convergence and MILP presents a heavy computation burden. The proposed data-driven variable reduction approach consists of offline and online calculations to accelerate the computational performance of MILP-based large-scale TCUC problems. A database including multiple nodal net load intervals and the corresponding TCUC solutions is first built offline via data-driven and all-scenario-feasible (ASF) approaches, and is then leveraged to efficiently solve new TCUC instances online. The on/off statuses of a considerable number of units can be fixed in the online calculation according to the database, which reduces the computational burden while guaranteeing good solution quality for new TCUC instances. A feasibility proposition is proposed to promptly check the feasibility of new TCUC instances with fixed binary variables, which can be used to dynamically tune the parameters of the binary-variable-fixing strategies and guarantee the existence of feasible UC solutions even when the system structure changes. Numerical tests illustrate the efficiency of the proposed approach.
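The online binary-variable fixing can be sketched as a consensus rule over the offline database: fix a unit-hour status only when every stored solution whose net-load interval covers the new instance agrees on it, and leave it free for the MILP solver otherwise. The data layout below is a guessed-at convenient representation, not the paper's, and the fixing rule is a simplification of the described strategy:

```python
def fix_statuses(database, new_load):
    """database: list of (lo, hi, solution) entries, where solution maps
    each unit name to a tuple of hourly on/off statuses. Returns, per
    unit, the hours whose status can be fixed by unanimous agreement."""
    matching = [sol for lo, hi, sol in database if lo <= new_load <= hi]
    fixed = {}
    if not matching:
        return fixed  # no stored interval covers the new net load: fix nothing
    for unit in matching[0]:
        agreed = {}
        for t in range(len(matching[0][unit])):
            statuses = {sol[unit][t] for sol in matching}
            if len(statuses) == 1:  # every covering solution agrees at hour t
                agreed[t] = statuses.pop()
        if agreed:
            fixed[unit] = agreed
    return fixed

# Two offline entries whose net-load intervals both cover the new load of 180.
database = [
    (100, 200, {"g1": (1, 1), "g2": (0, 1)}),
    (150, 250, {"g1": (1, 1), "g2": (1, 1)}),
]
fixed = fix_statuses(database, 180)
```

The variables left unfixed (here, g2 at hour 0) remain binary in the online MILP, which is where the paper's feasibility proposition would then be applied before solving.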
Remote intelligence in the application of robotics and autonomous systems relies heavily on seamless wireless connections. 5G mobile network technology meets traditional manufacturing enterprises' application requirements for wireless networks based on robot transformation and upgrading, robot interconnection, and remote interactive applications in production. However, many challenging 5G communication issues remain, such as communication protocols that vary across the different robots in a system, and the absence of disruptive changes to the physical layer.