The economy is one of the key determinants of how a person can live their life. In the current economic situation, inflation is widespread, causing the prices of necessities to rise. In order to have a decent life, p...
Data scarcity in low-resource languages can be addressed with word-to-word translations from labeled task data in high-resource languages using bilingual lexicons. However, bilingual lexicons often have limited lexica...
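The mechanism is simple enough to sketch: each source-language token of a labeled high-resource example is looked up in a bilingual lexicon and swapped for its target-language counterpart, with the label carried over unchanged. The toy lexicon and whitespace tokenization below are illustrative assumptions, not the resources used in the work above.

```python
def translate_word_by_word(sentence, lexicon):
    """Map each token to its lexicon entry; keep out-of-lexicon tokens as-is,
    which is exactly where limited lexicon coverage hurts transfer."""
    return " ".join(lexicon.get(tok, tok) for tok in sentence.split())

# Hypothetical English -> target-language lexicon entries.
lexicon = {"good": "bon", "movie": "film"}

# A labeled high-resource example becomes a synthetic low-resource example.
print(translate_word_by_word("a good movie", lexicon))  # -> "a bon film"
```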
Blockchain is a distributed database that multiple parties can maintain and share. This new technology is expected to greatly impact the healthcare industry. It can help address various issues related to patient care....
ISBN (print): 9789819765805
Cardiovascular diseases (CVD) are a leading contributor to illness and death worldwide, underscoring the need for precise predictive models that enable timely intervention. This study investigates deep learning methods, specifically Convolutional Neural Networks (CNN) and Long Short-Term Memory networks (LSTM), for predictive modeling of CVD, and examines the efficacy of three well-known optimization techniques, Adam, RMSprop, and Stochastic Gradient Descent (SGD), within these architectures. Among the CNN-based models, SGD was identified as the optimizer producing the most favorable outcomes for predicting CVD, training the network to superior accuracy, sensitivity, and specificity. LSTM-based models, by contrast, improved most under RMSprop, which strengthened sequence modeling by capturing temporal dependencies within the dataset and thereby enhanced the model's ability to assess CVD risk. These results emphasize the importance of carefully choosing neural network architectures and optimization techniques when constructing predictive models for CVD: tailoring both to the unique attributes of the dataset can substantially improve the precision and dependability of CVD risk evaluations and, in turn, can ultimately lead to ...
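As a concrete illustration of the architecture/optimizer pairings reported above (CNN with SGD, LSTM with RMSprop), the following Keras sketch compiles one model of each kind. The layer sizes, learning rates, and the assumed input of 13 clinical features are placeholders for demonstration, not the authors' configuration.

```python
# Hedged sketch: pair a small 1D-CNN with SGD and an LSTM with RMSprop,
# mirroring the study's reported best pairings. Shapes are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_shape=(13, 1)):
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv1D(32, 3, activation="relu"),
        layers.GlobalMaxPooling1D(),
        layers.Dense(1, activation="sigmoid"),  # binary CVD risk
    ])

def build_lstm(input_shape=(13, 1)):
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.LSTM(32),
        layers.Dense(1, activation="sigmoid"),
    ])

cnn = build_cnn()
cnn.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
            loss="binary_crossentropy",
            metrics=["accuracy"])  # SGD worked best for the CNN in the study

lstm = build_lstm()
lstm.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=1e-3),
             loss="binary_crossentropy",
             metrics=["accuracy"])  # RMSprop worked best for the LSTM
```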
Recent works often assume that Vision-Language Model (VLM) representations are based on visual attributes like shape. However, it is unclear to what extent VLMs prioritize this information to represent concepts. We pr...
Fine Tuning Attribute Weighted Naïve Bayes (FTAWNB) is a reliable modified Naïve Bayes model. Although it achieves high accuracy on ordinal data, the model is sensitive to outliers. To improve ...
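The attribute-weighting idea underlying (FT)AWNB can be sketched generically: each attribute's log-likelihood contribution is scaled by a per-attribute weight before class scores are compared. The sketch below is a plain weighted Gaussian Naïve Bayes, not the FTAWNB fine-tuning procedure itself; in the actual model the weights would come from that tuning step.

```python
# Generic attribute-weighted Gaussian Naive Bayes sketch (weights assumed
# given; FTAWNB's contribution is how those weights are tuned).
import numpy as np

class WeightedGaussianNB:
    def fit(self, X, y, weights):
        self.classes = np.unique(y)
        self.w = np.asarray(weights, dtype=float)   # one weight per attribute
        self.priors = np.array([(y == c).mean() for c in self.classes])
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        return self

    def predict(self, X):
        scores = []
        for x in X:
            # weighted score: log P(c) + sum_j w_j * log N(x_j; mu_cj, var_cj)
            ll = (-0.5 * np.log(2 * np.pi * self.var)
                  - 0.5 * (x - self.mu) ** 2 / self.var)
            scores.append(np.log(self.priors) + (self.w * ll).sum(axis=1))
        return self.classes[np.argmax(scores, axis=1)]

X = np.array([[1., 2.], [1., 1.], [4., 5.], [5., 5.]])
y = np.array([0, 0, 1, 1])
print(WeightedGaussianNB().fit(X, y, weights=[1.0, 0.5]).predict(X))
```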
There are many cases in which money borrowed by debtors is never returned, because the company misjudged the risk of lending. Debtors who cannot repay their debts thus end up as losses on the company's ...
In recent decades, fog computing has played a vital role in executing parallel computational tasks, specifically scientific workflow applications. Compared with cloud data centers, fog computing takes more time to run workflow applications; hence, it is essential to develop effective models for Virtual Machine (VM) allocation and task scheduling in fog computing environments. Effective task scheduling, VM migration, and allocation together optimize the use of computational resources across different fog nodes, ensuring that tasks are executed with minimal energy consumption and reduced strain on resources. In this manuscript, the proposed framework comprises two phases: (i) effective task scheduling using a fractional selectivity approach, and (ii) VM allocation via a proposed algorithm named Fitness Sharing Chaotic Particle Swarm Optimization (FSCPSO). FSCPSO integrates the concepts of chaos theory and fitness sharing to balance global exploration against local exploitation. This balance lets the search cover a wide range of solutions, leading to minimal total cost and makespan in comparison with traditional optimization algorithms. The algorithm's performance is analyzed using six evaluation measures: Load Balancing Level (LBL), Average Resource Utilization (ARU), total cost, makespan, energy consumption, and response time. Relative to conventional optimization algorithms, FSCPSO achieves a higher LBL of 39.12%, an ARU of 58.15%, a minimal total cost of 1175, and a makespan of 85.87 ms, particularly when evaluated on 50 tasks.
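The two ingredients named in the abstract can be sketched schematically: a chaotic (logistic-map) sequence replaces uniform random draws in the velocity update to aid exploration, and fitness sharing inflates the cost of particles in crowded regions to preserve diversity. The toy objective, bounds, and coefficients below are placeholders, not the paper's scheduling model.

```python
# Schematic chaotic PSO with fitness sharing, under assumed parameters.
import numpy as np

def logistic_map(x, r=4.0):
    return r * x * (1.0 - x)          # chaotic sequence in (0, 1)

def shared_fitness(costs, positions, sigma=0.5):
    # Fitness sharing: inflate cost in proportion to how crowded a region is.
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    niche = np.maximum(0.0, 1.0 - d / sigma).sum(axis=1)   # niche count >= 1
    return costs * niche

def fscpso_sketch(cost_fn, dim=4, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = np.random.rand(n, dim)
    vel = np.zeros((n, dim))
    chaos = np.random.rand(n, dim) * 0.99 + 0.005          # seed away from 0/1
    pbest, pbest_cost = pos.copy(), cost_fn(pos)
    gbest = pbest[np.argmin(shared_fitness(pbest_cost, pbest))]
    for _ in range(iters):
        chaos = logistic_map(chaos)                        # chaotic draws
        vel = w * vel + c1 * chaos * (pbest - pos) \
                      + c2 * logistic_map(chaos) * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)                 # stay in bounds
        cost = cost_fn(pos)
        better = cost < pbest_cost
        pbest[better], pbest_cost[better] = pos[better], cost[better]
        gbest = pbest[np.argmin(shared_fitness(pbest_cost, pbest))]
    return gbest, cost_fn(gbest[None, :])[0]

# Toy stand-in for a makespan/cost objective over normalized VM assignments.
best, best_cost = fscpso_sketch(lambda p: ((p - 0.3) ** 2).sum(axis=1))
```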
This research aims to develop a brain tumor detection model utilizing machine learning techniques and a Convolutional Neural Network (CNN). A significant matter to address revolves around early detection and ...
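A minimal CNN classifier in the spirit of such a detection model might look as follows; the 128x128 grayscale input, layer widths, and binary tumor/no-tumor framing are assumptions for illustration, not the study's architecture.

```python
# Hedged sketch of a CNN for brain-MRI slice classification.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),        # grayscale MRI slice
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # tumor present / absent
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```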
Data valuation quantifies the contribution of each data point to the performance of a machine learning model. Existing works typically define the value of data by its improvement of the validation performance of the trained model. However, this approach can be impractical in collaborative machine learning and data marketplaces, since it is difficult for the parties/buyers to agree on a common validation dataset or determine the exact validation distribution a priori. To address this, we propose a distributionally robust data valuation approach that performs data valuation without known/fixed validation distributions. Our approach defines the value of data by its improvement of the distributionally robust generalization error (DRGE), thus providing a worst-case performance guarantee without a known/fixed validation distribution. However, since computing DRGE directly is infeasible, we propose using model deviation as a proxy for the marginal improvement of DRGE (for kernel regression and neural networks) to compute data values. Furthermore, we identify a notion of uniqueness where low uniqueness characterizes low-value data. We empirically demonstrate that our approach outperforms existing data valuation approaches in data selection and data removal tasks on real-world datasets (e.g., housing price prediction, diabetes hospitalization prediction).
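A simplified reading of the model-deviation proxy: score a candidate point by how much adding it shifts the trained model's predictions. The sketch below measures that shift for kernel ridge regression over a reference grid; the RBF kernel, ridge strength, and synthetic data are illustrative choices, not the paper's exact DRGE-based construction.

```python
# Hedged sketch: prediction deviation of kernel ridge regression as a
# stand-in for the paper's model-deviation proxy.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def model_deviation(X, y, x_new, y_new, X_ref, alpha=1e-2):
    base = KernelRidge(alpha=alpha, kernel="rbf").fit(X, y)
    grown = KernelRidge(alpha=alpha, kernel="rbf").fit(
        np.vstack([X, x_new[None, :]]), np.append(y, y_new))
    # Larger deviation = adding the point moved the model more.
    return np.abs(grown.predict(X_ref) - base.predict(X_ref)).mean()

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)
X_ref = np.linspace(-1, 1, 100)[:, None]          # assumed reference inputs
print(model_deviation(X, y, np.array([0.9]), np.sin(2.7), X_ref))
```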