Heart disease has been the leading cause of death in recent decades. Every five minutes, one person dies unexpectedly from heart disease. Researchers from all over the world are assisting doctors in diagnosing heart problems, and machine learning techniques can substantially reduce the number of tests required. The main objective of this paper is to predict heart disease well in advance, minimizing the need for expensive treatment and medicines. The present paper uses some of the finest machine learning approaches, the decision tree and the KNN classifier, for the prediction of heart disease. These algorithms help diagnose cardiac illness without machinery or laboratory tests, identifying the problem at a far lower cost, and they are also widely used in other domains. The dataset was obtained from the Kaggle website and contains records of 303 patients with 76 attributes. The accuracy of the KNN and decision tree algorithms has been compared with the existing work of other researchers and found to be higher in detecting heart disease. These algorithms can certainly help reduce the number of deaths occurring all around the world.
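The abstract does not give the authors' implementation details, but the KNN classifier it names can be sketched in a few lines of plain Python. The feature names and toy values below are hypothetical, purely for illustration; a real pipeline would use the 303-patient Kaggle dataset with scaled attributes.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among the k nearest training points
    (Euclidean distance)."""
    dists = sorted(
        (math.dist(row, x), label) for row, label in zip(train_X, train_y)
    )
    top_k = [label for _, label in dists[:k]]
    return Counter(top_k).most_common(1)[0][0]

# Hypothetical scaled features [age, cholesterol]; 1 = disease, 0 = healthy.
train_X = [[0.2, 0.3], [0.25, 0.35], [0.8, 0.9], [0.85, 0.8]]
train_y = [0, 0, 1, 1]
print(knn_predict(train_X, train_y, [0.9, 0.85], k=3))  # → 1
```

The decision-tree counterpart would follow the same fit/predict shape; in practice both are usually taken from a library such as scikit-learn rather than written by hand.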
The surging demand for elderly care services, propelled by the global population’s demographic shift, necessitates innovative approaches for efficient data management and decision-making. This paper introduces a nove...
Telemetry and monitoring are important means of ensuring the safety of flight tests. To solve the problems of multi-source data stream time synchronization and integrated monitoring in multi-aircraft cooperative flight test scenarios, a multi-data-stream time synchronization processing algorithm is designed, and a scheme for correlated storage and synchronous playback of data and video is proposed. Multidimensional integrated monitoring and online playback are achieved by adopting a componentized design. The system has been applied to multi-aircraft cooperative flight test missions, and the practical results show that the multi-source data time synchronization algorithm is effective: it greatly improves the monitoring experience and effectively supports the multi-aircraft cooperative flight test program.
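The abstract does not describe the synchronization algorithm itself, but the core operation it implies, aligning samples from independent timestamped streams onto a common timeline, can be sketched as a nearest-timestamp merge with a tolerance. The stream contents below are hypothetical placeholders.

```python
import bisect

def align_streams(reference, other, tolerance):
    """For each (timestamp, value) sample in `reference`, attach the
    `other`-stream sample whose timestamp is nearest, or None if no
    sample lies within `tolerance`. Both streams must be time-sorted."""
    other_ts = [t for t, _ in other]
    aligned = []
    for t, val in reference:
        i = bisect.bisect_left(other_ts, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other)]
        best = min(candidates, key=lambda j: abs(other_ts[j] - t), default=None)
        if best is not None and abs(other_ts[best] - t) <= tolerance:
            aligned.append((t, val, other[best][1]))
        else:
            aligned.append((t, val, None))
    return aligned

telemetry = [(0.0, "alt=100"), (1.0, "alt=110"), (2.0, "alt=120")]
video = [(0.05, "frame0"), (0.98, "frame1")]
print(align_streams(telemetry, video, tolerance=0.1))
```

A production system would additionally correct for clock skew between aircraft, which this sketch assumes has already been handled.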
In order to effectively manage and monitor traffic on the roads, intelligent technologies must be able to detect and tally the number of cars on the road. The technology may be used for controlling and monitoring traffic in both urban areas and on highways, especially in times of heavy congestion or bad weather. This has traditionally been accomplished using sensor data and the standard toolkit for image processing. Recently, however, with the help of deep learning-based smart computer vision systems, this process has become highly scalable and dependable. For intelligent traffic monitoring and management of issues like traffic congestion, the information gathered from various surveillance cameras may be used to train models that can recognize and track on-road cars, even in low-light and blurry situations brought on by bad weather. Although they all aim to solve the same problem, vehicle detection algorithms each have to deal with one or two very distinct conditions. Vehicle detection under haze, dust, sandstorms, snow, and rain, in both daytime and nighttime conditions, is the focus of this study. The suggested design takes the CNN architecture as a starting point, but adds a spatial pyramid pooling layer and removes several of the batch normalization layers. During training, the system achieved an average accuracy of 81% and was able to identify even the tiniest vehicles in the scene.
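The spatial pyramid pooling layer mentioned above has a well-defined general form: each pyramid level splits the feature map into an n x n grid and max-pools each cell, so the concatenated output length is fixed regardless of the input resolution. The plain-Python sketch below illustrates that idea on a 2D map; the paper's actual layer would operate on multi-channel tensors inside the network.

```python
def spatial_pyramid_pool(feature_map, levels=(1, 2, 4)):
    """Max-pool a 2D feature map into a fixed-length vector: level n
    splits the map into an n x n grid and keeps the max of each cell,
    so the output length (sum of n*n over levels) is input-size independent."""
    h, w = len(feature_map), len(feature_map[0])
    out = []
    for n in levels:
        for gy in range(n):
            for gx in range(n):
                y0, y1 = gy * h // n, max((gy + 1) * h // n, gy * h // n + 1)
                x0, x1 = gx * w // n, max((gx + 1) * w // n, gx * w // n + 1)
                out.append(max(feature_map[y][x]
                               for y in range(y0, min(y1, h))
                               for x in range(x0, min(x1, w))))
    return out

fm = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
print(len(spatial_pyramid_pool(fm)))  # → 21, i.e. 1 + 4 + 16 cells
```

Because the output length is 21 for any input size, fully connected layers after the pooling stage can accept images of arbitrary resolution.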
Digitalization has resulted in the accumulation of a gigantic amount of data, and at an alarming rate. Processing such a large amount of data is one of the biggest challenges of the modern world. Extracting meaningful information and ensuring the correctness and precision of data are at the core of the digital world's current requirements. The process of preprocessing raw data, extracting meaningful information, and deducing conclusive evidence for decision-making has led to the field known as data mining. Data mining is a framework of patterns and rules aimed at extracting relationships or hidden information from enormous databases. The paper targets the healthcare sector and the role of data mining in it. It also presents the challenges pertaining to the healthcare industry. Current trends and future scope are also illustrated.
Although the building of quantum computers has made rapid progress in recent years, noise is still the main challenge for any application seeking to leverage the power of quantum computing. Existing works addressing noise in quantum devices propose noise reduction when deploying a quantum algorithm to a specific quantum computer. The reproducibility of quantum algorithms has been questioned because noise levels vary across quantum computers. Importantly, existing works largely ignore the fact that the noise of a quantum device also varies over time, so even reproducing results on the same hardware becomes a problem. We analyze the reproducibility of quantum machine learning (QML) algorithms based on daily model training and execution data collection. Our analysis shows a correlation between our QML models' test accuracy and the quantum computer hardware's calibration features. We also demonstrate that noisy simulators of quantum computers are not a reliable tool for quantum machine learning applications.
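The correlation analysis described here can be sketched with a plain Pearson coefficient between a daily calibration feature and the model's test accuracy. The numbers below are hypothetical placeholders, not the paper's data; they merely show the shape of the computation.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily records: average gate error (a calibration feature)
# against the QML model's test accuracy on that day.
gate_error = [0.010, 0.012, 0.015, 0.020, 0.025]
test_acc = [0.84, 0.83, 0.79, 0.74, 0.70]
print(round(pearson(gate_error, test_acc), 3))
```

A strongly negative coefficient on such data would mirror the paper's finding that test accuracy tracks hardware calibration quality over time.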
ISBN:
(digital) 9798350371406
ISBN:
(print) 9798350371413
By evaluating received data packets, network traffic classification (NTC) identifies distinct categories of applications or traffic data. It is a significant technique in today's communication networks. Data input, preprocessing, feature extraction, classification, and performance analysis are all stages of the network traffic classification process. Rapid improvements in machine learning have pushed the use of learning approaches to categorize traffic over a network. Even in consistently collected datasets, the inherent properties of internet networks cause uneven class distributions. This phenomenon, known as class imbalance, is gaining attention in a variety of fields of study. Various network traffic classification and data balancing algorithms are discussed in this study.
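The survey does not single out one balancing algorithm, but the simplest of the family it covers, random oversampling of minority classes, can be sketched in plain Python. The flow features and class names below are hypothetical.

```python
import random
from collections import Counter

def random_oversample(samples, labels, seed=0):
    """Duplicate minority-class samples at random until every class
    has as many samples as the largest class."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    out_x, out_y = list(samples), list(labels)
    for cls, count in counts.items():
        pool = [x for x, y in zip(samples, labels) if y == cls]
        for _ in range(target - count):
            out_x.append(rng.choice(pool))
            out_y.append(cls)
    return out_x, out_y

# Hypothetical imbalanced traffic dataset: 6 "web" flows vs. 2 "voip" flows.
X = [[i] for i in range(8)]
y = ["web"] * 6 + ["voip"] * 2
Xb, yb = random_oversample(X, y)
print(Counter(yb))  # both classes now have 6 samples
```

More sophisticated methods discussed in such surveys (e.g. SMOTE) synthesize new minority samples rather than duplicating existing ones, which reduces overfitting to repeated flows.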
Inverse protein folding is a fundamental task in computational protein design, which aims to design protein sequences that fold into the desired backbone structures. While the development of machine learning algorithm...
In recent years, Machine Learning (ML) and Deep Learning (DL) applications have become increasingly important in agriculture, particularly in the classification of fruits based on image data. Fruit recognition and categorization have attracted a lot of attention from researchers. Because of the diversity and heterogeneity of fruits, this is a difficult challenge. To address it, a Stacked Bi-LSTM image classification approach is proposed to enhance the accuracy of fruit image classification. The process includes feature extraction by a Convolutional Neural Network (CNN), followed by feature selection using a Recurrent Neural Network (RNN). The proposed Stacked Bi-LSTM model is then used to classify fruit images, resulting in enhanced accuracy. As indicated by evaluation parameters such as recall, precision, and F-measure, the obtained results outperform previous algorithms. The analyzed outcomes show that the Stacked Bi-LSTM model achieves a precision of 85.4%, a recall of 88.3%, and an F1-score of 83.9%, demonstrating greater performance compared to existing approaches.
ISBN:
(digital) 9789819970933
ISBN:
(print) 9789819970926
Cloud computing has emerged as a highly innovative technology, offering various benefits to users. To ensure the security of user data, cloud storage schemes have been introduced, aiming to protect sensitive information from unauthorized access. In particular, the sharing of personal health records (PHR) is gaining prominence as a method for securely sharing healthcare data among users on cloud servers. The use of fuzzy techniques plays a crucial role in transforming original data into an encrypted form, known as ciphertext, which is challenging for unauthorized individuals to comprehend. This technique enhances the confidentiality of data, ensuring that only authorized parties can access and understand it. However, while cloud services provide a convenient platform for data sharing, they often lack efficiency in terms of data sharing capabilities. To address these challenges, a novel approach called fuzzy identity biometric encryption (FIBE) is introduced for PHR management. FIBE combines the benefits of fuzzy techniques and biometric authentication to achieve both high security levels and user convenience simultaneously. This approach enables authorized users to control access to PHR data and ensures secure data sharing within a cloud environment. By integrating biometric authentication, FIBE enhances the security of PHR systems, as biometric characteristics are unique to each individual, making it difficult for unauthorized users to gain access. Moreover, the approach improves user convenience by eliminating the need to remember complex passwords or use traditional authentication methods. In conclusion, the utilization of fuzzy identity biometric encryption (FIBE) in PHR systems offers enhanced security and efficient data sharing in the cloud. This approach combines the advantages of fuzzy techniques and biometric authentication, providing authorized users with control over data access while maintaining a high level of security and user convenience.
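The abstract does not specify FIBE's cryptographic construction, but the "fuzzy" property it relies on, tolerating noise in a biometric reading, has a standard core idea in fuzzy identity-based encryption: access is permitted when the presented attribute set overlaps the enrolled identity in at least a threshold number of attributes. The sketch below illustrates only that matching rule with hypothetical attribute names; it performs no actual encryption.

```python
def fuzzy_identity_match(enrolled, presented, threshold):
    """Core fuzzy-identity test: grant access when the presented biometric
    attribute set overlaps the enrolled identity in at least `threshold`
    attributes, tolerating noisy biometric readings."""
    return len(set(enrolled) & set(presented)) >= threshold

# Hypothetical fingerprint minutiae: one feature differs between
# enrollment and a later (noisy) scan, yet access is still granted.
enrolled = {"minutia_a", "minutia_b", "minutia_c", "minutia_d"}
noisy_scan = {"minutia_a", "minutia_b", "minutia_d", "minutia_x"}
print(fuzzy_identity_match(enrolled, noisy_scan, threshold=3))  # → True
```

In a real fuzzy-IBE scheme this threshold test is enforced cryptographically (decryption fails below the threshold) rather than by a boolean check, so a compromised server cannot simply skip it.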