This paper summarizes state-of-the-art results on data series processing, with emphasis on parallel and distributed data series indexes that exploit the computational power of modern computing platforms. The paper ...
This paper presents a low signal-to-noise ratio (SNR), real-time antenna array diagnosis method that achieves a remarkable reduction in measurements via smart electromagnetic (EM) sensing with learnable data acquisition and processing. Previous techniques such as matrix inversion, exhaustive search, and genetic algorithms are generally time-consuming in data acquisition and/or require complex reconstruction algorithms for successful data post-processing. These limitations make them inefficient and ineffective for real-time array diagnosis. To address these shortcomings, we introduce EM sensing that couples data-driven learnable data acquisition with a data-driven learned data-processing pipeline. The measurement technique is thus jointly learned with a matching post-processing stage tailored to the diagnostic operation, enabling real-time and accurate antenna array diagnosis with a remarkable reduction in measurements at low SNR. We illustrate the effectiveness of the developed method on an HFSS-based 10 × 10 waveguide array under a practical noise scenario. The results show that the developed approach remains efficient even at low SNR.
Federated learning (FL) enables collaborative model training across clients while preserving data privacy. However, FL faces the challenge of data heterogeneity, which leads to biased local models that drift from the global model during optimization; existing FL algorithms such as Federated Averaging (FedAvg) suffer from this issue. To address this problem, we propose a novel approach called multi-branch prototype federated learning (FedMBP). FedMBP creates auxiliary branches within each local model to integrate different levels of local and global prototypes, preventing local model drift by aligning local prototypes with global ones. We also introduce a mixed cross-entropy loss on the auxiliary branches to effectively transfer global prototype knowledge to local models. We conduct experiments on three publicly available datasets spanning natural and medical image domains, which demonstrate that FedMBP outperforms existing FL algorithms and achieves superior model performance.
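The prototype-alignment idea can be sketched as follows. This is a minimal illustration under the assumption that a prototype is the mean embedding of a class; FedMBP's actual multi-branch design and mixed cross-entropy are more elaborate, and the function names here are invented.

```python
# Sketch of prototype alignment: each client computes per-class mean
# embeddings (prototypes) and adds a penalty for drifting away from the
# server's global prototypes.

def class_prototypes(embeddings, labels):
    """Mean embedding per class -- the 'prototype' of that class."""
    sums, counts = {}, {}
    for emb, lab in zip(embeddings, labels):
        acc = sums.setdefault(lab, [0.0] * len(emb))
        for i, v in enumerate(emb):
            acc[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def alignment_loss(local_protos, global_protos):
    """Squared distance between local and global prototypes; adding this
    to the task loss discourages local-model drift."""
    loss = 0.0
    for lab, lp in local_protos.items():
        gp = global_protos[lab]
        loss += sum((a - b) ** 2 for a, b in zip(lp, gp))
    return loss
```

In a full FL round, each client would minimize its task loss plus this alignment term, and the server would aggregate both models and prototypes.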
The next wave of scientific discovery is predicated on unleashing beyond-exascale simulation capabilities using in-memory computing. Path-based computing is a promising in-memory logic style for accelerating Boolean logic with deterministic precision. However, existing studies on path-based computing are limited to executing small combinational circuits. In this paper, we propose a framework called PSYS to accelerate data-intensive scientific computing applications using path-based in-memory systolic arrays. The approach leverages path-based computing for multiplying known constants by an unknown operand, which substantially reduces computational complexity compared with general-purpose multiplication of two unknown operands. The systolic arrays minimize data movement by storing the matrix elements in non-volatile memory and performing the processing in place. The framework decomposes unstructured computations onto the systolic arrays while accounting for the non-regular computational patterns of the applications. Our experimental evaluations employ applications from the domains of engineering, physics, and mathematics. The experimental results demonstrate that, compared with the state-of-the-art, the PSYS framework improves energy consumption and latency by factors of 101× and 23×, respectively.
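Why multiplying by a *known* constant is cheaper than general multiplication can be shown with a shift-and-add sketch. This is only an arithmetic analogy for the saving PSYS exploits (the actual work is done by resistive paths in memory, not by a CPU); the function names are invented.

```python
# When one operand is a known constant, its binary decomposition can be
# "compiled" once; multiplication then needs only one shifted add per set
# bit, instead of a general-purpose multiplier.

def compile_constant(c):
    """Precompute the set-bit positions of a known constant. In a
    path-based system this happens once, when the constant matrix is
    written into non-volatile memory."""
    return [b for b in range(c.bit_length()) if (c >> b) & 1]

def const_multiply(x, schedule):
    """Multiply an unknown operand by the precompiled constant using
    only shifts and adds."""
    return sum(x << b for b in schedule)
```

For example, the constant 11 (binary 1011) compiles to the schedule [0, 1, 3], so multiplying by it costs three shifted adds regardless of the other operand.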
Data mining (DM) and soft computing (SC) are vital computational approaches that offer flexible agricultural data-processing systems for solving farmers' problems. Recently, soft computing has emerged as a powerful technique for solving and analyzing complex real-world problems. This article presents an approach to smart crop prediction through DM and SC in the field of agricultural quality crop prediction. A five-level framework is proposed: 1) collection of data from different repositories, 2) pre-processing of the data, 3) selection of an appropriate classifier, 4) prediction and estimation, and 5) plotting of AUC and ROC curves. The proposed method focuses on analyzing agricultural yield, soil suitability for crops, and required rainfall based on the chemical properties of the soil. Agricultural data analysis and cataloging is among the best applications of new computing tools such as soft computing and machine learning (ML), and has become an active research area because of the massive growth of farming data. DM and SC approaches provide effective answers for this type of applied research. ML is a practical tool for studying multiple learners and combining their assessments to achieve greater forecasting accuracy. In this investigation, we summarize ML methods that farmers and agricultural scientists can apply as essential tools for the timely prediction of crop production.
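The first four levels of the framework can be sketched as a simple pipeline. This is a hedged skeleton only: the stage internals (classifier choice, AUC/ROC plotting) are placeholders, and the field names (`ph`, `crop`) are invented example features, not the article's dataset.

```python
# Skeleton of the five-level framework; level 5 (AUC/ROC plotting) is
# omitted since it only visualizes the results of level 4.

def collect(sources):
    """1) Collect records from different repositories into one list."""
    return [row for src in sources for row in src]

def preprocess(rows):
    """2) Pre-process: drop records with missing values."""
    return [r for r in rows if all(v is not None for v in r.values())]

def select_classifier(rows):
    """3) Classifier selection -- stand-in: a majority-class rule."""
    counts = {}
    for r in rows:
        counts[r["crop"]] = counts.get(r["crop"], 0) + 1
    majority = max(counts, key=counts.get)
    return lambda features: majority

def predict(model, features):
    """4) Prediction and estimation for a new sample."""
    return model(features)
```

A real instantiation would replace the majority-class stand-in at level 3 with a trained DM/SC classifier chosen by cross-validated comparison.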
Our daily work is becoming easier and more convenient with the development of artificial intelligence (AI), which helps conserve users' time and energy. The drastic growth in technology encourages brainstorming new concepts and implementing those ideas. In supermarkets, a cashier-less, stress-free environment saves customers time and effort, and this project applies that idea: products are identified by AI, and the cart is automated and managed entirely by a computer. This can be implemented using computer vision together with deep learning algorithms such as the YOLO algorithm.
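The cart-management side of such a system can be sketched independently of the detector. In this minimal illustration the YOLO detections are assumed and mocked as `(label, confidence)` pairs, and the class and price names are invented.

```python
# Sketch of an automated cart driven by detector output: an item is added
# only when the detection confidence clears a threshold, and the running
# total is computed from a price table.

class SmartCart:
    def __init__(self, prices, threshold=0.5):
        self.prices = prices          # product label -> unit price
        self.threshold = threshold    # minimum detection confidence
        self.items = {}               # product label -> quantity

    def on_detection(self, label, confidence):
        """Add a product only for confident detections of known items."""
        if confidence >= self.threshold and label in self.prices:
            self.items[label] = self.items.get(label, 0) + 1

    def total(self):
        """Amount to charge when the customer leaves the store."""
        return sum(self.prices[p] * n for p, n in self.items.items())
```

In a deployed system, `on_detection` would be fed by the vision model's per-frame outputs, with extra logic for removals and duplicate-frame suppression.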
The practice of conveying medical records to a third-party cloud service (TCS) for backup has become increasingly popular in recent decades as the healthcare Internet of Things has expanded. However, several existing public-key authenticated encryption with keyword search (PAEKS) schemes have security issues when used in many environments. In this article, we propose a novel, efficient, and effective PAEKS scheme. The scheme uses the Diffie-Hellman key-agreement concept to generate a secret key known only to the receiver and the sender. We show that our scheme is secure against inside keyword guessing attacks in several environments under the computational Diffie-Hellman assumption. The analysis shows that the proposed method achieves better results in terms of accuracy (96.8%), precision (95.2%), recall (95.8%), F1-score (95.5%), memory utilization (80 MB), and execution time (0.7 s).
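The Diffie-Hellman exchange underlying the shared sender-receiver secret can be sketched as follows. This is only the textbook key-agreement step, not the paper's PAEKS construction, and the group parameters are illustratively tiny; real schemes use large groups or elliptic curves.

```python
# Toy Diffie-Hellman key agreement: both parties derive the same secret
# without ever transmitting it, which is what lets a PAEKS scheme bind
# keyword ciphertexts to a specific sender-receiver pair.

P, G = 467, 2                      # small public prime and generator (toy)

def keypair(private):
    """Return (private, public) where public = G^private mod P."""
    return private, pow(G, private, P)

def shared_secret(my_private, their_public):
    """Each side raises the other's public value to its own private key."""
    return pow(their_public, my_private, P)

sender_sk, sender_pk = keypair(153)
receiver_sk, receiver_pk = keypair(197)

# (G^a)^b == (G^b)^a mod P, so both sides hold the same secret.
assert shared_secret(sender_sk, receiver_pk) == shared_secret(receiver_sk, sender_pk)
```

An inside keyword guessing attacker (e.g. the cloud server) lacks both private keys, so it cannot recompute this secret and test candidate keywords against the ciphertexts.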
With the exponential growth and rapid development in the fields of deep learning and neural networks, chatbots have gained a lot of popularity and have become a proven, efficient tool for interacting with and providing services to users. Healthcare is one of the most promising fields where chatbots can be used more efficiently. This has become important especially in the current medical landscape, where there is a shortage of doctors and patients often have to wait long periods before getting any medical guidance. By using the power of transformer models and machine learning algorithms, chatbots can help patients with personalized diagnoses and treatment recommendations efficiently and conveniently. This helps patients access medical services anywhere, at any point in time. This paper proposes a well-planned systematic architecture for a medical chatbot that utilizes the potential of transformers, classification algorithms, and machine learning models. The architecture includes three main components: a Naïve Bayes classifier, a binary tree classifier along with a support vector classifier, and a sequence-to-sequence model. These algorithms are used to classify symptoms and determine the severity of a medical condition, so as to provide patients with accurate medical diagnoses and treatment recommendations. Overall, the proposed architecture is designed to bridge the gap between doctors and patients by providing immediate access to medical advice, making it a promising tool for improving the quality and accessibility of healthcare services.
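The Naïve Bayes symptom-classification component can be sketched in pure Python. This is a minimal illustration with Laplace smoothing; the training pairs below are invented placeholders, not the paper's dataset, and the function names are my own.

```python
import math
from collections import Counter, defaultdict

# Toy symptom -> condition training data (invented for illustration).
TRAIN = [
    ("fever cough fatigue", "flu"),
    ("fever cough", "flu"),
    ("sneezing runny nose", "cold"),
    ("runny nose fatigue", "cold"),
]

def fit(pairs):
    """Count class frequencies and per-class word frequencies."""
    class_counts = Counter(lab for _, lab in pairs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, lab in pairs:
        for w in text.split():
            word_counts[lab][w] += 1
            vocab.add(w)
    return class_counts, word_counts, vocab

def classify(text, model):
    """Pick the class maximizing log P(class) + sum log P(word|class),
    with add-one (Laplace) smoothing for unseen words."""
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for lab, n in class_counts.items():
        score = math.log(n / total)
        denom = sum(word_counts[lab].values()) + len(vocab)
        for w in text.split():
            score += math.log((word_counts[lab][w] + 1) / denom)
        if score > best_score:
            best, best_score = lab, score
    return best
```

In the proposed architecture this stage would feed its predicted condition into the severity classifiers and the sequence-to-sequence response model.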
The growing variety of useful data sources, the creation of numerous machine learning methods, and the development of the web of things and advanced analytics equip people and policy makers with new tools for predictive services. An interconnected system can also help reduce building costs and enable new forms of facilities; a case in point is an applied big data analysis framework for the Internet of Things and the smart grid in web applications. This paper presents a summary and description of big data and the Internet of Things, and provides a comparative analysis of audio and crowd video content in an intelligent-grid case study that illustrates the high standards of a big data analysis framework. The evaluation covers the processing, analysis, and reporting needed to gather, understand, and optimize learner data in the contexts and conditions in which it is generated.
Sentiment analysis, an essential task in natural language processing, plays a pivotal role in understanding sentiment and opinions expressed in textual data. However, with the exponential growth of social media and online platforms, the sheer volume of textual data presents challenges for efficient processing. Traditional approaches struggle to cope with the increased data size, necessitating the adoption of big data processing techniques. This study presents a comparative performance analysis of sentiment analysis, evaluating the utilization of a big data processing framework. The study compares three machine learning algorithms for sentiment analysis with and without the implementation of big data processing techniques, focusing on model training efficiency. Additionally, two textual feature extraction techniques are examined to assess their impact on the results. Evaluation of the models' performance is based on the average execution time for training. The study's findings indicate that SparkML's Random Forest significantly outperforms the traditional scikit-learn Random Forest in terms of training time.
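One common textual feature-extraction technique for such pipelines is TF-IDF, sketched below in pure Python. This is only a plausible example of the kind of technique compared; the abstract does not name the two techniques examined, so treating TF-IDF as one of them is an assumption.

```python
import math
from collections import Counter

def tfidf(docs):
    """Return per-document {word: tf-idf weight} dictionaries.
    tf = term frequency within the document; idf = log(N / document
    frequency), down-weighting words that appear in every document."""
    n = len(docs)
    tokenized = [doc.split() for doc in docs]
    df = Counter(w for toks in tokenized for w in set(toks))
    features = []
    for toks in tokenized:
        tf = Counter(toks)
        features.append({
            w: (tf[w] / len(toks)) * math.log(n / df[w])
            for w in tf
        })
    return features
```

A word appearing in every document (e.g. "movie" in a corpus of movie reviews) gets weight zero, while discriminative words like "good" or "bad" keep positive weight; both SparkML and scikit-learn ship distributed or vectorized equivalents of this computation.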