This study examines the use of experimental designs, specifically full and fractional factorial designs, for predicting Alzheimer’s disease with fewer variables. The full factorial design systematically investigates ...
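For intuition only (not the study's actual protocol), the short sketch below enumerates a two-level full factorial over four hypothetical risk factors and a half-fraction of it generated by D = A*B*C; the factor names are invented for illustration.

# Minimal sketch of two-level full vs. fractional factorial designs;
# the factor names are hypothetical, not the study's variables.
from itertools import product

factors = ["age_group", "apoe4", "education", "activity"]   # assumed names

# Full factorial: every combination of the low/high levels (-1, +1), 2^4 = 16 runs.
full = list(product([-1, +1], repeat=len(factors)))

# Half-fraction 2^(4-1): set the last factor by the generator D = A*B*C,
# keeping main effects estimable with only 8 runs.
half = [row + (row[0] * row[1] * row[2],) for row in product([-1, +1], repeat=3)]

print(len(full), "full-factorial runs vs.", len(half), "fractional runs")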
Scalability and information privacy are vital for training and deploying large-scale deep learning models. Federated learning trains models on private data by aggregating weights from various devices and taking advantage of the device-agnostic environment of web browsers. However, relying on a main central server for browser-based federated systems can limit scalability and interfere with the training process as client numbers grow. Furthermore, information about the training dataset can potentially be extracted from the distributed weights, reducing the privacy of the local data used for training. In this research paper, we aim to investigate the challenges of scalability and data privacy to increase the efficiency of distributed training systems. As a result, we propose a web-federated learning exchange (WebFLex) framework, which intends to improve the decentralization of the federated learning process. WebFLex is additionally designed to secure distributed and scalable federated learning systems that operate in web browsers across heterogeneous devices. Specifically, WebFLex uses peer-to-peer interactions and secure weight exchanges over browser-to-browser web real-time communication (WebRTC), effectively removing the need for a main central server. WebFLex has been evaluated in various setups on the MNIST dataset. The results show WebFLex's ability to improve the scalability of federated learning systems, allowing a smooth increase in the number of participating devices without central data aggregation. In addition, WebFLex can maintain a robust federated learning procedure even when faced with device disconnections and network instability. Finally, it improves data privacy by adding artificial noise, which achieves an appropriate balance between accuracy and privacy preservation.
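As a rough Python sketch of the two ingredients highlighted above, decentralized weight aggregation and artificial noise, the toy functions below average peer weight vectors and perturb a local update before it is shared; the noise level and vector sizes are assumptions, and this is not the WebFLex implementation, which runs in browsers over WebRTC.

import numpy as np

def noisy_local_update(weights, noise_std=0.01, rng=np.random.default_rng(0)):
    # Add artificial Gaussian noise before sharing, trading a little accuracy
    # for privacy of the local training data (illustrative noise level).
    return weights + rng.normal(0.0, noise_std, size=weights.shape)

def peer_aggregate(peer_weights):
    # Decentralized averaging: each client averages the weight vectors it
    # received from its peers instead of relying on a central server.
    return np.mean(np.stack(peer_weights), axis=0)

# Toy usage: three peers share noisy versions of their local weights.
local = [np.ones(4) * k for k in (1.0, 2.0, 3.0)]
shared = [noisy_local_update(w) for w in local]
print(peer_aggregate(shared))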
In a crowd density estimation dataset, annotating crowd locations is an extremely laborious task, and the location annotations are not used in the evaluation process. In this paper, we aim to reduce the annotation cost of crowd datasets and propose a crowd density estimation method based on weakly-supervised learning: in the absence of crowd position supervision, it directly estimates the crowd count by using only the number of pedestrians in the image as the supervision signal. For this purpose, we design a new training method that exploits the correlation between global and local image features through incremental learning to train the network. Specifically, we design a parent-child network (PC-Net) focusing on the global and local image respectively, and propose a linear feature calibration structure to train the PC-Net simultaneously: the child network learns feature transfer factors and feature bias weights and uses them to linearly calibrate the features extracted from the parent network, improving the convergence of the network by using local features hidden in the crowd images. In addition, we use the pyramid vision transformer as the backbone of the PC-Net to extract crowd features at different levels, and design a global-local feature loss function (L2). We combine it with a crowd counting loss (LC) to enhance the sensitivity of the network to crowd features during training, which effectively improves the accuracy of crowd density estimation. The experimental results show that PC-Net significantly reduces the gap between fully-supervised and weakly-supervised crowd density estimation, and outperforms the comparison methods on five datasets: ShanghaiTech Part A, ShanghaiTech Part B, UCF_CC_50, UCF_QNRF, and JHU-CROWD++.
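A minimal PyTorch-style sketch of the linear feature calibration idea, assuming the child network outputs a per-channel transfer factor and bias that rescale the parent features; the tensor sizes and loss weighting below are illustrative, not PC-Net's actual configuration.

import torch

def calibrate(parent_feats, transfer, bias):
    # Child-predicted transfer factors and bias weights linearly recalibrate
    # the parent network's features (broadcast over spatial positions).
    return parent_feats * transfer.unsqueeze(-1).unsqueeze(-1) + bias.unsqueeze(-1).unsqueeze(-1)

# Toy tensors: batch of 2, 8 channels, 16x16 feature maps (illustrative sizes).
parent = torch.randn(2, 8, 16, 16)
transfer = torch.ones(2, 8)
bias = torch.zeros(2, 8)
calibrated = calibrate(parent, transfer, bias)

# Combined objective: a global-local feature term (L2) plus a counting term (LC).
pred_count, true_count = torch.tensor([37.0, 52.0]), torch.tensor([40.0, 50.0])
l2 = torch.mean((calibrated - parent) ** 2)          # stand-in for the feature loss
lc = torch.mean(torch.abs(pred_count - true_count))  # count regression loss
loss = l2 + 0.1 * lc                                 # weighting is an assumption
print(float(loss))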
Heart monitoring improves quality of life. Electrocardiograms (ECGs or EKGs) detect heart problems. Machine learning algorithms can support several ECG diagnosis processing methods. The first method uses raw ECG and time-series data. The second method classifies the ECG by patient characteristics. The third technique translates ECG impulses into Q-wave, R-wave, and S-wave (QRS) features carrying richer information. Since ECG signals vary naturally between humans and activities, we combine the three feature selection methods to improve classification accuracy and robustness. Experiments using all three approaches together have not been examined until now. Numerous researchers have found that Machine Learning (ML) techniques can improve ECG classification. This study compares popular machine learning techniques for evaluating ECG features. Four algorithms, Support Vector Machine (SVM), Decision Tree, Naive Bayes, and Neural Network, are compared on classification results. QRS features plus prior knowledge give the highest accuracy (99%) among the four ML algorithms, whereas raw characteristics failed to identify signals without chaos features. With 99.8% classification accuracy, the Decision Tree technique outperformed all previous experiments.
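A hedged scikit-learn sketch of the comparison protocol (identical features and split, four classifiers, accuracy as the metric); the synthetic data stands in for real ECG features, and the resulting scores have no relation to the 99% and 99.8% figures reported above.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for extracted ECG/QRS features (not real patient data).
X, y = make_classification(n_samples=500, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM": SVC(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Neural Network": MLPClassifier(max_iter=1000, random_state=0),
}
for name, model in models.items():
    # Identical split for every algorithm so accuracies are directly comparable.
    print(name, model.fit(X_tr, y_tr).score(X_te, y_te))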
Effective management of electricity consumption (EC) in smart buildings (SBs) is crucial for optimizing operational efficiency, reducing costs, and ensuring sustainable resource utilization. Accurate EC prediction enabl...
Dear Editor,
The distributed constraint optimization problems (DCOPs) [1]-[3] provide an efficient model for solving cooperative problems in multi-agent systems, and have been successfully applied to model real-world problems such as distributed scheduling [4], sensor network management [5], [6], multi-robot coordination [7], and the smart grid [8]. However, DCOPs are not well suited to problems with continuous variables and constraint costs in functional form, such as target-tracking sensor orientation [9], air and ground cooperative surveillance [10], and sensor network coverage [11].
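For readers unfamiliar with the model, the toy snippet below brute-forces a two-agent DCOP with small discrete domains (the agents, domains, and cost function are invented); it also hints at why continuous variables with functional constraint costs do not fit this enumeration scheme.

from itertools import product

# Toy DCOP: two agents each pick a value from a small discrete domain, and a
# shared constraint assigns a cost to every joint assignment (values invented).
domains = {"a1": [0, 1, 2], "a2": [0, 1]}
def cost(x1, x2):
    # Illustrative constraint cost in tabular/closed form over discrete values.
    return (x1 - x2) ** 2 + x2

best = min(product(domains["a1"], domains["a2"]), key=lambda v: cost(*v))
print("best joint assignment:", best, "cost:", cost(*best))
# With continuous domains the joint space cannot be enumerated like this,
# which is the limitation the letter points out for classical DCOP algorithms.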
The development of Internet of Things (IoT) technology is leading to a new era of smart applications such as smart transportation, smart buildings, and smart healthcare. Together, these applications act as the building blocks of IoT-enabled smart cities. The high volume and high velocity of data generated by various smart city applications are sent to flexible and efficient cloud computing resources for processing. However, this incurs high computation latency because the cloud platform is remote. Edge computing, which brings the computation close to the data source, is introduced to overcome this issue. In an IoT-enabled smart city environment, one of the main concerns is to consume the least amount of energy while executing tasks that satisfy the delay constraint. Efficient resource allocation at the edge helps address this concern. In this paper, an energy and delay minimization problem in a smart city environment is formulated as a bi-objective edge resource allocation problem. First, we present a three-layer network architecture for IoT-enabled smart cities. Then, we design a learning automata-based edge resource allocation approach built on this three-layer architecture to solve the bi-objective minimization problem. A Learning Automaton (LA) is a reinforcement-based adaptive decision-maker that helps find the best task-to-edge-resource assignment. An extensive set of simulations demonstrates the applicability and effectiveness of the LA-based approach in the IoT-enabled smart city environment.
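A minimal sketch of a linear reward-penalty learning automaton choosing among three candidate edge nodes; the learning rates, reward probabilities, and node count are assumptions for illustration rather than the paper's exact scheme.

import numpy as np

rng = np.random.default_rng(1)
p = np.ones(3) / 3            # action probabilities over three candidate edge nodes
a, b = 0.1, 0.05              # reward and penalty learning rates (assumed values)

def update(p, action, rewarded):
    # Linear reward-penalty update: shift probability mass toward the chosen
    # action when the environment rewards it, away from it otherwise.
    e = np.eye(len(p))[action]
    return p + a * (e - p) if rewarded else (1 - b) * p + b * (1 - e) / (len(p) - 1)

for _ in range(200):
    action = rng.choice(len(p), p=p)
    # Toy environment: node 2 satisfies the energy/delay objective most often.
    rewarded = rng.random() < (0.9 if action == 2 else 0.3)
    p = update(p, action, rewarded)

print(np.round(p, 3))   # probability mass should concentrate on the preferred node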
High-dimensional and incomplete (HDI) matrices are primarily generated in all kinds of big-data-related practical applications. A latent factor analysis (LFA) model is capable of conducting efficient representation learning on an HDI matrix, and its hyper-parameter adaptation can be implemented through a particle swarm optimizer (PSO) to meet scalability requirements. However, conventional PSO is limited by premature convergence, which leads to accuracy loss in the resultant LFA model. To address this thorny issue, this study merges the information of each particle's state migration into its evolution process, following the principle of a generalized momentum method, to improve its search ability, thereby building a state-migration particle swarm optimizer (SPSO) whose theoretical convergence is rigorously proved in this study. It is then incorporated into an LFA model for implementing efficient hyper-parameter adaptation without accuracy loss. Experiments on six HDI matrices indicate that an SPSO-incorporated LFA model outperforms state-of-the-art LFA models in terms of prediction accuracy for the missing data of an HDI matrix, with competitive computational efficiency. Hence, SPSO's use ensures efficient and reliable hyper-parameter adaptation in an LFA model, thus ensuring practicality and accurate representation learning for HDI matrices.
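A compact sketch of the general idea of folding a momentum-style state-migration term into the standard PSO velocity update; the coefficients, the toy objective, and the exact update form are assumptions, not the paper's SPSO derivation.

import numpy as np

rng = np.random.default_rng(0)
def f(x):
    # Toy objective standing in for an LFA model's validation error.
    return np.sum(x ** 2, axis=1)

n, d = 20, 2
x = rng.uniform(-5, 5, (n, d))
v = np.zeros((n, d))
prev_dx = np.zeros((n, d))                 # each particle's previous position shift
pbest, pbest_f = x.copy(), f(x)
gbest = x[np.argmin(pbest_f)].copy()

w, c1, c2, lam = 0.7, 1.5, 1.5, 0.3        # lam weights the momentum term (assumed)
for _ in range(100):
    r1, r2 = rng.random((n, d)), rng.random((n, d))
    # Standard PSO velocity plus a momentum-like term built from the particle's
    # previous position shift (its "state migration"), to resist premature stagnation.
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x) + lam * prev_dx
    prev_dx = v.copy()
    x = x + v
    fx = f(x)
    improved = fx < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], fx[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best value found:", float(np.min(pbest_f)))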
The coronavirus disease 2019 (COVID-19) has posed significant challenges globally, with image classification becoming a critical tool for detecting COVID-19 from chest X-ray and CT images. Convolutional neural network...
Cyberbullying, a critical concern for digital safety, necessitates effective linguistic analysis tools that can navigate the complexities of language use in online environments. To tackle this challenge, our study introduces a new approach employing the Bidirectional Encoder Representations from Transformers (BERT) base model (cased), originally pretrained on English text. The model is uniquely adapted to recognize the intricate nuances of Arabic online communication, a key aspect often overlooked in conventional cyberbullying detection systems. It is an end-to-end solution that has been fine-tuned on a diverse dataset of Arabic social media (SM) tweets. Experimental results on this dataset, collected from the 'X platform', demonstrate a notable increase in detection accuracy and sensitivity compared to existing methods. The resulting E-BERT model shows a substantial improvement in performance, evidenced by an accuracy of 98.45%, precision of 99.17%, recall of 99.10%, and an F1 score of 99.14%. The proposed E-BERT not only addresses a critical gap in cyberbullying detection in Arabic online forums but also sets a precedent for applying cross-lingual pretrained models to regional language applications, offering a scalable and effective framework for enhancing online safety across Arabic-speaking communities.
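A hedged Hugging Face sketch of the fine-tuning setup described above (the cased BERT base checkpoint adapted to a binary cyberbullying label); the example texts, label scheme, and hyper-parameters are placeholders, not the paper's dataset or configuration.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Cased English BERT base checkpoint, fine-tuned here for a 2-class task
# (cyberbullying vs. not); the Arabic tweets themselves are not reproduced.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=2)

texts = ["example tweet one", "example tweet two"]     # placeholder inputs
labels = torch.tensor([0, 1])                          # placeholder labels
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)   # assumed learning rate
model.train()
outputs = model(**batch, labels=labels)   # cross-entropy loss computed internally
outputs.loss.backward()
optimizer.step()
print(float(outputs.loss))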