Recommender systems are effective in mitigating information overload, yet the centralized storage of user data raises significant privacy concerns. Cross-user federated recommendation (CUFR) provides a promising distributed paradigm to address these concerns by enabling privacy-preserving recommendations directly on user devices. In this survey, we review and categorize current progress in CUFR, focusing on four key aspects: privacy, security, accuracy, and efficiency. Firstly, we conduct an in-depth privacy analysis, discuss various cases of privacy leakage, and then review recent methods for privacy protection. Secondly, we analyze security concerns and review recent methods for untargeted and targeted attacks. For untargeted attack methods, we categorize them into data poisoning attack methods and parameter poisoning attack methods. For targeted attack methods, we categorize them into user-based methods and item-based methods. Thirdly, we provide an overview of the federated variants of some representative methods, and then review recent methods for improving accuracy from two categories: data heterogeneity and high-order information. Fourthly, we review recent methods for improving training efficiency from two categories: client sampling and model compression. Finally, we conclude this survey and explore some potential future research topics in CUFR.
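The on-device training loop that CUFR builds on can be illustrated with a minimal federated averaging sketch. The one-parameter model, client datasets, and function names below are hypothetical stand-ins for illustration, not a method from the survey.

```python
def local_update(weights, data, lr=0.1):
    # One step of local gradient descent on a toy scalar model:
    # each client fits y ≈ w * x on its private (x, y) pairs,
    # so raw interactions never leave the device.
    grad = sum(2 * (weights * x - y) * x for x, y in data) / len(data)
    return weights - lr * grad

def fed_avg(global_w, client_datasets, rounds=50):
    # Federated averaging: clients train locally on private data,
    # and the server only aggregates the resulting model weights.
    for _ in range(rounds):
        local_ws = [local_update(global_w, d) for d in client_datasets]
        global_w = sum(local_ws) / len(local_ws)
    return global_w

clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]  # all consistent with w = 2
w = fed_avg(0.0, clients)
```

Here the server never sees the `(x, y)` pairs, only each client's updated weight, which is the basic privacy argument behind the federated paradigm.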
The Internet of Everything (IoE) based cloud computing is one of the most prominent areas in the digital big data era. This approach allows efficient infrastructure to store and access big real-time data and smart IoE services from the cloud. The IoE-based cloud computing services are located at remote locations without the control of the data owner. The data owners mostly depend on the untrusted Cloud Service Provider (CSP) and do not know the implemented security capabilities. This lack of knowledge about security capabilities and control over data raises several security issues. Deoxyribonucleic Acid (DNA) computing is a biological concept that can improve the security of IoE big data. The proposed IoE big data security scheme consists of the Station-to-Station Key Agreement Protocol (StS KAP) and Feistel cipher algorithms. This paper proposed a DNA-based cryptographic scheme and access control model (DNACDS) to solve IoE big data security and access control issues. The experimental results illustrated that DNACDS performs better than other DNA-based security schemes. The theoretical security analysis of the DNACDS shows better resistance capabilities.
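Since the scheme builds on Feistel cipher algorithms, a minimal generic Feistel network illustrates how the same structure both encrypts and decrypts. The toy round function `f` and the keys are illustrative assumptions, not the DNACDS construction.

```python
def f(half, key):
    # Toy round function on 16-bit halves; a real scheme would use a
    # cryptographically strong F here.
    return (half * 31 + key) & 0xFFFF

def feistel_encrypt(block, round_keys):
    # Feistel network: split the block into halves and, in each round,
    # XOR one half with F(other half, round key), then swap.
    left, right = block
    for k in round_keys:
        left, right = right, left ^ f(right, k)
    return left, right

def feistel_decrypt(block, round_keys):
    # Decryption runs the same structure with the keys in reverse order;
    # F itself never needs to be invertible.
    left, right = block
    for k in reversed(round_keys):
        left, right = right ^ f(left, k), left
    return left, right

keys = [0x1A2B, 0x3C4D, 0x5E6F]
ct = feistel_encrypt((0x1234, 0x5678), keys)
pt = feistel_decrypt(ct, keys)  # recovers (0x1234, 0x5678)
```

The swap-and-XOR structure is what makes Feistel designs attractive for constrained IoE devices: encryption and decryption share one circuit.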
In the age of technology, many people have fallen victim to fake images. Photo editing has become easier as the process of making photos becomes more efficient. With the image processing tools at their disposal, peopl...
The development of the Internet of Things (IoT) technology is leading to a new era of smart applications such as smart transportation and smart buildings. Moreover, these applications act as the building blocks of IoT-enabled smart cities. The high volume and high velocity of data generated by various smart city applications are sent to flexible and efficient cloud computing resources for processing. However, there is a high computation latency due to the presence of a remote cloud server. Edge computing, which brings the computation close to the data source, is introduced to overcome this problem. In an IoT-enabled smart city environment, one of the main concerns is to consume the least amount of energy while executing tasks that satisfy the delay constraint. An efficient resource allocation at the edge is helpful to address this issue. In this paper, an energy and delay minimization problem in a smart city environment is formulated as a bi-objective edge resource allocation problem. First, we presented a three-layer network architecture for IoT-enabled smart cities. Then, we designed a learning automata-based edge resource allocation approach considering the three-layer network architecture to solve the said bi-objective minimization problem. Learning Automata (LA) is a reinforcement-based adaptive decision-maker that helps to find the best task and edge resource mapping. An extensive set of simulations is performed to demonstrate the applicability and effectiveness of the LA-based approach in the IoT-enabled smart city environment.
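The reinforcement update behind a learning automaton can be sketched with the classic linear reward-inaction scheme; the edge-selection setup, reward model, and probabilities below are illustrative assumptions, not the paper's formulation.

```python
import random

def lri_step(probs, action, rewarded, a=0.1):
    # Linear reward-inaction update: on reward, shift probability mass
    # toward the chosen action; on penalty, leave the vector unchanged.
    if rewarded:
        probs = [p + a * (1 - p) if i == action else p * (1 - a)
                 for i, p in enumerate(probs)]
    return probs

def allocate(latency_ok, steps=2000, seed=7):
    # Each entry of latency_ok is the chance that an edge node meets the
    # task's delay constraint; the automaton adapts its action
    # probabilities toward the node that is rewarded most often.
    rng = random.Random(seed)
    probs = [1.0 / len(latency_ok)] * len(latency_ok)
    for _ in range(steps):
        edge = rng.choices(range(len(probs)), weights=probs)[0]
        probs = lri_step(probs, edge, rng.random() < latency_ok[edge])
    return probs

probs = allocate([0.2, 0.9, 0.4])  # edge 1 meets the deadline most often
```

The update keeps the action probabilities a valid distribution at every step, which is what makes the automaton a lightweight decision-maker for resource mapping.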
Nowadays, Android-based devices such as smart phones, tablets, smart watches, and virtual reality headsets have found increasing use in our daily lives. Along with the development of various applications for these dev...
Desertification greatly affects land deterioration, farming efficiency, economic growth, and health, especially in Gulf nations. Climate change has worsened desertification, making developmental issues in the area even more difficult. This research presents an enhanced framework utilizing the Internet of Things (IoT) for ongoing monitoring, data gathering, and analysis to evaluate desertification patterns. The framework utilizes Bayesian Belief Networks (BBN) to categorize IoT data, while a low-latency processing method on edge computing platforms enables effective detection of desertification trends. The classified data is subsequently analyzed using an Artificial Neural Network (ANN) optimized with a Genetic Algorithm (GA) for forecasting decisions. Using cloud computing infrastructure, the ANN-GA model examines intricate data connections to forecast desertification risk elements. Moreover, the Autoregressive Integrated Moving Average (ARIMA) model is employed to predict desertification over varied time intervals. Experimental simulations illustrate the effectiveness of the suggested framework, attaining enhanced performance in essential metrics: Temporal Delay (103.68 s), Classification Efficacy—Sensitivity (96.44 %), Precision (95.56 %), Specificity (96.97 %), and F-Measure (96.69 %)—Predictive Efficiency—Accuracy (97.76 %) and Root Mean Square Error (RMSE) (1.95 %)—along with Reliability (93.73 %) and Stability (75 %). The results of classification effectiveness and prediction performance emphasize the framework's ability to detect high-risk zones and predict the severity of desertification. This innovative method improves the comprehension of desertification processes and encourages sustainable land management practices, reducing the socio-economic impacts of desertification and bolstering at-risk ecosystems. The results of the study hold considerable importance for enhancing regional efforts in combating desertification, ensuring food security, and formulatin
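The GA side of the ANN-GA pipeline can be sketched generically: the genetic algorithm below evolves a real-valued genome (a stand-in for ANN weights scored by validation error) against a toy objective. The population size, operators, and objective are assumptions for illustration, not the paper's configuration.

```python
import random

def evolve(fitness, dim, pop_size=30, gens=60, seed=1):
    # Minimal genetic algorithm: tournament selection, uniform
    # crossover, and Gaussian mutation over real-valued genomes.
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        def pick():
            # Tournament of two: keep the fitter of a random pair.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p, q = pick(), pick()
            # Uniform crossover, then small Gaussian mutation per gene.
            child = [x if rng.random() < 0.5 else y for x, y in zip(p, q)]
            child = [x + rng.gauss(0, 0.05) for x in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy objective: maximize -sum((w - 0.5)^2), optimum at w = [0.5, 0.5].
best = evolve(lambda w: -sum((x - 0.5) ** 2 for x in w), dim=2)
```

In an ANN-GA setting the `fitness` callable would wrap network training or evaluation, which is also why GA runs are typically offloaded to cloud infrastructure as the abstract describes.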
Effective management of electricity consumption (EC) in smart buildings (SBs) is crucial for optimizing operational efficiency, cost savings, and ensuring sustainable resource utilization. Accurate EC prediction enabl...
Authors:
Raut, Yashasvi; Chaudhri, Shiv Nath
Faculty of Engineering and Technology Department of Computer Science and Engineering Maharashtra India
Faculty of Engineering and Technology Department of Computer Science and Design Maharashtra India
Gas and biosensors are crucial in the modern healthcare system, enabling non-invasive monitoring and diagnosis of various medical conditions. These sensors are used in various applications, including smart home health...
Brain tumor classification is crucial for personalized treatment planning. Although deep learning-based Artificial Intelligence (AI) models can automatically analyze tumor images, fine details of small tumor regions may be overlooked during global feature extraction. Therefore, we propose a brain tumor Magnetic Resonance Imaging (MRI) classification model based on a global-local parallel dual-branch structure. The global branch employs ResNet50 with a Multi-Head Self-Attention (MHSA) to capture global contextual information from whole brain images, while the local branch utilizes VGG16 to extract fine-grained features from segmented brain tumor regions. The features from both branches are processed through a designed attention-enhanced feature fusion module to filter and integrate important features. Additionally, to address sample imbalance in the dataset, we introduce a category attention block to improve the recognition of minority classes. Experimental results indicate that our method achieved a classification accuracy of 98.04% and a micro-average Area Under the Curve (AUC) of 0.989 in the classification of three types of brain tumors, surpassing several existing pre-trained Convolutional Neural Network (CNN) models. Furthermore, feature interpretability analysis validated the effectiveness of the proposed model. This suggests that the method holds significant potential for brain tumor image classification.
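The idea of weighting and mixing two branch representations can be sketched in a few lines. The norm-based scoring below is a hypothetical stand-in for the paper's learned attention-enhanced fusion module, chosen only to keep the example self-contained.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_fuse(global_feat, local_feat):
    # Attention-style fusion sketch: score each branch, turn the scores
    # into weights with softmax, and mix the two branch representations
    # element-wise. Here the score is simply the feature-vector norm.
    score = lambda f: math.sqrt(sum(x * x for x in f))
    w_g, w_l = softmax([score(global_feat), score(local_feat)])
    return [w_g * g + w_l * l for g, l in zip(global_feat, local_feat)]

fused = attention_fuse([0.2, 0.4, 0.1], [0.9, 0.0, 0.3])
```

Because the weights sum to one, each fused coordinate is a convex combination of the two branch values, so neither the global context nor the local detail can be silently discarded.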
Digital twinning enables manufacturers to create digital representations of physical entities, thus implementing virtual simulations for product development. Existing efforts of digital twinning neglect the decisive consumer feedback in product development stages, failing to cover the gap between physical and digital spaces. This work mines real-world consumer feedback through social media topics, which is significant to product design. We specifically analyze the prevalent time of a product topic, giving an insight into both consumer attention and the widely-discussed time of a topic. The primary body of current studies regards the prevalent time prediction as an accompanying task or assumes the existence of a preset distribution. However, these proposed solutions are either biased in focused objectives and underlying patterns or weak in the capability of generalization towards diverse topics. To this end, this work combines deep learning and survival analysis to predict the prevalent time of topics. We propose a specialized deep survival model which consists of two modules. The first module enriches input covariates by incorporating latent features of the time-varying text, and the second module fully captures the temporal pattern of a rumor by a recurrent network structure. Moreover, a specific loss function different from regular survival models is proposed to achieve a more reasonable prediction. Extensive experiments on real-world datasets demonstrate that our model significantly outperforms the state-of-the-art methods.
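The survival-analysis ingredient can be made concrete with the standard discrete-time survival negative log-likelihood. The hazard values and censoring handling below are textbook assumptions for illustration, not the paper's specialized loss function.

```python
import math

def survival_nll(hazards, event_time, observed):
    # Discrete-time survival negative log-likelihood: hazards[t] is the
    # model's probability that the topic stops being prevalent in
    # interval t given it survived so far. Surviving interval t
    # contributes log(1 - hazards[t]); an observed end at event_time
    # adds log(hazards[event_time]). A censored topic (observed=False)
    # contributes only its survival terms.
    nll = -sum(math.log(1 - h) for h in hazards[:event_time])
    if observed:
        nll -= math.log(hazards[event_time])
    return nll

# A topic whose prevalence ended in interval 2 under a toy hazard curve.
loss = survival_nll([0.1, 0.2, 0.6, 0.8], event_time=2, observed=True)
```

Handling censored topics this way is what lets survival models learn from topics that were still prevalent when data collection stopped, which plain regression on the prevalent time cannot do.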