The escalating threat of easily transmitted diseases poses a major challenge to government institutions and health systems worldwide. Advancements in information and communication technology offer a promising approach to effectively controlling infectious diseases. This article introduces a comprehensive framework for predicting and preventing zoonotic virus infections by leveraging the capabilities of artificial intelligence and the Internet of Things. The proposed framework employs IoT-enabled smart devices for data acquisition and applies a fog-enabled model for user authentication at the fog layer. User classification is then performed using the proposed ensemble model, with cloud computing enabling efficient information analysis and sharing. The novel aspect of the proposed system is its use of the temporal graph matrix method to illustrate dependencies among users infected with zoonotic flu and provide a nuanced understanding of user interactions. The implemented system demonstrates a classification accuracy of around 91% on roughly 5000 instances and a reliability of around 93%. The presented framework not only helps uninfected citizens avoid regional exposure but also empowers government agencies to address the problem more effectively. Moreover, the temporal mining results reveal the efficacy of the proposed system in dealing with zoonotic cases.
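The temporal-dependency idea behind the abstract's graph matrix can be sketched as a time-indexed adjacency tensor. This is an illustrative stand-in, not the authors' method: the function names, the contact format, and the exposure score are all hypothetical.

```python
import numpy as np

def temporal_graph_matrix(contacts, n_users, n_steps):
    """Build a 3-D temporal adjacency tensor T[t, i, j] = 1 if users i and j
    were in contact at time step t (a simple stand-in for a temporal
    graph matrix of user dependencies)."""
    T = np.zeros((n_steps, n_users, n_users), dtype=int)
    for t, i, j in contacts:
        T[t, i, j] = 1
        T[t, j, i] = 1  # contacts are symmetric
    return T

def exposure_counts(T, infected):
    """Count, per user, the time-stamped contacts with any infected user --
    a crude proxy for regional-exposure risk."""
    mask = np.zeros(T.shape[1], dtype=bool)
    mask[list(infected)] = True
    return T[:, :, mask].sum(axis=(0, 2))

contacts = [(0, 0, 1), (1, 1, 2), (2, 0, 2)]   # (time, user_i, user_j)
T = temporal_graph_matrix(contacts, n_users=3, n_steps=3)
risk = exposure_counts(T, infected={0})        # users 1 and 2 each met user 0 once
```

A real system would derive `contacts` from IoT proximity data and feed such scores to the ensemble classifier.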
Image recognition plays an essential role in many areas of modern life, such as security, video analysis, healthcare, and autonomous vehicles. The present article analyzes the evolution of image recognition through the automation of feature extraction, which has led to highly sophisticated results. This is achieved by comparing conventional approaches with deep learning methodologies, particularly Convolutional Neural Networks (CNNs). The integration of CNNs with cloud computing has significantly enhanced speed, scalability, and usability, enabling real-time analysis of very large image datasets. The article discusses significant algorithms, including CNN architectures such as AlexNet, VGG, and ResNet-based models. Transfer learning techniques and the impact of ensemble approaches are also discussed. Data privacy, security, and the impact of AI on cloud computing are expected to become even more critical in the future.
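The automated feature extraction that a CNN layer performs reduces, at its core, to sliding a kernel over an image. The sketch below shows this single operation with a hand-crafted edge kernel (a CNN learns its kernels instead); the toy image and kernel are illustrative, not taken from the article.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation -- the operation a CNN layer
    repeats with learned kernels to extract features automatically."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r+kh, c:c+kw] * kernel)
    return out

# A Sobel-like vertical-edge kernel applied to a toy image with one edge.
image = np.array([[0, 0, 10, 10],
                  [0, 0, 10, 10],
                  [0, 0, 10, 10],
                  [0, 0, 10, 10]], dtype=float)
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
features = conv2d(image, sobel_x)   # strong response along the vertical edge
```

Architectures like AlexNet, VGG, and ResNet stack many such layers with nonlinearities and pooling; transfer learning reuses the kernels learned on one dataset for another.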
The most complicated process in multi-cloud computing is resource allocation, as it must cope with numerous configurations and constraints of cloud providers and customers. During resource allocation, the centralized cloud broker monitors the status, scheduling process, and fitness of the virtual machines (VMs). VM scheduling is a tedious problem that has received considerable attention in business, academia, and research, which increases the demand for both task scheduling and resource allocation in a multi-cloud environment. To bridge the gap between consumer requirements and server infrastructure, a joint optimization-based resource allocation and task scheduling concept is analyzed in the proposed framework. The first phase introduces the task scheduling mechanism, which uses Hybrid Electro Search and Beetle Swarm Optimization to determine the optimal tasks for specific VMs. The optimal selection procedure analyzes the makespan, energy, cost, and throughput parameters of the multi-cloud environment. In the second step, an Adaptive Game Theory-based Seagull optimization approach performs several rounds of reassignment iteratively to minimize the variation in the expected completion time, consequently decreasing energy consumption and improving load balancing. The experimental analysis of the proposed model is implemented in Python. The proposed methodology is shown to achieve lower costs, shorter waiting times, improved resource allocation, and efficient load balancing. Finally, a comparative analysis with several hybrid optimization models illustrates the efficiency of the proposed hybrid optimization model.
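Two of the objectives the optimizer evaluates, makespan and monetary cost, can be computed for any candidate task-to-VM assignment as sketched below. The numbers and function names are illustrative and not drawn from the paper's experiments.

```python
def schedule_metrics(task_lengths, vm_speeds, vm_cost_per_sec, assignment):
    """Given an assignment of tasks to VMs, compute the makespan (latest
    VM finish time) and total monetary cost -- two of the objectives a
    scheduling optimizer trades off."""
    finish = [0.0] * len(vm_speeds)
    cost = 0.0
    for length, vm in zip(task_lengths, assignment):
        runtime = length / vm_speeds[vm]
        finish[vm] += runtime            # tasks on one VM run back-to-back
        cost += runtime * vm_cost_per_sec[vm]
    return max(finish), cost

# 4 tasks (million instructions), 2 VMs with different speeds and prices.
makespan, cost = schedule_metrics(
    task_lengths=[100, 200, 100, 200],
    vm_speeds=[10, 20],                  # MIPS
    vm_cost_per_sec=[0.1, 0.3],
    assignment=[0, 1, 0, 1],             # task i runs on VM assignment[i]
)
```

A metaheuristic such as the paper's hybrid Electro Search / Beetle Swarm scheme searches over `assignment` vectors, scoring each with metrics like these (plus energy and throughput).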
With the advancement of the cloud computing environment, users' expectations of better services have significantly increased. One of the most prominent parts of cloud systems is task scheduling, whose improvement can in turn increase user satisfaction. Most of the published literature in this domain is limited to either a single-objective or bi-objective perspective. This paper presents a heuristic task scheduling algorithm for the optimization of makespan-cost-reliability (TSO-MCR) objectives. In addition, users' constraints are considered in the proposed optimization model. To this end, a task ranking approach, the exclusion of unreliable processors, Pareto dominance, and crowding distance are utilized so that a trade-off amongst potentially conflicting objectives is achieved. To verify the effectiveness of the proposed TSO-MCR, its performance is compared with the Multi-Objective Heterogeneous Earliest Finish Time (MOHEFT), Cost and Makespan Scheduling of Workflows in the Cloud (CMSWC), Hybrid Discrete Cuckoo Search Algorithm (HDCSA), and Multi-Objective Best Fit Decreasing (MOBFD) approaches. Since the comparative algorithms are bi-objective, a multi-objective version of each is customized to the stated problem to ensure the same conditions. The simulation results show that the proposed TSO-MCR significantly outperforms the other state-of-the-art approaches, with improvements of 4.23%, 8.93%, 2.08%, and 4.24% over all counterparts across all 12 scenarios in terms of makespan, total monetary cost, reliability, and the final score function incorporating all weighted objectives, respectively. It is worth mentioning that the comparison was done on datasets including both scientific workflows and random applications with different communication-to-computation ratio (CCR) values.
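Pareto dominance and crowding distance, the two mechanisms the abstract names for balancing conflicting objectives, can be sketched in a few lines (NSGA-II-style, not the paper's exact variant; the sample front is invented for illustration):

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def crowding_distance(front):
    """NSGA-II-style crowding distance over a front of objective vectors;
    boundary points get infinite distance so they are always preferred."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        dist[order[0]] = dist[order[-1]] = float('inf')
        span = (front[order[-1]][k] - front[order[0]][k]) or 1.0
        for pos in range(1, n - 1):
            dist[order[pos]] += (front[order[pos+1]][k] - front[order[pos-1]][k]) / span
    return dist

# A small mutually non-dominated front: (makespan, cost) pairs.
front = [(1.0, 9.0), (3.0, 5.0), (5.0, 4.0), (9.0, 1.0)]
d = crowding_distance(front)   # interior points get finite distances
```

Solutions with larger crowding distance lie in sparser regions of the front, so selecting by (non-domination rank, then distance) keeps the trade-off surface well spread.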
Networking, storage, and hardware are just a few of the virtual computing resources that the infrastructure service model offers, depending on what the client requires. An essential aspect of cloud computing that improves resource allocation techniques is host load prediction. Without it, hardware resource allocation in cloud computing still results in hosting initialization issues, which add several minutes to response times. To solve this issue and accurately predict cloud capacity, cloud data centers use prediction algorithms, which permit dynamic cloud scalability while maintaining superior service quality. For host prediction, we therefore present a hybrid convolutional neural network and long short-term memory model in this paper. First, the input to the suggested hybrid model is subjected to vector autoregression, which filters the multivariate data to eliminate linear interdependencies prior to analysis. After that, the remaining data are processed and sent into the convolutional neural network layer, which gathers intricate details about the utilization of each virtual machine and central processing unit. The next step involves long short-term memory, which is suitable for representing the temporal information of irregular trends in time series data. A key element of the entire process is that we used the most appropriate activation function for this type of model, a scaled polynomial constant unit. Cloud systems require accurate prediction due to the increasing degrees of unpredictability in data centers. Because of this, two actual load traces were used in this study's assessment of the model, one of which comes from a typical distributed system. In comparison to CNN, VAR-GRU, VAR-MLP, ARIMA-LSTM, and other models, the experimental results demonstrate that our suggested approach offers state-of-the-art performance with higher accuracy on both datasets.
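The VAR pre-filtering step can be sketched as a first-order vector autoregression fitted by least squares, whose residuals (the part not explained by linear interdependence) would then feed the CNN-LSTM stages. This is a simplified stand-in, not the paper's exact procedure, and the synthetic load series are invented:

```python
import numpy as np

def var1_residuals(X):
    """Fit a VAR(1) model X_t ~ X_{t-1} @ A by least squares and return the
    residuals -- the component of the multivariate load signal left after
    removing linear interdependencies."""
    past, present = X[:-1], X[1:]
    A, *_ = np.linalg.lstsq(past, present, rcond=None)
    return present - past @ A

rng = np.random.default_rng(0)
# Two correlated host-load series (e.g. CPU and memory utilisation).
base = rng.normal(size=(200, 1))
X = np.hstack([base, 0.8 * base + 0.1 * rng.normal(size=(200, 1))])
resid = var1_residuals(X)   # shape (199, 2): one row lost to the lag
```

In the hybrid model, these residuals would pass through convolutional layers for per-VM/CPU feature extraction and then an LSTM for the irregular temporal trends.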
Cloud computing has gained popularity due to its scalability, cost-effectiveness, on-demand provisioning, pay-as-you-go billing, and enhanced accessibility. Recognizing these benefits, government agencies and industries are increasingly migrating their databases to the cloud. However, despite these strengths, cloud computing faces significant security challenges, including phishing, vulnerability exploitation, Distributed Denial-of-Service (DDoS) attacks, and unauthorized access. Among these threats, DDoS attacks are particularly severe: by flooding servers with traffic, they disrupt critical cloud services and render them inaccessible to legitimate users. In response to evolving threats, researchers are transitioning from traditional signature-based detection methods to machine learning and deep learning for DDoS detection. While these methods are effective against high-rate DDoS attacks, low-rate DDoS poses a challenge due to its subtle blending with legitimate traffic, device heterogeneity, diverse request specifications, and computational costs. To address these challenges, we propose a new hybrid framework called 'AE-CIAM' for the effective detection and classification of low-rate DDoS attacks in cloud environments. Our framework incorporates an Autoencoder (AE) enhanced with an attention module, which adeptly extracts relevant features from the data, eliminating the need for manual intervention and reducing feature-space dimensionality. Unlike deep autoencoders, streamlined autoencoders with attention strike a balance between complexity and computational efficiency, offering versatility across diverse tasks. Subsequently, we employ a Convolutional Neural Network (CNN) Inception with Attention Mechanism model to categorize attacks into different types of low-rate DDoS attacks, achieving high performance at a low computational cost. The proposed model demonstrates remarkable efficacy, achieving an accuracy exceeding 99.99% for binary classification and 99.
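The feature-attention idea, scoring input features and reweighting them before encoding, can be sketched minimally as a softmax gate. This is a toy illustration of the mechanism, not the AE-CIAM architecture; the `score_vector` stands in for learned attention parameters and the flow record is invented.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_weight(features, score_vector):
    """Toy feature attention: score each input feature, normalise the
    scores with softmax, and reweight the features before the encoder
    compresses them."""
    weights = softmax(features * score_vector)
    return features * weights, weights

# A flow record with one anomalously large feature (e.g. packet rate).
flow = np.array([0.2, 0.1, 3.0, 0.3])
weighted, w = attention_weight(flow, score_vector=np.ones(4))
```

In a trained autoencoder the gate learns to emphasise the few features that separate low-rate attack flows from legitimate traffic, which is what lets a shallow model stay cheap.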
The revolution in improving public services undertaken by the current government is genuine. It is not easy for government organizations, especially local governments, to implement full e-government services directly. One solution considered appropriate for these problems is the application of cloud computing to support e-government services in local governments. An advantage of cloud computing for e-government is that it can increase security; cloud-based storage can be valuable because data stored in the cloud is kept secure and subject to various regulations and standards of information-security practice. We propose a systematic literature review approach to analyze trends in studies on cloud computing for e-government. This research is descriptive qualitative research using a bibliometric approach. To assess the trend of cloud computing in e-government, we use the latest version of the CiteSpace bibliometric software to obtain a comprehensive overview of the knowledge on cloud computing in e-government. The findings of this paper reveal a dynamic scenario influenced by advancing technological progress and administrative priorities. Across the globe, governments are progressively acknowledging the capacity of cloud computing to improve the effectiveness, accessibility, and scalability of e-government services. Challenges persist, ranging from data-safekeeping to privacy concerns, but the trend also signifies a strategic transition towards harnessing digital technologies to provide more agile, citizen-focused public services.
With the expansion of mobile devices and cloud computing, massive spatial trajectory data is generated and outsourced to the cloud for storage and analysis, enabling location-based mobile computing services. However, due to the sensitivity of the trajectory data, sharing it in plaintext could lead to privacy risks, especially in operations like contact queries. Thus, achieving secure and efficient contact queries over trajectory data in the cloud is a significant challenge. In this paper, we propose a privacy-preserving contact query processing scheme over trajectory data in mobile cloud computing. A projection-based secure trajectory encoding is designed to convert trajectories into secure codes, such that comparing the distance between two moving objects against the contact distance threshold is transformed into a problem of secure code matching. Adopting this secure code matching method, a baseline privacy-preserving contact query processing scheme is proposed. To improve query accuracy and efficiency, an amplification factor, an HTG-index, and a filter table are designed for query processing optimization, based on which an enhanced privacy-preserving contact query processing scheme is proposed. The game simulation-based security analysis and experimental results show that the proposed query scheme is secure and performs well in query accuracy and efficiency.
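The encoding idea, turning a distance comparison into code matching, can be illustrated with a plaintext grid projection. This omits the cryptographic protection entirely and uses invented names and coordinates; it only shows why matching discrete time-stamped codes approximates a distance-threshold test.

```python
def encode_trajectory(points, cell_size):
    """Map each (t, x, y) sample to a discrete (t, cell_x, cell_y) code.
    A real scheme would additionally protect these codes cryptographically;
    here the projection/encoding idea is shown in plaintext."""
    return {(t, int(x // cell_size), int(y // cell_size)) for t, x, y in points}

def possible_contact(codes_a, codes_b):
    """Two objects may have been in contact if they share a time-stamped
    grid cell -- distance comparison becomes code matching."""
    return bool(codes_a & codes_b)

cell = 10.0   # on the order of the contact-distance threshold
alice = encode_trajectory([(0, 3.0, 4.0), (1, 15.0, 4.0)], cell)
bob   = encode_trajectory([(0, 8.0, 9.0), (1, 55.0, 60.0)], cell)
carol = encode_trajectory([(0, 93.0, 4.0)], cell)
```

Because grid cells only approximate the true distance test, refinements like the paper's amplification factor and filter table are needed to cut false positives and speed up matching.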
To access required services in cloud computing, users submit their tasks to the cloud datacentre for processing. Datacentre controllers therefore face two challenges: finding the best resources and mapping user tasks to virtual machines (VMs). To solve these issues, this paper presents a scheduling algorithm named Modified Parallel Particle Swarm Optimization (MPPSO). The algorithm is based on the Parallel PSO algorithm; it reduces processing time and dynamically adjusts the load of each VM so that every VM can take part in task processing. Using the CloudSim simulator, the MPPSO approach is tested against the Parallel Particle Swarm Optimization (PPSO) and Modified Particle Swarm Optimization (MPSO) algorithms on different task and VM sets. The results show that the proposed algorithm reduces execution time, makespan, and waiting time by 16%, 15%, and 19%, respectively, while increasing throughput and the fitness function value by 16% and 17%.
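For reference, the base algorithm that MPPSO parallelises and modifies is plain particle swarm optimization. The sketch below is a bare-bones PSO on a stand-in objective, not the paper's scheduler; the coefficients (inertia 0.7, both acceleration factors 1.5) are common textbook defaults, not values from the paper.

```python
import random

def pso(fitness, dim, n_particles=20, iters=50, seed=1):
    """Bare-bones particle swarm optimisation: particles track personal and
    global bests and move under inertia plus two random attractions."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=fitness)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if fitness(pos[i]) < fitness(pbest[i]):
                pbest[i] = pos[i][:]
                if fitness(pbest[i]) < fitness(gbest):
                    gbest = pbest[i][:]
    return gbest

sphere = lambda p: sum(x * x for x in p)   # stand-in for a makespan objective
best = pso(sphere, dim=3)
```

In a scheduling setting, each particle would encode a task-to-VM mapping and `fitness` would combine makespan, waiting time, and load-balance terms.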
As a Big Data analysis technique, hierarchical clustering is helpful for summarizing data, since it returns both the clusters of the data and their clustering history. Cloud computing is the most suitable option for efficiently performing hierarchical clustering over large volumes of data. However, since compromised cloud service providers can cause serious privacy problems by revealing data, these problems must be solved before an external cloud computing service is used. A privacy-preserving hierarchical clustering protocol for an outsourced computing environment has never been proposed in existing works, and existing protocols have several problems that limit the number of participating data owners or disclose information about the data. In this article, we propose a parallelly running and privacy-preserving agglomerative hierarchical clustering (ppAHC) over the union of the datasets of multiple data owners in an outsourced computing environment, which is the first such protocol to the best of our knowledge. The proposed ppAHC does not disclose any information about its input and output, including the data access patterns. It is highly efficient and suitable for Big Data analysis, since its cost for one round is independent of the amount of data, and it allows data owners without sufficient computing capability to participate in collaborative hierarchical clustering.
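The underlying computation, which ppAHC performs under privacy protection, is ordinary agglomerative clustering: repeatedly merge the two closest clusters and record the merge history. The plaintext single-linkage sketch below (1-D points for brevity, all values invented) shows that computation without any of the privacy machinery.

```python
def single_linkage(points, target_clusters):
    """Plain (non-private) agglomerative clustering with single linkage:
    merge the two closest clusters until target_clusters remain, recording
    each merge -- the 'clustering history' the abstract refers to."""
    clusters = [[p] for p in points]
    history = []
    while len(clusters) > target_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        history.append((tuple(clusters[i]), tuple(clusters[j])))
        clusters[i] = clusters[i] + clusters.pop(j)
    return clusters, history

clusters, history = single_linkage([1.0, 1.1, 5.0, 5.2, 9.0], target_clusters=2)
```

The privacy-preserving protocol must perform the distance comparisons and merges obliviously, so that neither the values nor the access patterns leak, while keeping the per-round cost independent of the data volume.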