Cloud computing is gaining popularity in high-performance computing applications. Its use enables advanced simulations when local computing resources are limited. However, cloud usage may increase costs and entail resource unavailability risks. This article presents an original approach that employs machine learning to predict long-term cloud resource usage, enabling resource utilization to be optimized through appropriate reservation plans and reducing the associated costs. The solution developed utilizes statistical models, XGBoost, neural networks, and the Temporal Fusion Transformer (TFT). Long-term prediction of cloud resource consumption, which is critical for prolonged simulations, is used by the Cloud Resource Usage Optimization System to dynamically create resource reservation plans across various virtual machine types for HPC on the Google Cloud Platform. Experiments using real-life production data demonstrate that the TFT prediction model improved prediction quality by 31.4% compared to the best baseline method, particularly in adapting to chaotic changes in resource consumption. However, the best prediction model in terms of error magnitude might not be the most suitable for resource reservation planning. This was validated by the neural network-based method, introducing an FR metric for forecast evaluation. Resource reservation plans were assessed both qualitatively and quantitatively, focusing on aspects such as service-level agreement compliance and potential downtime. This paper is an extension of work originally presented at the International Conference on Computational Science (ICCS) 2023, entitled "Long-Term Prediction of Cloud Resource Usage in High-Performance Computing".
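The reservation-planning idea in this abstract can be illustrated with a minimal sketch. The TFT itself requires a dedicated forecasting library, so this stand-in uses a seasonal-naive baseline (one plausible statistical baseline, not the paper's actual method) on synthetic usage data, then derives a reservation level with SLA-style headroom; all sizes and the 95th-percentile rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic hourly CPU-core usage with a weekly cycle (stand-in for production data).
hours = np.arange(24 * 7 * 8)
usage = 40 + 10 * np.sin(2 * np.pi * hours / (24 * 7)) + rng.normal(0, 3, hours.size)

week = 24 * 7
train, horizon = usage[: week * 6], week * 2

# Seasonal-naive long-term forecast: repeat the last observed week over the horizon.
forecast = np.tile(train[-week:], horizon // week)

# Derive a reservation plan: reserve the 95th percentile of forecast demand,
# so the plan rarely falls short of predicted usage (SLA-style headroom).
reservation = float(np.percentile(forecast, 95))
shortfall_rate = float(np.mean(usage[-horizon:] > reservation))
print(f"reserve {reservation:.1f} cores, shortfall rate {shortfall_rate:.3f}")
```

A real system would compare several forecasters (statistical, XGBoost, neural) and score them on reservation outcomes rather than raw error, which is exactly the distinction the abstract draws with its FR metric.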
A significant amount of interest has been generated in recent years in the convergence of quantum computing and data mining due to quantum algorithms' potential to revolutionize information extraction from vast da...
ISBN:
(Print) 9798350370027; 9798350370034
The early detection of defects in mechanical equipment is of paramount importance in the industrial field. Research into data analysis methodologies for extracting valuable data from mechanical equipment has highlighted the significant value of these technologies. One such area of study is the early and accurate detection of anomalies in the acoustic data of machinery. In this paper, we propose an effective technique for the detection and classification of rare-event anomalies in data derived from the rotational noise of automotive motors. MFCC extraction and smoothing techniques were used to select minimal features for optimal performance, and Principal Component Analysis (PCA) was applied to extract salient features. These features are capable of distinguishing between normal and anomalous data. Additionally, an unsupervised learning algorithm was applied to the dataset to differentiate between normal and anomalous data. Experimental results showed that the proposed method can effectively detect sound anomalies with a high accuracy of 99.4% and is also capable of detailed classification of anomalous data.
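The pipeline shape described above (features → PCA → unsupervised anomaly detection) can be sketched as follows. This is a simplified stand-in, not the paper's implementation: MFCC extraction is replaced with synthetic feature vectors, and IsolationForest is one plausible choice of unsupervised detector (the abstract does not name the algorithm used).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Stand-in for smoothed MFCC frames: 200 normal frames plus 10 anomalous ones.
normal = rng.normal(0.0, 1.0, size=(200, 13))
anomalous = rng.normal(6.0, 1.0, size=(10, 13))
features = np.vstack([normal, anomalous])

# Reduce to the most salient components, as the paper does with PCA.
reduced = PCA(n_components=3).fit_transform(features)

# Unsupervised separation of normal vs. anomalous frames.
detector = IsolationForest(contamination=10 / 210, random_state=0)
labels = detector.fit_predict(reduced)  # -1 marks predicted anomalies

print(int((labels == -1).sum()), "frames flagged as anomalous")
```

On real audio, the MFCC step would come from a library such as librosa, and the detector's contamination rate would be estimated rather than known in advance.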
This work looks into machine learning as a means of enhancing rainfall prediction. A logistic regression model is trained using meteorological data, including parameters such as temperature, wind speed,...
Sensors generate a huge amount of data that need to be transferred to a computing device for processing. Such large data transfer takes time and consumes energy. This paper presents a new sensing and computing archite...
ISBN:
(Print) 9798350371000; 9798350370997
Sensors generate a huge amount of data that needs to be transferred to a computing device for processing. Such large data transfers take time and consume energy. This paper presents a new sensing and computing architecture, referred to as MLIS (Machine Learning In Sensors). MLIS allows part of the machine learning to be done on the sensor board, thereby dramatically reducing the amount of data transferred to the computing device and hence improving overall system performance and energy efficiency. Using an energy-based probabilistic graphical model, the RBM (Restricted Boltzmann Machine), we built a new ADAS (Advanced Driver-Assistance System) computing platform for autonomous driving with a phased-array radar as the sensor. A working prototype has been built to provide a proof of concept for our new architecture. The prototype is implemented using a TI mmWave (millimeter-wave) radar board and a Vivado HLS implementation of the RBM on the Xilinx xc7z020-clg400-1 device. Extensive experiments have been carried out using the prototype on realistic scenes on our campus. Experimental results show that the proposed architecture reduces the data to be transferred by a factor of 8 while maintaining 98% accuracy. Based on the experimental settings, we present two case studies that show a remarkable reduction in collision probability when applying the new architecture to autonomous vehicles.
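The 8x on-sensor reduction via an RBM can be sketched in software. The paper's RBM runs in Vivado HLS on an FPGA; this stand-in uses scikit-learn's BernoulliRBM on synthetic binarized frames, with the visible/hidden sizes (256 → 32) chosen purely to illustrate the factor-of-8 compression, not taken from the paper.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(1)

# Stand-in for binarized radar frames: 100 frames of 256 visible units each.
frames = (rng.random((100, 256)) > 0.7).astype(float)

# An RBM with 32 hidden units maps each frame to a representation
# one-eighth the size, mirroring the on-sensor reduction described above.
rbm = BernoulliRBM(n_components=32, n_iter=5, learning_rate=0.05, random_state=1)
hidden = rbm.fit_transform(frames)  # hidden-unit activation probabilities

print(frames.shape[1] // hidden.shape[1])  # prints 8
```

In the actual architecture only these hidden activations would leave the sensor board, with the downstream ADAS computation consuming the compressed representation.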
Confronted with an abundance of adjustable parameters and ever-shifting workloads, database configuration tuning grapples with persistent challenges. The intricate task of thoroughly optimizing all these configuration...
The National Basketball Association (NBA) is the most popular basketball league. For various purposes, people try to predict the outcomes of NBA games. This paper applies data from NBA matches from 2004-2022 and numerous machine...
As machine learning data sizes continue to rise, this work offers a unique solution to improve computing efficiency by means of data compression. This starts by going over the problem's history and emphasizing the...
The proposed study realizes a novel quantum machine learning (QML) architecture that allows heuristic function evaluation and can actually perform quantum circuits during massive data processing. The Quantum-Circuit f...
Machine learning has developed into a crucial tool for the financial industry, allowing financial organizations to enhance their processes and offerings. This paper gives a general overview of machine learning's u...