Event detection is a computational process that enables the automatic identification of major events by analyzing media data. An event refers to a significant occurrence that takes place at a specific time and location. Many researchers have focused on predicting certain events through social media analysis. These events include disease outbreaks, election results, stock market trends, the frequency of article citations, product sales, and sports competition outcomes. The noisy nature of media content requires innovative semantic techniques to ensure accurate analysis of media streams. Consequently, improving the accuracy of event detection methods by addressing the noisy characteristics of media content is critical. Recent studies have explored event detection and recognition through social media analysis. This paper introduces a chaotic sparrow search algorithm with Deep Learning for Event Detection and Classification (CSSA-DLEDC) in social media. The primary goal of the CSSA-DLEDC technique is to identify and classify the presence of events and non-events. To achieve this, the technique involves data pre-processing and the TF-IDF word embedding process. The detection of events in social media is then performed using a Deep Belief Network (DBN). Finally, the hyperparameters of the DBN model are tuned using the chaotic sparrow search algorithm (CSSA). To validate the improved performance of the CSSA-DLEDC technique, extensive experimental simulations were conducted. The results demonstrated the superior effectiveness of the CSSA-DLEDC technique in the event detection process on social media.
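The TF-IDF word-embedding step above can be sketched as follows; this is a minimal illustration of standard TF-IDF weighting, not the paper's exact variant, and the toy documents are invented for the example:

```python
import math
from collections import Counter

def tf_idf(docs):
    """Return per-document TF-IDF weights for tokenized documents.

    Illustrative sketch of the TF-IDF word-embedding step; the paper's
    exact weighting variant (smoothing, normalization) is not specified.
    """
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        weights.append({term: (count / total) * math.log(n / df[term])
                        for term, count in tf.items()})
    return weights

# Toy posts (invented for the example): terms shared by every document
# get zero weight, while rarer terms are weighted up.
w = tf_idf([["storm", "hits", "city"], ["city", "festival", "today"]])
```

Weighted vectors like these would then feed the DBN classifier as the event/non-event features.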
Anomaly Detection (AD) systems play a crucial role in identifying potential cyber-attacks or data breaches by recognizing patterns of irregular data within the Internet of Things (IoT). Standard Machine Learning (ML) ...
The stable operation of a power system is strongly constrained by the need for load balance. Accurate power load forecasting is therefore of great significance in ensuring power system planning and reliable, economic operation. For this purpose, a novel power load forecasting method is proposed in this paper, integrating variational modal decomposition (VMD), t-distributed stochastic neighbor embedding (t-SNE) for dimension-reduction visualization analysis, and a compound prediction model adopting the least squares support vector machine (LSSVM) together with the Tent mapping function and the chaotic sparrow search algorithm (CSSA). To begin with, t-SNE is applied to the high-dimensional meteorological data affecting power load forecasting; comparison experiments with five common dimensionality reduction algorithms show that t-SNE better maps high-dimensional meteorological data to a low-dimensional space. Then, VMD is used to decompose the electricity load, splitting the non-stationary electricity load series into multiple relatively stationary sub-series. Meanwhile, key parameters of the LSSVM model are optimized by the CSSA algorithm under Tent chaotic perturbation, and each component is predicted by the optimized LSSVM model. Finally, the ultimate electricity load forecast is obtained by superimposing the predicted values of all components. The experimental results reveal that the proposed model provides competitive advantages over other models and offers greater prediction accuracy.
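The Tent chaotic perturbation used in CSSA can be sketched as below; the classical tent map with alpha = 0.5 is assumed, and the population size, dimension, and bounds are illustrative only:

```python
def tent_sequence(x0, n, alpha=0.5):
    """Iterate the Tent map n times from x0; alpha=0.5 is the classical map."""
    xs, x = [], x0
    for _ in range(n):
        x = x / alpha if x < alpha else (1.0 - x) / (1.0 - alpha)
        xs.append(x)
    return xs

def chaotic_init(pop_size, dim, lb, ub, x0=0.37):
    """Spread a Tent-map sequence over [lb, ub] to seed the sparrow population.

    Chaotic initialization covers the search space more evenly than
    plain uniform sampling, which is the usual motivation for the
    Tent step in CSSA variants.
    """
    seq = tent_sequence(x0, pop_size * dim)
    return [[lb + seq[i * dim + j] * (ub - lb) for j in range(dim)]
            for i in range(pop_size)]

pop = chaotic_init(5, 3, -1.0, 1.0)   # 5 candidates in a 3-D search space
```

In the paper, candidates like these would encode the LSSVM regularization and kernel parameters being optimized.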
This paper studies the three-dimensional (3-D) dynamic trajectory tracking control of an autonomous underwater vehicle (AUV). As the AUV is a typical nonlinear system in which each degree of freedom is strongly coupled, traditional control methods based on the nominal model of the AUV cannot guarantee the accuracy of the control system. To solve this problem, we first propose a prediction model based on a radial basis function neural network (RBF-NN). The nonlinearity of the AUV is learned and modeled offline by the RBF-NN from previous data, and the model captures the time-sequence states and control variables of the AUV. Secondly, to avoid the overfitting problem of network training based on the traditional gradient descent method, a new adaptive chaotic sparrow search algorithm (ACSSA) is proposed to optimize the network parameters, improving the approximation ability of the RBF-NN for nonlinear systems. To eliminate the steady-state error caused by external interference during AUV trajectory tracking, a nonlinear optimizer is designed that updates the deviation of the NN model's output layer. In each sampling period, the predictive control law is calculated online according to the deviation between the predicted and actual values. In addition, a stability analysis based on the Lyapunov method proves the asymptotic stability of the controller. Finally, the 3-D dynamic trajectory tracking performance of the AUV under different external disturbances is verified in MATLAB/Simulink, and the results show that the proposed controller is more efficient and robust than the standard model predictive controller (MPC) and the standard NN model predictive controller (NNPC). In this paper, by designing a new RBF-NN prediction model based on novel SSA parameter optimization and combining it with the dynamics constraints of the AUV, the prediction model is used as a state prediction controller for the AUV and is successfully applied to dynamic 3-D trajectory tracking.
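The forward pass of an RBF-NN prediction model of the kind described can be illustrated with a Gaussian-basis sketch; in the paper the centers, widths, and output weights are fitted offline (by ACSSA), whereas the values used here are arbitrary placeholders:

```python
import math

def rbf_predict(x, centers, widths, weights, bias=0.0):
    """One forward pass of a Gaussian radial-basis-function network."""
    def gaussian(x, c, s):
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        return math.exp(-d2 / (2.0 * s ** 2))
    return bias + sum(w * gaussian(x, c, s)
                      for w, c, s in zip(weights, centers, widths))

# Placeholder parameters: in the paper these would be fitted offline
# against recorded AUV state/control data rather than hand-picked.
y = rbf_predict([0.0, 0.0],
                centers=[[0.0, 0.0], [1.0, 1.0]],
                widths=[1.0, 1.0],
                weights=[2.0, 3.0])
```

An input at a basis center activates that unit fully (the Gaussian evaluates to 1), which is why center placement dominates the network's approximation quality and is worth optimizing.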
As one of the key technologies in battery management systems, accurate remaining useful life (RUL) prediction is critical to guaranteeing the reliability and safety of electrical equipment. However, the generalization and robustness of a single method are limited. A novel fusion data-driven RUL prediction method, CSSA-ELM-LSSVR, based on charging-discharging health feature extraction is proposed in this paper, which fuses the chaotic sparrow search algorithm (CSSA), the extreme learning machine (ELM), and least squares support vector regression (LSSVR). First, four health indicators (HIs) are extracted from the charging-discharging process, reflecting the battery degradation phenomenon from multiple perspectives. The Pearson correlation coefficient is then used to numerically analyze the correlation between the HIs and battery aging capacities. Second, the extracted HIs are used as inputs for ELM and LSSVR to predict the degradation trend of the battery, where CSSA is used for hyperparameter optimization in ELM. Finally, considering that CSSA-ELM can capture the general trend of the degradation curves while LSSVR can track detailed changes, a fusion framework based on CSSA-ELM and LSSVR is proposed for RUL prediction. Two weighting schemes, precision-based weighting (PW) and random forest regressor-based weighting (RFRW), are put forward to determine the weights of the CSSA-ELM and LSSVR algorithms. Two publicly available datasets from the National Aeronautics and Space Administration (NASA) and MIT are adopted to verify the feasibility and effectiveness of the proposed method. The results indicate that the proposed method with either weighting scheme has overall superior prediction performance for different kinds of batteries compared with CSSA-ELM, LSSVR, a convolutional neural network, and long short-term memory, and that the RFRW scheme has better overall performance. Specifically, the maximum root mean square error of the proposed method is 2.5126%, and the mean absolute percentage error
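A precision-based weighting (PW) scheme of the kind described can be sketched as follows; the abstract does not give the exact formula, so normalized reciprocal validation errors are assumed here:

```python
def precision_weights(errors):
    """Assumed PW scheme: weights proportional to reciprocal validation error."""
    inv = [1.0 / e for e in errors]
    s = sum(inv)
    return [v / s for v in inv]

def fuse(predictions, weights):
    """Weighted sum of per-model prediction series (one list per model)."""
    return [sum(w * p[i] for w, p in zip(weights, predictions))
            for i in range(len(predictions[0]))]

# A model with 3x the validation error of the other receives a quarter
# of the weight; the fused series blends both models accordingly.
w = precision_weights([1.0, 3.0])
fused = fuse([[4.0, 8.0], [0.0, 0.0]], w)
```

Here the two series would stand in for the CSSA-ELM and LSSVR outputs; the RFRW alternative replaces the fixed weights with a learned regressor.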
This study proposes a new method for crude oil futures price forecasting. The original crude oil futures price series is decomposed into a series of sub-sequences using the improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN) method, and the permutation entropy (PE) method is employed to reconstruct these sub-sequences into high-frequency, low-frequency, and trend components. The low-frequency and trend components are predicted using the kernel extreme learning machine (KELM) optimised by the chaotic sparrow search algorithm (CSSA). The high-frequency component, however, undergoes a secondary decomposition with the empirical mode decomposition (EMD) method, and the PE and CSSA-KELM models are employed again to obtain a linearly integrated prediction for it. Finally, the forecasting results of the high-frequency, low-frequency, and trend components are nonlinearly integrated by the CSSA-KELM model, yielding the final forecast of crude oil futures prices. To verify the effectiveness of the proposed model, we empirically forecast Brent and WTI crude oil futures prices. The empirical results show that the approach proposed in this study improves forecasting accuracy compared to other benchmark models and exhibits good robustness.
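The permutation entropy used above to group sub-sequences by complexity can be computed as below; the embedding order and delay are common defaults, not necessarily the study's settings:

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, delay=1, normalize=True):
    """Permutation entropy of a 1-D series (higher = more complex)."""
    n = len(series) - (order - 1) * delay
    patterns = Counter()
    for i in range(n):
        window = [series[i + j * delay] for j in range(order)]
        # Ordinal pattern: the argsort of the window values.
        patterns[tuple(sorted(range(order), key=window.__getitem__))] += 1
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(order)) if normalize else h

# A monotone series has a single ordinal pattern, hence zero entropy;
# an oscillating one spreads mass over several patterns.
pe_flat = permutation_entropy([1, 2, 3, 4, 5, 6])
pe_mixed = permutation_entropy([4, 1, 3, 2, 5])
```

Sub-sequences with similar PE values are merged into the high-frequency, low-frequency, and trend groups before prediction.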