Scenario-based model predictive control (MPC) methods introduce recourse into optimal control and can thus reduce the conservativeness inherent to open-loop robust MPC. However, the uncertainty scenarios are often generated offline using worst-case uncertainty bounds quantified a priori, limiting the potential gains in control performance. This paper presents a learning-based multistage MPC (msMPC) for systems with hard-to-model dynamics and time-varying plant-model mismatch. Gaussian Processes (GP) are used to learn state- and input-dependent plant-model mismatch in real-time and accordingly adapt the scenario tree online. Due to the increased computational complexity associated with incorporating the GP predictions into the optimal control problem, the learning-based msMPC (LB-msMPC) law is approximated by a deep neural network (DNN) that is cheap-to-evaluate online and has a small memory footprint, which makes it suitable for embedded applications. In addition, we present a novel algorithm for training the DNN-based controller that uses a GP description of the plant-model mismatch to generate closed-loop simulation data, which ensures the LB-msMPC law is evaluated in regions of the state space most relevant to closed-loop operation. The proposed LB-msMPC strategy is demonstrated on a cold atmospheric plasma jet with applications in (bio)materials processing. The simulation results indicate the promise of the approximate LB-msMPC strategy for control of hard-to-model systems with fast dynamics on millisecond timescales. (C) 2020 Elsevier Ltd. All rights reserved.
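To make the mismatch-learning step concrete, here is a minimal sketch, assuming a toy linear nominal model and scikit-learn GPs: the plant-model mismatch w(x, u) is regressed from logged data and its mean and standard deviation are used to branch a scenario tree. It is an illustration only, not the paper's LB-msMPC implementation, and every model, name, and number in it is an assumption.

```python
# Minimal sketch, not the paper's LB-msMPC implementation: learn the state/input-
# dependent plant-model mismatch w(x, u) with GPs and use mean +/- k*std to
# branch a scenario tree. Nominal model and data below are toy assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def f_nominal(x, u):
    """Hypothetical nominal model x+ = A x + B u (illustrative only)."""
    A = np.array([[0.9, 0.1], [0.0, 0.8]])
    B = np.array([[0.0], [0.5]])
    return A @ x + B @ u

# Synthetic closed-loop data: the "plant" deviates from the nominal model
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
U = rng.normal(size=(200, 1))
F = np.array([f_nominal(x, u) for x, u in zip(X, U)])
X_next = F + 0.1 * np.sin(X) + 0.02 * rng.normal(size=(200, 2))  # "measured" plant

# One GP per state dimension, trained on the mismatch w = x_measured - f_nominal(x, u)
Z = np.hstack([X, U])
gps = [GaussianProcessRegressor(RBF() + WhiteKernel(), normalize_y=True)
       .fit(Z, (X_next - F)[:, i]) for i in range(2)]

def scenario_branches(x, u, n_sigma=2.0):
    """Nominal, upper, and lower successor states used to branch the scenario tree."""
    z = np.hstack([x, u]).reshape(1, -1)
    mean = np.array([gp.predict(z)[0] for gp in gps])
    std = np.array([gp.predict(z, return_std=True)[1][0] for gp in gps])
    xn = f_nominal(x, u) + mean
    return xn, xn + n_sigma * std, xn - n_sigma * std

print(scenario_branches(np.array([0.5, -0.2]), np.array([0.1])))
```

In the actual LB-msMPC scheme the GP is updated in real time and the branches enter a multistage optimal control problem; here they are simply returned for inspection.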
Accurate load forecasting is essential for power system stability and grid dispatch optimization. However, this task is challenging due to the inherent instability and volatility of the load sequence. To address this problem, this paper proposes a novel load forecasting model that integrates periodicity detection, the variable t-distribution, and the dual attention mechanism. Periodicity detection has been incorporated into the self-attention mechanism for the first time, identifying the most significant period in the raw load sequence. Subsequently, the raw load sequence undergoes processing using the empirical wavelet transform, resulting in a series of subsequences. A feature attention mechanism is then employed to extract relevant input features. Furthermore, a novel variable t-distribution distance matrix is introduced into the temporal self-attention mechanism, enhancing the influence of data at identical or nearby positions in other periods based on the length of the most significant period. This modification improves the capacity of the vanilla self-attention mechanism to effectively model the relationship between data at varying distances. The hyperparameters of the variable t-distribution are obtained through Bayesian hyperparameter optimization. Empirical evaluations on two datasets with distinct meteorological and load features show that the proposed model outperforms baseline models across all metrics. © 2017 Elsevier Inc. All rights reserved.
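As an illustration of the periodicity-detection idea, the sketch below estimates the most significant period of a load series from its FFT amplitude spectrum; the paper's exact detection method, data, and settings are not given here, so the function and the example series are assumptions.

```python
# Illustrative sketch only: find the dominant period of a load series via the
# FFT amplitude spectrum. The detection method and data are assumptions, not
# necessarily those used in the paper.
import numpy as np

def dominant_period(series: np.ndarray) -> int:
    """Return the lag (in samples) of the strongest periodic component."""
    series = series - series.mean()
    spectrum = np.abs(np.fft.rfft(series))
    spectrum[0] = 0.0                       # drop the DC component
    freqs = np.fft.rfftfreq(len(series))
    k = np.argmax(spectrum)                 # strongest frequency bin
    return int(round(1.0 / freqs[k]))

# Example: hourly load with a daily (24-step) cycle plus noise
t = np.arange(24 * 30)
load = 100 + 10 * np.sin(2 * np.pi * t / 24) + np.random.default_rng(1).normal(0, 1, t.size)
print(dominant_period(load))                # prints 24 for this synthetic series
```

The detected period length is what the paper then uses to decide which positions in other cycles should receive extra attention weight.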
Landslides are widely distributed worldwide and often result in tremendous casualties and economic losses, especially in the Loess Plateau of China. Taking Wuqi County in the hinterland of the Loess Plateau as the research area, Bayesian hyperparameter optimization is used to tune random forest and extreme gradient boosting decision tree models for landslide susceptibility mapping, and the two optimized models are compared. In addition, 14 landslide influencing factors are selected, and 734 landslides are compiled from field investigations and literature reports. The landslides were randomly divided into training data (70%) and validation data (30%). The hyperparameters of the random forest and extreme gradient boosting decision tree models were optimized using a Bayesian algorithm, and the optimal hyperparameters were then used for landslide susceptibility mapping. Both models were evaluated and compared using the receiver operating characteristic curve and confusion matrix. The results show that the validation AUCs of the Bayesian-optimized random forest and extreme gradient boosting decision tree models are 0.88 and 0.86, respectively, improvements of 4% and 3%, indicating that the prediction performance of both models has improved. However, the random forest model has a higher predictive ability than the extreme gradient boosting decision tree model. Thus, hyperparameter optimization is of great significance for improving model prediction accuracy, and the optimized model can generate a high-quality landslide susceptibility map.
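A minimal sketch of the tune-then-map workflow described above, using placeholder features and labels and scikit-optimize's BayesSearchCV as the Bayesian optimizer; the search space, data shapes, and settings are illustrative rather than those of the study.

```python
# Illustrative sketch: Bayesian hyperparameter optimization of a random forest
# susceptibility classifier. Data, search space, and settings are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from skopt import BayesSearchCV
from skopt.space import Integer

# Placeholder data standing in for the 14 influencing factors and mapped landslides
rng = np.random.default_rng(42)
X = rng.normal(size=(1468, 14))          # landslide + non-landslide samples (hypothetical)
y = rng.integers(0, 2, size=1468)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Bayesian optimization over an assumed random-forest search space
search = BayesSearchCV(
    RandomForestClassifier(random_state=0),
    {
        "n_estimators": Integer(100, 500),
        "max_depth": Integer(3, 30),
        "min_samples_split": Integer(2, 20),
    },
    n_iter=20,
    cv=5,
    scoring="roc_auc",
    random_state=0,
)
search.fit(X_train, y_train)
print(search.best_params_)
print("validation AUC:", search.score(X_test, y_test))
```

The tuned model's predicted probabilities over a grid of map cells would then be classified into susceptibility levels to produce the final map.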
The finger vein recognition system uses blood vessels inside the finger of an individual for identity verification. The public is in favor of a finger vein recognition system over conventional passwords or ID cards, as the biometric technology is harder to forge, misplace, or share. In this study, the histogram of oriented gradients (HOG) features, which are robust against changes in illumination and position, are extracted from the finger vein for personal recognition. To further increase the amount of information that can be used for recognition, different instances of the finger vein, namely the index, middle, and ring fingers, are combined to form a multi-instance finger vein representation. This fusion approach is preferred since it can be performed without requiring additional sensors or feature extractors. To combine different instances of the finger vein effectively, score level fusion is adopted to allow greater compatibility among the wide range of matches. Towards this end, two methods are proposed: Bayesian-optimized support vector machine (SVM) score fusion (BSSF) and Bayesian-optimized SVM-based fusion (BSBF). The fusion results are incrementally improved by optimizing the hyperparameters of the HOG feature, the SVM matcher, and the weighted sum of score level fusion using the Bayesian optimization approach. This is a knowledge-based approach that takes previous optimization trials into account when determining the next one, making it an efficient optimizer. By using stratified cross-validation in the training process, the proposed method is able to achieve the lowest EER of 0.48% and 0.22% for the SDUMLA-HMT dataset and UTFVP dataset, respectively.
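The sketch below shows the general shape of such a pipeline, assuming placeholder images: HOG descriptors per finger instance, one SVM matcher each, and a weighted-sum score-level fusion. It is not the proposed BSSF/BSBF implementation; the Bayesian tuning of the HOG, SVM, and fusion-weight hyperparameters is omitted, with fixed weights standing in for the optimized ones.

```python
# Illustrative multi-instance pipeline: HOG features per finger, one SVM matcher
# per finger, weighted-sum score fusion. All data and weights are placeholders.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_features(images):
    """Extract HOG descriptors (robust to illumination/position) from grayscale images."""
    return np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)) for img in images])

# Placeholder data: 6 images per subject per finger, 10 subjects, 64x128 pixels
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(10), 6)
fingers = {name: rng.random((60, 64, 128)) for name in ("index", "middle", "ring")}

# One probability-calibrated SVM matcher per finger instance
matchers = {name: SVC(kernel="rbf", probability=True).fit(hog_features(imgs), labels)
            for name, imgs in fingers.items()}

def fused_scores(probe_imgs, weights=(0.4, 0.35, 0.25)):
    """Weighted-sum score-level fusion across the three finger instances."""
    scores = [w * matchers[name].predict_proba(hog_features(probe_imgs[name]))
              for w, name in zip(weights, matchers)]
    return sum(scores)      # rows: probe samples, columns: fused per-identity scores

probes = {name: imgs[:3] for name, imgs in fingers.items()}   # toy probe set
print(fused_scores(probes).argmax(axis=1))                    # predicted identities
```

In the paper, the fusion weights (and the HOG/SVM hyperparameters) are the quantities tuned by Bayesian optimization rather than fixed by hand as above.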
Surfactant-enhanced aquifer remediation (SEAR) is an appropriate method for dense non-aqueous phase liquid (DNAPL) remediation. However, due to the high cost of the chemicals used, choosing a suitable well pattern and the optimal pumping scenario is necessary. In this study, the performance of the SEAR method for Regular (convergent) and Inverted (divergent) patterns with different numbers of wells has been evaluated. The performance of 5 categories of patterns, comprising 35 different sub-patterns, was evaluated in a PCE-contaminated aquifer. The results show that uniform and appropriate surfactant distribution in the contaminated area significantly improves remediation performance. The distribution of surfactant in Regular patterns was better than in Inverted patterns, and Regular patterns had lower remediation duration and cost. The best patterns that achieved a 95% removal rate at the lowest cost were Regular. To find the optimal pumping scenario, a simulation-optimization model based on a Gaussian process regressor (GPR) surrogate has been used to reduce the optimization model's computational burden. Nine different kernels were applied and evaluated to find the best GPR. Also, the Bayesian hyperparameter optimization (BHO) method was used to optimize the surrogate model, and its performance was compared with the conventional grid search method. The results showed that the Chi² kernel and the BHO method are the best choices. A BHO-optimized multi-kernel Gaussian process (BHOMK-GP) model has also been developed, and its performance has been compared with single-kernel GPR surrogate models. The BHOMK-GP model's accuracy was significantly higher than that of the single-kernel GPR models. The test and cross-validation RMSE of the BHOMK-GP model were 0.0385 and 0.0435, respectively. Finally, the optimal remediation scenario has been obtained by substituting the BHOMK-GP surrogate model for the SEAR simulation model. The cost of remediat...
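For illustration only, the sketch below fits a multi-kernel GPR surrogate to samples from a stand-in "expensive simulation" and tunes its kernel hyperparameters with scikit-optimize's gp_minimize; the objective, kernels, and search ranges are assumptions, not the BHOMK-GP setup of the paper.

```python
# Illustrative sketch: a multi-kernel GPR surrogate for an expensive simulator,
# with its kernel hyperparameters tuned by Bayesian optimization (skopt).
# The simulator stand-in, kernels, and ranges are hypothetical.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern, WhiteKernel
from sklearn.model_selection import cross_val_score
from skopt import gp_minimize
from skopt.space import Real

def expensive_simulation(pumping):
    """Stand-in for the SEAR simulator: maps pumping rates at 4 wells to a cost."""
    return np.sum((pumping - 0.3) ** 2, axis=-1)

rng = np.random.default_rng(1)
X = rng.random((80, 4))                     # sampled pumping scenarios (hypothetical)
y = expensive_simulation(X)

def surrogate_cv_error(params):
    """Cross-validated RMSE of a multi-kernel (RBF + Matern) GPR surrogate."""
    l_rbf, l_mat, noise = params
    kernel = RBF(l_rbf) + Matern(l_mat, nu=1.5) + WhiteKernel(noise)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    return -cross_val_score(gpr, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()

# Bayesian optimization of the surrogate's kernel hyperparameters
result = gp_minimize(
    surrogate_cv_error,
    [Real(0.1, 10.0), Real(0.1, 10.0), Real(1e-4, 1e-1, prior="log-uniform")],
    n_calls=25,
    random_state=0,
)
print("best kernel hyperparameters:", result.x, "CV RMSE:", result.fun)
```

Once tuned, the surrogate replaces the simulator inside the pumping-scenario optimization loop, which is where the computational savings reported in the abstract come from.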