Deep neural networks (DNN) have gained remarkable success on many rainfall prediction tasks in recent years. However, the performance of a DNN relies heavily on its hyperparameter settings. Designing a DNN with the best performance requires extensive expertise in both DNNs and the problem domain under investigation, a requirement many DNN users do not meet. It is therefore difficult for users without such expertise to design optimal DNN architectures for the rainfall prediction problems they need to solve. In this paper, we propose a novel automatic hyperparameter optimization method for DNNs based on an improved Gene Expression Programming. The proposed method can automatically optimize the hyperparameters of a DNN for precipitation modeling and prediction. Extensive experiments are conducted on three real precipitation datasets to verify the performance of the proposed algorithm in terms of four metrics: MAE, MSE, RMSE, and R-Squared. The results show that: 1) the DNN optimized by the proposed method outperforms existing precipitation prediction methods, including Multiple Linear Regression (MLR), Back Propagation (BP), Support Vector Machine (SVM), Random Forest (RF), and DNN; 2) the proposed DNN hyperparameter optimization method outperforms state-of-the-art DNN hyperparameter optimization methods, including Genetic Algorithm, Bayes Search, Grid Search, Randomized Search, and Quasi Random Search.
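To make the idea of evolutionary hyperparameter optimization concrete, the sketch below runs a generic mutate-and-select loop over a small regression network. It is an illustrative assumption only: it uses scikit-learn's MLPRegressor, a synthetic stand-in dataset, and a hand-picked search space, not the paper's improved Gene Expression Programming encoding or its precipitation data.

```python
# Illustrative evolutionary hyperparameter search for a small regression DNN.
# Generic GA-style loop, NOT the paper's improved Gene Expression Programming.
import random
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = random.Random(0)

# Hypothetical search space: hidden-layer sizes, learning rate, L2 penalty.
SPACE = {
    "hidden": [(32,), (64,), (64, 32), (128, 64)],
    "lr": [1e-3, 3e-3, 1e-2],
    "alpha": [1e-5, 1e-4, 1e-3],
}

def random_genome():
    return {k: rng.choice(v) for k, v in SPACE.items()}

def mutate(genome):
    child = dict(genome)
    key = rng.choice(list(SPACE))          # re-sample one hyperparameter
    child[key] = rng.choice(SPACE[key])
    return child

def fitness(genome, X_tr, y_tr, X_va, y_va):
    model = MLPRegressor(hidden_layer_sizes=genome["hidden"],
                         learning_rate_init=genome["lr"],
                         alpha=genome["alpha"],
                         max_iter=300, random_state=0)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_va, model.predict(X_va))  # lower is better

# Synthetic stand-in for a precipitation dataset.
X = np.random.RandomState(0).rand(500, 8)
y = X @ np.arange(1, 9) + 0.1 * np.random.RandomState(1).randn(500)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=0)

population = [random_genome() for _ in range(6)]
best = None
for generation in range(5):
    scored = sorted(population, key=lambda g: fitness(g, X_tr, y_tr, X_va, y_va))
    best = scored[0]                       # best genome of this generation
    population = scored[:3] + [mutate(rng.choice(scored[:3])) for _ in range(3)]

print("best hyperparameters:", best)
```

A real run of the paper's method would replace the synthetic data with the precipitation datasets, use the gene-expression encoding in place of the simple mutation step, and evaluate candidates with MAE, MSE, RMSE, and R-Squared as in the reported experiments.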
ISBN: (Print) 9781728109336
The paper presents the results of research on a neural architecture search (NAS) algorithm. We utilized the hill climbing algorithm to search for well-performing structures of a deep convolutional neural network. Moreover, we used function-preserving transformations, which enabled the algorithm to operate effectively in a short period of time. The network obtained with NAS was validated on a skin lesion classification problem. We compared the parameters and performance of the automatically generated neural structure with manually selected architectures reported by the authors in previous papers. The obtained structure achieved results comparable to hand-designed networks, but with far fewer parameters than the manually crafted architectures.
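As a rough illustration of why function-preserving transformations help hill-climbing NAS, the following numpy sketch widens a hidden layer Net2WiderNet-style so that the grown child network computes exactly the same function as its parent. The layer sizes and the widen_hidden helper are hypothetical and assumed for this example; they are not taken from the paper's implementation.

```python
# Minimal sketch of a function-preserving "widen" step for a 2-layer MLP.
# A hill-climbing NAS loop can apply such steps so each child starts from
# the parent's accuracy instead of being trained from scratch.
import numpy as np

def forward(x, W1, b1, W2, b2):
    """Two-layer MLP: ReLU hidden layer followed by a linear output layer."""
    h = np.maximum(x @ W1 + b1, 0.0)
    return h @ W2 + b2

def widen_hidden(W1, b1, W2, new_units=1, seed=0):
    """Grow the hidden layer by copying `new_units` existing neurons and
    splitting their outgoing weights in half, so the output is unchanged."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(W1.shape[1], size=new_units, replace=False)  # neurons to copy
    W1w = np.concatenate([W1, W1[:, idx]], axis=1)     # duplicated incoming weights
    b1w = np.concatenate([b1, b1[idx]])
    W2w = W2.copy()
    W2w[idx] *= 0.5                                     # original keeps half the outgoing weight
    W2w = np.concatenate([W2w, W2[idx] * 0.5], axis=0)  # copy carries the other half
    return W1w, b1w, W2w

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 3))
W1, b1 = rng.normal(size=(3, 5)), rng.normal(size=5)
W2, b2 = rng.normal(size=(5, 2)), rng.normal(size=2)

W1w, b1w, W2w = widen_hidden(W1, b1, W2, new_units=2)
assert np.allclose(forward(x, W1, b1, W2, b2), forward(x, W1w, b1w, W2w, b2))
print("widened network computes the same function as its parent")
```

In a hill-climbing search, each widened (or similarly deepened) child would then be trained briefly and kept only if it improves validation performance, which is what keeps the overall search time short compared with training every candidate architecture from scratch.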