Radial Basis Function Neural Network (RBFNN) ensembles have long suffered from inefficient training, where incorrect parameter settings can be computationally expensive. This paper examines different evolutionary algorithms for training the Symbolic Radial Basis Function Neural Network (SRBFNN) through the behavioural integration of satisfiability logic programming. Inspired by evolutionary algorithms, which can iteratively find a near-optimal solution, different Evolutionary Algorithms (EAs) were designed to optimize the output weight of the SRBFNN that corresponds to the embedded logic-programming 2-Satisfiability representation (SRBFNN-2SAT). The SRBFNN objective function associated with satisfiability logic programming can be minimized by several algorithms, including the Genetic Algorithm (GA), Evolution Strategy Algorithm (ES), Differential Evolution Algorithm (DE), and Evolutionary Programming Algorithm (EP). Each of these methods is presented as a sequence of steps in flowchart form, allowing straightforward implementation in any programming language. Using SRBFNN-2SAT, a training method based on these algorithms is presented, and the algorithms, implemented in Microsoft Visual C++ software, are compared using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). Based on the results, the EP algorithm achieved a higher training rate and a simpler structure than the other algorithms. It has been confirmed that the EP algorithm is effective in training and obtaining the best output weight, accompanied by the smallest iteration error, which minimizes the objective function of SRBFNN-2SAT.
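As an illustration only, the following minimal C++ sketch shows how an EP-style loop could search for SRBFNN output weights that minimize a sum-of-squared-error objective. The function names, population size, mutation step, and the form of the objective are assumptions for exposition, not the paper's exact implementation.

// Minimal sketch (assumptions throughout): Evolutionary Programming loop that
// evolves RBFNN output weights toward a smaller squared-error objective.
#include <algorithm>
#include <random>
#include <vector>

using Weights = std::vector<double>;

// Hypothetical objective: squared error between the network output and targets.
// 'hidden' holds the hidden-layer (radial basis) activations for each sample.
double objective(const Weights& w,
                 const std::vector<Weights>& hidden,
                 const std::vector<double>& target) {
    double err = 0.0;
    for (size_t i = 0; i < hidden.size(); ++i) {
        double out = 0.0;
        for (size_t j = 0; j < w.size(); ++j) out += w[j] * hidden[i][j];
        double d = out - target[i];
        err += d * d;
    }
    return err;
}

Weights evolveOutputWeights(const std::vector<Weights>& hidden,
                            const std::vector<double>& target,
                            size_t dim, int generations = 200) {
    const int POP_SIZE = 30;   // assumed population size
    const double SIGMA = 0.1;  // assumed Gaussian mutation step
    std::mt19937 rng(42);
    std::normal_distribution<double> gauss(0.0, SIGMA);
    std::uniform_real_distribution<double> init(-1.0, 1.0);

    // Random initial population of candidate weight vectors.
    std::vector<Weights> pop(POP_SIZE, Weights(dim));
    for (auto& w : pop)
        for (auto& x : w) x = init(rng);

    for (int g = 0; g < generations; ++g) {
        // EP uses mutation only: each parent produces one perturbed child.
        std::vector<Weights> children = pop;
        for (auto& c : children)
            for (auto& x : c) x += gauss(rng);

        // (mu + mu) truncation selection: keep the best POP_SIZE candidates.
        pop.insert(pop.end(), children.begin(), children.end());
        std::sort(pop.begin(), pop.end(),
                  [&](const Weights& a, const Weights& b) {
                      return objective(a, hidden, target) <
                             objective(b, hidden, target);
                  });
        pop.resize(POP_SIZE);
    }
    return pop.front();  // best weight vector found
}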
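For reference, a compact sketch of the error metrics used in the comparison (RMSE, MAPE, MBE, MARE) is given below. The function and struct names are assumptions, and the paper's exact scaling conventions may differ; targets are assumed to be nonzero.

// Minimal sketch of standard error metrics over predicted vs. actual values.
#include <cmath>
#include <cstddef>
#include <vector>

struct Metrics { double rmse, mape, mbe, mare; };

Metrics evaluate(const std::vector<double>& predicted,
                 const std::vector<double>& actual) {
    const std::size_t n = predicted.size();
    double sq = 0.0, absPct = 0.0, bias = 0.0, absRel = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        double e = predicted[i] - actual[i];
        sq     += e * e;                                 // for RMSE
        absPct += std::fabs(e / actual[i]);              // for MAPE (scaled to % below)
        bias   += e;                                     // for MBE
        absRel += std::fabs(e) / std::fabs(actual[i]);   // for MARE
    }
    Metrics m;
    m.rmse = std::sqrt(sq / n);
    m.mape = 100.0 * absPct / n;
    m.mbe  = bias / n;
    m.mare = absRel / n;
    return m;
}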