Author affiliations: Univ Western Ontario, Dept Elect & Comp Engn, London ON N6A 5B9, Canada; Univ New Orleans, Dept Elect Engn, New Orleans LA 70148, USA
Publication: IEEE TRANSACTIONS ON NEURAL NETWORKS (IEEE Trans Neural Networks)
Year/Volume/Issue: 1999, Vol. 10, No. 4
Pages: 930-938
Funding: ASFC (94E53186); National Science Foundation, NSF (1996-99-RD-A-32, ECS-9409358, ECS-9734285); Office of Naval Research, ONR (N0014-97-1-0570); National Natural Science Foundation of China, NSFC (69274015)
Keywords: BP algorithm; extended Kalman filter; feedforward neural networks; forgetting factor; U-D factorization
Abstract: A fast learning algorithm for training multilayer feedforward neural networks (FNNs) using a fading memory extended Kalman filter (FMEKF) is presented first, along with a technique using a self-adjusting time-varying forgetting factor. A U-D factorization-based FMEKF is then proposed to further improve the learning rate and accuracy of the FNN. In comparison with the backpropagation (BP) and existing EKF-based learning algorithms, the proposed U-D factorization-based FMEKF algorithm provides much more accurate learning results using fewer hidden nodes. It has an improved convergence rate and better numerical stability (robustness). In addition, it is less sensitive to start-up parameters (e.g., initial weights and covariance matrix) and to the randomness in the observed data. It also has good generalization ability and needs less training time to achieve a specified learning accuracy. Simulation results in modeling and identification of nonlinear dynamic systems are given to show the effectiveness and efficiency of the proposed algorithm.
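To make the abstract's idea concrete, the following is a minimal sketch of EKF-based weight estimation for a single-hidden-layer feedforward network with a fading-memory forgetting factor, not the authors' algorithm: it omits the paper's U-D factorization and self-adjusting forgetting factor, and the network size, forgetting factor value, and noise covariances are illustrative assumptions.

```python
# Sketch: fading-memory EKF training of a one-hidden-layer FNN (assumed setup).
import numpy as np

def unpack(w, n_in, n_hid):
    """Split the flat weight vector into layer parameters (hypothetical layout)."""
    i = 0
    W1 = w[i:i + n_hid * n_in].reshape(n_hid, n_in); i += n_hid * n_in
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid]; i += n_hid
    b2 = w[i]
    return W1, b1, W2, b2

def forward(w, x, n_in, n_hid):
    """Network output and Jacobian of the output w.r.t. the weight vector."""
    W1, b1, W2, b2 = unpack(w, n_in, n_hid)
    h = np.tanh(W1 @ x + b1)            # hidden-layer activations
    y = W2 @ h + b2                     # scalar linear output
    dh = 1.0 - h**2                     # derivative of tanh
    # Analytic Jacobian dy/dw, ordered to match unpack()
    H = np.concatenate([np.outer(W2 * dh, x).ravel(),  # dy/dW1
                        W2 * dh,                       # dy/db1
                        h,                             # dy/dW2
                        np.array([1.0])])              # dy/db2
    return y, H

def fmekf_train(X, d, n_hid=8, lam=0.995, r=0.05, p0=100.0, seed=0):
    """Treat the weights as the EKF state; the forgetting factor lam < 1
    inflates the covariance P each step so recent samples carry more weight."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    n_w = n_hid * n_in + n_hid + n_hid + 1
    w = 0.1 * rng.standard_normal(n_w)  # initial weights
    P = p0 * np.eye(n_w)                # initial weight covariance
    for x, t in zip(X, d):
        y, H = forward(w, x, n_in, n_hid)
        S = H @ P @ H + r               # innovation variance (scalar output)
        K = (P @ H) / S                 # Kalman gain
        w = w + K * (t - y)             # correct weights with prediction error
        P = (P - np.outer(K, H @ P)) / lam  # fading-memory covariance update
    return w

# Usage: identify a simple nonlinear map y = sin(2*pi*x) from noisy samples.
rng = np.random.default_rng(1)
X = rng.uniform(-0.5, 0.5, size=(400, 1))
d = np.sin(2 * np.pi * X[:, 0]) + 0.02 * rng.standard_normal(400)
w = fmekf_train(X, d)
y_hat = np.array([forward(w, x, 1, 8)[0] for x in X])
print("training RMSE:", np.sqrt(np.mean((y_hat - d) ** 2)))
```

The paper's U-D factorization variant would propagate P as U D Uᵀ factors instead of updating the full covariance matrix directly, which is what gives the reported gains in numerical stability; the direct covariance update above is used here only for brevity.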