
H∞-learning of layered neural networks

Authors: Nishiyama, K.; Suzuki, K.

Affiliation: Iwate Univ, Fac Engn, Dept Comp & Informat Sci, Morioka, Iwate 020-8551, Japan

Published in: IEEE TRANSACTIONS ON NEURAL NETWORKS (IEEE Trans. Neural Networks)

Year/Volume/Issue: 2001, Vol. 12, No. 6

Pages: 1265-1277


Keywords: backpropagation; H-infinity filter; H-infinity-learning; Kalman filter; learning algorithm; neural network; robust estimation

Abstract: Although the backpropagation (BP) scheme is widely used as a learning algorithm for multilayered neural networks, the learning speed of the BP algorithm to obtain acceptable errors is unsatisfactory in spite of some improvements, such as the introduction of a momentum factor and an adaptive learning rate in the weight adjustment. To solve this problem, a fast learning algorithm based on the extended Kalman filter (EKF) is presented, and its computational complexity has fortunately been reduced by some simplifications. In general, however, the Kalman filtering algorithm is well known to be sensitive to the nature of the noise, which is generally assumed to be Gaussian. In addition, H-infinity theory suggests that the maximum energy gain of the Kalman algorithm from disturbances (initial state, system, and observation noises) to the estimation error has no upper bound; that is, the Kalman filtering algorithm has poor robustness to such disturbances. Therefore, the EKF-based learning algorithms should be further improved to enhance their robustness to variations in the initial values of link weights and thresholds, as well as to the nature of the noise. The aim of this paper is to propose H-infinity-learning as a novel learning rule and to derive new globally and locally optimized learning algorithms based on H-infinity-learning. Their learning behavior is analyzed from various points of view using computer simulations. The derived algorithms are also compared, in performance and computational cost, with the conventional BP and EKF learning algorithms.
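The bounded-energy-gain criterion described in the abstract is the defining property of an H-infinity filter. As a hedged illustration only (this is not the paper's H-infinity-learning algorithm for layered networks), the sketch below applies a standard discrete-time H-infinity filter recursion, specialized to a static parameter vector, to identify the weights of a single linear neuron. The function name `hinf_update` and all parameter choices (the attenuation level `gamma`, the noise weight `R`, and the identity choices for the state-transition and output-selection matrices) are illustrative assumptions.

```python
import numpy as np

def hinf_update(w, P, h, d, gamma, R=1.0):
    """One H-infinity filter step for a static parameter vector w.

    Standard discrete-time H-infinity recursion specialized to a
    random-walk-free model (F = I, Q = 0) with L = I (all weights are
    estimated). While the positivity condition noted below holds, the
    worst-case energy gain from disturbances to the estimation error
    is kept below gamma**2.
    h: input (regressor) vector, d: scalar target, gamma: attenuation level.
    """
    theta = 1.0 / gamma**2
    n = w.size
    H = h.reshape(1, -1)                       # 1 x n output map
    # Validity (positivity) condition: P^{-1} - theta*I + H^T H / R > 0.
    core = np.linalg.inv(np.eye(n) - theta * P + (H.T @ H) @ P / R)
    K = P @ core @ H.T / R                     # n x 1 filter gain
    w_new = w + (K * (d - float(H @ w))).ravel()
    P_new = P @ core                           # Riccati update with F = I, Q = 0
    return w_new, P_new

# Illustrative use: recover the weights of a noisy linear neuron.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -1.2, 0.8])
w, P = np.zeros(3), np.eye(3)
for _ in range(400):
    h = rng.standard_normal(3)
    d = float(h @ w_true) + 0.01 * rng.standard_normal()
    w, P = hinf_update(w, P, h, d, gamma=5.0)
```

A larger `gamma` relaxes the robustness constraint: as gamma grows (theta approaches 0), the `-theta * P` term vanishes and the recursion reduces to a recursive-least-squares / Kalman-like update, which matches the abstract's point that the Kalman filter imposes no upper bound on the disturbance-to-error energy gain.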
