Introduction of orthonormal transform into neural filter for accelerating convergence speed

Authors: Nakanishi, I.; Itoh, Y.; Fukui, Y.

Affiliations: Tottori Univ, Fac Educ & Reg Sci, Tottori 680-8551, Japan; Tottori Univ, Fac Engn, Tottori 680-8551, Japan

Published in: IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS, COMMUNICATIONS AND COMPUTER SCIENCES

Year/Volume/Issue: 2000, Vol. E83-A, No. 2

Pages: 367-370


Subject categories: 08 [Engineering]; 0808 [Engineering - Electrical Engineering]; 0809 [Engineering - Electronic Science and Technology]; 0812 [Engineering - Computer Science and Technology]

Keywords: transform domain neural filter; neural filter; multilayer neural networks; back-propagation algorithm; normalized step size

Abstract: As a nonlinear adaptive filter, the neural filter is used to process nonlinear signals and/or systems. However, the neural filter requires a large number of iterations to converge. This letter presents a new structure for the multilayer neural filter in which the orthonormal transform is introduced into all inter-layers to accelerate convergence. The proposed structure is called the transform domain neural filter (TDNF) for convenience. The weights are basically updated by the Back-Propagation (BP) algorithm, but the algorithm must be modified since the error back-propagates through the orthonormal transform. Moreover, a variable step size, normalized by the transformed signal power, is introduced into the BP algorithm. Computer simulations confirm that the introduction of the orthonormal transform is effective for speeding up convergence of the neural filter.
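The abstract does not give the TDNF update equations, but the core idea (an orthonormal transform applied to the layer input, with each weight's step size normalized by the running power of the corresponding transformed component) has a well-known linear analogue in transform-domain adaptive filtering. The sketch below illustrates that analogue only: a single linear layer adapted in the DCT domain with per-bin power normalization. All names (`dct_matrix`, `mu`, `beta`, the unknown system `h`) are hypothetical choices for the demonstration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dct_matrix(n):
    # Orthonormal DCT-II matrix: satisfies T @ T.T == I.
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    T = np.cos(np.pi * k * (2 * i + 1) / (2 * n))
    T[0] *= np.sqrt(1.0 / n)
    T[1:] *= np.sqrt(2.0 / n)
    return T

N = 16
T = dct_matrix(N)
assert np.allclose(T @ T.T, np.eye(N))  # orthonormality check

h = rng.standard_normal(N)   # unknown system to identify (illustrative)
w = np.zeros(N)              # adaptive weights in the transform domain
p = np.full(N, 1e-2)         # running power estimate per transform bin
mu, beta, eps = 0.05, 0.95, 1e-8

x = rng.standard_normal(4000)
errs = []
for n in range(N, len(x)):
    u = x[n - N:n][::-1]     # input tap vector
    v = T @ u                # orthonormal transform of the input
    d = h @ u                # desired response
    e = d - w @ v            # output error
    p = beta * p + (1 - beta) * v**2   # update per-bin signal power
    w += mu * e * v / (p + eps)        # power-normalized step size
    errs.append(e**2)

early = float(np.mean(errs[:200]))
late = float(np.mean(errs[-200:]))
```

Because the transform is orthonormal, normalizing each bin by its own power whitens the effective input and equalizes the modes of convergence; the paper's contribution is carrying this idea into every inter-layer of a multilayer neural filter trained by BP, which this linear sketch does not attempt.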
