
Enhancing performance of the backpropagation algorithm via sparse response regularization


Authors: Zhang, Jiangshe; Ji, Nannan; Liu, Junmin; Pan, Jiyuan; Meng, Deyu

Affiliations: School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an 710049, People's Republic of China; 20th Institute, China Electronics Technology Group Corporation, Xi'an 710068, People's Republic of China

Publication: NEUROCOMPUTING

Year / Volume / Issue: 2015, Vol. 153

Pages: 20-40


Subject classification: 08 [Engineering]; 0812 [Engineering - Computer Science and Technology (degrees may be awarded in engineering or science)]

Funding: National Basic Research Program of China (973 Program) [2013CB329404]; National Natural Science Foundation of China [91230101, 61075006, 11131006]

Keywords: Feed-forward artificial neural network; Backpropagation; Human nervous system; Regularization

Abstract: The backpropagation (BP) algorithm is the most commonly used training strategy for feed-forward artificial neural networks (FFANNs). The BP algorithm, however, tends to suffer from a low convergence rate, high energy and poor generalization capability of the trained FFANN. In this paper, motivated by the sparsity of human neuron responses, we introduce a new sparse-response BP (SRBP) algorithm that improves the capacity of an FFANN by enforcing sparsity on its hidden units through a supplemental L1 term imposed on them. The FFANN model learned by our algorithm is closely aligned with real human neural behavior, and its mechanism thus reflects two key properties of the human nervous system, i.e., sparse representation and architectural depth. Experiments on several datasets demonstrate that SRBP performs well in terms of convergence rate, energy saving and generalization capability.
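The abstract describes SRBP as standard backpropagation with a supplemental L1 term imposed on the hidden-unit activations to encourage sparse responses. The snippet below is a minimal illustrative sketch of that idea in NumPy, not the authors' implementation: the XOR toy data, network sizes, learning rate lr and penalty weight lam are all assumed here for demonstration.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data (XOR), chosen only for demonstration.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hidden, n_out = 2, 8, 1
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, n_out)); b2 = np.zeros(n_out)

lr, lam = 0.5, 1e-3  # learning rate and L1 penalty weight (assumed values)

for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)       # hidden-unit activations
    out = sigmoid(h @ W2 + b2)     # network output

    # Objective: squared error plus an L1 penalty on hidden activations.
    loss = 0.5 * np.sum((out - y) ** 2) + lam * np.sum(np.abs(h))

    # Backward pass; the lam * sign(h) term is the gradient of the L1 penalty.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T + lam * np.sign(h)) * h * (1 - h)

    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print("final loss:", round(float(loss), 4))
print("predictions:", out.ravel().round(3))

With lam set to 0 this reduces to plain BP; increasing lam pushes more hidden activations toward zero, which is the sparse-response effect the abstract attributes to SRBP.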
