Collapse of deep and narrow neural nets

Authors: Lu, Lu; Su, Yanhui; Karniadakis, George Em

Affiliations: Division of Applied Mathematics, Brown University, Providence, RI 02912, United States; College of Mathematics and Computer Science, Fuzhou University, Fuzhou, Fujian 350116, China

Published in: arXiv

Year: 2018

Subject: Chemical activation

Abstract: Recent theoretical work has demonstrated that deep neural networks have superior performance over shallow networks, but their training is more difficult, e.g., they suffer from the vanishing gradient problem. This problem can typically be resolved by the rectified linear unit (ReLU) activation. However, here we show that even with such activation, deep and narrow neural networks (NNs) will, with high probability, converge to erroneous mean or median states of the target function, depending on the loss function. Deep and narrow NNs are encountered in solving partial differential equations with high-order derivatives. We demonstrate this collapse of such NNs both numerically and theoretically, and provide estimates of the probability of collapse. We also construct a diagram of a safe region for designing NNs that avoid the collapse to erroneous states. Finally, we examine different ways of initialization and normalization that may avoid the collapse problem. Asymmetric initializations may reduce the probability of collapse but do not totally eliminate it. Copyright © 2018, The Authors. All rights reserved.
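
The collapse described in the abstract is straightforward to reproduce empirically. The following is a minimal sketch, not the authors' code: it assumes an illustrative 10-hidden-layer, width-2 ReLU network in PyTorch, trained with MSE loss on y = sin(x); the network shape, data, and hyperparameters are arbitrary choices for demonstration. When the network collapses, its output becomes nearly constant at the mean of the targets.

```python
# Minimal sketch (not the authors' code): a deep, narrow ReLU network
# trained with MSE loss often collapses to a constant near the mean of
# the targets. All sizes and hyperparameters here are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Target: y = sin(x) on [-pi, pi]; its mean over this interval is ~0.
x = torch.linspace(-torch.pi, torch.pi, 256).unsqueeze(1)
y = torch.sin(x)

# Deep and narrow: 10 hidden layers of width 2 with ReLU activations.
layers = [nn.Linear(1, 2), nn.ReLU()]
for _ in range(9):
    layers += [nn.Linear(2, 2), nn.ReLU()]
layers += [nn.Linear(2, 1)]
net = nn.Sequential(*layers)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()

# In a collapsed state the output is (almost) constant, so the residual
# MSE approaches the variance of y (i.e., the MSE against the mean).
print("final loss:", loss.item())
print("output std:", net(x).std().item())  # near 0 indicates collapse
```

Consistent with the abstract, swapping the MSE loss for an L1 loss (nn.functional.l1_loss) would make a collapsed network sit near the median of the targets instead of the mean.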
