A roulette wheel-based pruning method to simplify cumbersome deep neural networks

Authors: Chan, Kit Yan; Yiu, Ka Fai Cedric; Guo, Shan; Jiang, Huimin

Affiliations: School of Electrical Engineering, Computing and Mathematical Sciences, Curtin University, Bentley, Australia; Department of Applied Mathematics, The Hong Kong Polytechnic University, Hong Kong; School of Business, Macau University of Science and Technology, China

Publication: Neural Computing and Applications (Neural Comput. Appl.)

Year/Volume/Issue: 2024, Vol. 36, No. 22

Pages: 13915-13933

Subject Classification: 0710 [Science - Biology]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0802 [Engineering - Mechanical Engineering]; 0836 [Engineering - Bioengineering]; 0812 [Engineering - Computer Science and Technology (degrees awarded in Engineering or Science)]

Funding: The second author is supported by RGC Grant PolyU 15203923, the Projects of Strategic Importance (1-ZE1Y), Faculty of Science Dean's Reserve (1-ZVT5), and Projects (4-ZZPT, 1-WZ0E) of The Hong Kong Polytechnic University, Hong Kong

Keywords: Deep neural networks

Abstract: Deep neural networks (DNNs) have been applied in many pattern recognition and object detection applications. DNNs generally consist of millions or even billions of parameters. These demanding computational and storage requirements impede the deployment of DNNs on resource-limited devices such as mobile devices and micro-controllers. Simplification techniques such as pruning have commonly been used to slim DNN sizes. Pruning approaches generally quantify the importance of each component, such as a network weight. Weight values or weight gradients in training are commonly used as the importance metric: small weights are pruned and large weights are kept. However, small weights may be connected to significant weights that affect DNN outputs, so DNN accuracy can be degraded significantly after the pruning process. This paper proposes a roulette wheel-like pruning algorithm to simplify a trained DNN while preserving its accuracy. The proposed algorithm uses a roulette wheel operator to generate a branch of pruned DNNs. Similar to roulette wheel selection in genetic algorithms, small weights are more likely to be pruned but may be kept, and large weights are more likely to be kept but may be pruned. The slimmest DNN with the best accuracy is selected from the branch. The performance of the proposed pruning algorithm is evaluated on two deterministic datasets and four non-deterministic datasets. Experimental results show that, compared to several existing pruning approaches, the proposed algorithm generates simpler DNNs while maintaining DNN accuracy. © The Author(s) 2024.
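
The abstract's core idea is that pruning decisions are made probabilistically, with pressure proportional to weight magnitude, rather than by a hard threshold. Below is a minimal illustrative sketch of that idea in NumPy; the function name `roulette_prune`, the inverse-magnitude "fitness", and the sampling details are assumptions for illustration and are not taken from the paper itself.

```python
# Illustrative sketch (not the authors' implementation): roulette wheel-style
# probabilistic pruning of a weight matrix, assuming the chance of pruning a
# weight grows as its magnitude shrinks, analogous to fitness-proportional
# selection in genetic algorithms.
import numpy as np

def roulette_prune(weights: np.ndarray, prune_fraction: float, rng=None) -> np.ndarray:
    """Return a pruned copy of `weights` in which roughly `prune_fraction` of
    the entries are zeroed. Small-magnitude weights are more likely to be
    pruned, but large weights can occasionally be pruned as well."""
    rng = np.random.default_rng() if rng is None else rng
    flat = weights.ravel()
    n_prune = int(prune_fraction * flat.size)

    # "Fitness" for being pruned: inverse magnitude, so small weights occupy
    # a larger slice of the roulette wheel (assumed scheme).
    fitness = 1.0 / (np.abs(flat) + 1e-12)
    probs = fitness / fitness.sum()

    # Spin the wheel n_prune times without replacement.
    prune_idx = rng.choice(flat.size, size=n_prune, replace=False, p=probs)

    pruned = flat.copy()
    pruned[prune_idx] = 0.0
    return pruned.reshape(weights.shape)

# Example: generate a branch of candidate pruned weight matrices at different
# sparsities; in a full pipeline, the slimmest candidate whose validation
# accuracy remains acceptable would be kept.
W = np.random.randn(64, 128)
candidates = [roulette_prune(W, f) for f in (0.3, 0.5, 0.7)]
```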
