MultiAdapt: A Neural Network Adaptation For Pruning Filters Base on Multi-layers Group

Authors: Jie Yang, Zhihong Xie, Ping Li

Affiliations: School of Computer and Communication Engineering, Changsha University of Science and Technology, Changsha, Hunan 410114, China; Hunan Provincial Key Laboratory of Intelligent Processing of Big Data on Transportation, Changsha University of Science and Technology, Changsha, Hunan 410114, China

Published in: Journal of Physics: Conference Series

Year/Volume/Issue: 2021, Vol. 1873, No. 1

Subject classification: 07 [Science], 0702 [Science - Physics]

Abstract: Deep convolutional neural networks have been widely used in various AI applications. The most advanced networks are becoming deeper and wider, which has caused some large convolutional neural networks to exceed the size limits of servers or applications. Pruning algorithms provide a way to reduce the size of a neural network while keeping its accuracy as high as possible. Automatic progressive pruning is one of the most widely used approaches: in each iteration it prunes a certain layer of the network to increase sparsity while preserving accuracy as much as possible. In this article, we design a new automatic progressive pruning algorithm named MultiAdapt. MultiAdapt combines a combinatorial grouping method with a greedy algorithm. This multi-layer progressive pruning method greatly enlarges the search space of the greedy algorithm, making it possible to obtain a better pruned network. We use MultiAdapt to prune the large neural networks VGG-16 and ResNet. The experimental results show that MultiAdapt outperforms other mainstream methods in balancing model size and accuracy. For image classification on the ImageNet dataset, our method achieves 88.72% and 90.55% TOP-5 accuracy on VGG-16 and ResNet at 50% sparsity, while obtaining nearly a 2× reduction in parameters and floating-point operations, a larger reduction than recent popular methods.
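
The abstract describes the core loop: at each progressive-pruning step, candidate groups of layers are tried and the group whose pruning hurts accuracy least is kept, a greedy choice over multi-layer combinations rather than single layers. Below is a minimal, hypothetical PyTorch sketch of that idea; it is not the authors' implementation. The evaluate_accuracy callback, the L1 filter-scoring proxy, and mask-based zeroing are all assumptions made for illustration.

```python
# Hypothetical sketch of multi-layer group greedy filter pruning (not the
# authors' code). Assumes PyTorch; evaluate_accuracy is a user-supplied
# callback that measures validation accuracy of the current model.
import itertools
import torch
import torch.nn as nn

def l1_filter_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Score each output filter by the L1 norm of its weights (a common proxy)."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

def prune_filters(conv: nn.Conv2d, n_prune: int) -> None:
    """Zero out the n_prune lowest-scoring filters (mask-based pruning sketch)."""
    idx = torch.argsort(l1_filter_scores(conv))[:n_prune]
    with torch.no_grad():
        conv.weight[idx] = 0.0

def multiadapt_step(model, conv_layers, group_size, step, evaluate_accuracy):
    """One progressive-pruning iteration: try every group of `group_size`
    conv layers, prune `step` filters in each member, and greedily keep the
    group whose pruning costs the least accuracy."""
    best_acc, best_group = -1.0, None
    baseline = {id(c): c.weight.detach().clone() for c in conv_layers}
    for group in itertools.combinations(conv_layers, group_size):
        for conv in group:
            prune_filters(conv, step)
        acc = evaluate_accuracy(model)          # accuracy on a validation set
        if acc > best_acc:
            best_acc, best_group = acc, group
        with torch.no_grad():                   # restore before the next trial
            for conv in conv_layers:
                conv.weight.copy_(baseline[id(conv)])
    for conv in best_group:                     # commit the winning group
        prune_filters(conv, step)
    return best_acc
```

With group_size = 1 this degenerates to ordinary greedy layer-by-layer pruning; larger groups enlarge the search space at combinatorial cost, which is the trade-off the abstract credits for MultiAdapt finding better pruned networks.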
