arXiv

PRUNING DEEP CONVOLUTIONAL NEURAL NETWORK USING CONDITIONAL MUTUAL INFORMATION

Authors: Vu-Van, Tien; Thanh, Dat Du; Ho, Nguyen; Vu, Mai

Affiliations: Faculty of Computer Science and Engineering, Ho Chi Minh City University of Technology, Viet Nam; Computer Science Department, Loyola University Maryland, United States; Electrical and Computer Engineering Department, Tufts University, United States

Published in: arXiv

Year: 2024


Subject: Convolutional neural networks

Abstract: Convolutional Neural Networks (CNNs) achieve high performance in image classification tasks but are challenging to deploy on resource-limited hardware due to their large model sizes. To address this issue, we leverage Mutual Information, a metric that provides valuable insight into how deep learning models retain and process information by measuring the shared information between input features or output labels and network layers. In this study, we propose a structured filter-pruning approach for CNNs that identifies and selectively retains the most informative features in each layer. Our approach successively evaluates each layer by ranking the importance of its feature maps based on Conditional Mutual Information (CMI) values, computed using a matrix-based Rényi α-order entropy numerical method. We propose several formulations of CMI to capture correlation among features across different layers. We then develop various strategies to determine the cutoff point for CMI values to prune unimportant features. This approach allows parallel pruning in both forward and backward directions and significantly reduces model size while preserving accuracy. Tested on the VGG16 architecture with the CIFAR-10 dataset, the proposed method reduces the number of filters by more than a third, with only a 0.32% drop in test accuracy. © 2024, CC BY-NC-ND.
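The matrix-based Rényi α-order entropy the abstract refers to can be sketched as follows. This is a minimal illustration of the general estimator (entropy from the eigenvalues of a trace-normalized kernel Gram matrix, joint entropy via Hadamard products), not the authors' implementation; the RBF kernel, the width `sigma`, and the choice `alpha=1.01` are assumptions made here for concreteness.

```python
import numpy as np

def gram_matrix(x, sigma=1.0):
    # Trace-normalized RBF kernel Gram matrix over N samples (rows of x).
    sq = np.sum(x**2, axis=1, keepdims=True)
    d2 = sq + sq.T - 2.0 * x @ x.T
    K = np.exp(-d2 / (2.0 * sigma**2))
    return K / np.trace(K)

def renyi_entropy(A, alpha=1.01):
    # Matrix-based Renyi alpha-order entropy from the eigenvalues of A.
    eig = np.clip(np.linalg.eigvalsh(A), 0.0, None)
    return (1.0 / (1.0 - alpha)) * np.log2(np.sum(eig**alpha) + 1e-12)

def joint_entropy(A, B, alpha=1.01):
    # Joint entropy via the trace-normalized Hadamard product of Gram matrices.
    H = A * B
    return renyi_entropy(H / np.trace(H), alpha)

def mutual_information(x, y, alpha=1.01, sigma=1.0):
    # I(X;Y) = S(X) + S(Y) - S(X,Y)
    A, B = gram_matrix(x, sigma), gram_matrix(y, sigma)
    return renyi_entropy(A, alpha) + renyi_entropy(B, alpha) - joint_entropy(A, B, alpha)

def conditional_mi(x, y, z, alpha=1.01, sigma=1.0):
    # I(X;Y|Z) = S(X,Z) + S(Y,Z) - S(Z) - S(X,Y,Z), each term from Hadamard products.
    A, B, C = (gram_matrix(v, sigma) for v in (x, y, z))
    return (joint_entropy(A, C, alpha) + joint_entropy(B, C, alpha)
            - renyi_entropy(C, alpha) - joint_entropy(A * B, C, alpha))
```

In a pruning setting of the kind the abstract describes, `x` and `y` would hold flattened feature-map activations over a batch of inputs; feature maps whose CMI with the labels (given the already-retained maps) falls below a cutoff would be candidates for removal.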
