Cite this article: PENG Dong-liang, WANG Tian-xing. Pruning algorithm based on GoogLeNet model[J]. Control and Decision, 2019, 34(6): 1259-1264.
DOI:10.13195/j.kzyjc.2017.1556
CLC number: TP3
Funding: National Natural Science Foundation of China (61703128); Near-Ground Detection Key Laboratory Foundation (614241404030717).
Pruning algorithm based on GoogLeNet model
PENG Dong-liang, WANG Tian-xing
(College of Automation, Hangzhou Dianzi University, Hangzhou 310018, China)
Abstract:
GoogLeNet contains multiple parallel convolutional and pooling layers, which make the network highly expressive but also lead to parameter redundancy and a heavy computational load. The fundamental remedy for this problem is to sparsify the network. The pruning algorithm consists of three steps: training the network, pruning low-weight connections, and retraining the network, so that only the strongly correlated connections in the convolutional and fully connected layers are retained. This simplifies the network structure, reduces the number of parameters, and yields an approximate network model without impairing the accuracy of the network's posterior probability estimates, thereby achieving compression. Because traditional computation schemes are ill-suited to non-uniform sparse data structures, a threshold pruning algorithm is proposed: by setting a suitable threshold, the roughly 10.4 million parameters of the original GoogLeNet model are reduced to about 650,000, a compression factor of approximately 16. The accuracy of the original network drops after pruning, but after a small number of retraining iterations it becomes comparable to that of the original model, which achieves the intended compression and demonstrates the effectiveness of the threshold pruning algorithm in improving the GoogLeNet training process.
Key words:  pruning algorithm  GoogLeNet  Inception module  weight threshold  parameter redundancy  overfitting
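The core pruning step described in the abstract, zeroing out connections whose weight magnitude falls below a threshold and keeping only the strong connections, can be sketched for a single layer as follows. This is a minimal illustration, not the paper's implementation: the layer shape, the random weights, and the threshold value 1.5 are assumptions chosen for demonstration. In practice the mask would be kept fixed during the retraining iterations the abstract mentions, so that pruned connections stay at zero.

```python
import numpy as np

def threshold_prune(weights, threshold):
    """Zero out connections whose magnitude is below the threshold,
    retaining only the strongly weighted connections."""
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# Hypothetical example: a small dense weight matrix standing in for
# one convolutional or fully connected layer.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=(64, 64))

pruned, mask = threshold_prune(w, threshold=1.5)

# Ratio of total parameters to surviving (nonzero) parameters,
# analogous to the ~16x compression reported for GoogLeNet.
compression = w.size / max(int(mask.sum()), 1)
```

Raising the threshold prunes more connections and increases the compression ratio, at the cost of a larger initial accuracy drop that retraining must recover.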