%0 Journal Article
%T 基于GoogLeNet模型的剪枝算法
%T Pruning algorithm based on GoogLeNet model
%A 彭冬亮
%A 王天兴
%A PENG Dongliang
%A WANG Tianxing
%J 控制与决策
%J Control and Decision
%@ 1001-0920
%V 34
%N 6
%D 2019
%P 1259-1264
%K 剪枝算法;GoogLeNet;Inception模块;权重阈值;参数冗余;过拟合
%K pruning algorithm;GoogLeNet;Inception module;weight threshold;parameter redundancy;overfitting
%X GoogLeNet包含多个并行的卷积层和池化层,极具表现力,但也导致其参数数量冗余和计算量大,解决该问题的根本途径是将网络稀疏化.剪枝算法通过训练网络、修剪低权重连接和再训练网络三步操作,只保留卷积层和完全连接层中的强相关连接,实现简化网络结构和参数数量的效果,获得近似的网络模型,不影响网络后验概率估计的准确性,达到压缩效果.传统计算方式不适合非均匀稀疏数据结构,所提出的阈值剪枝算法设定合适的阈值,将原始GoogLeNet模型中将近1040万参数减少到65万,大约压缩了16倍.原始网络在进行剪枝处理后,准确率会有所降低,但经过少数次迭代,网络的准确率与原始模型不相上下,达到了压缩模型的效果,验证了阈值剪枝算法对改进GoogLeNet模型训练过程的有效性.
%X GoogLeNet contains multiple parallel convolutional and pooling layers, which makes the network highly expressive but also leads to parameter redundancy and a large computational cost. The fundamental way to solve this problem is to sparsify the network. The pruning algorithm proceeds in three steps, training the network, pruning low-weight connections, and retraining the network, so that only the strongly correlated connections in the convolutional and fully connected layers are retained. This simplifies the network structure, reduces the number of parameters, and yields an approximate network model that achieves compression without degrading the accuracy of the network's posterior probability estimation. Because traditional computation schemes are unsuited to non-uniform sparse data structures, a threshold pruning algorithm is proposed: by setting a suitable threshold, it reduces the nearly 10.4 million parameters of the original GoogLeNet model to about 650,000, a compression of roughly 16 times. Pruning initially lowers the network's accuracy, but after a few retraining iterations the accuracy becomes comparable to that of the original model, so the model-compression goal is met, which verifies the effectiveness of the threshold pruning algorithm in improving the training process of the GoogLeNet model.
%R 10.13195/j.kzyjc.2017.1556
%U http://kzyjc.alljournals.cn/kzyjc/home
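
The abstract describes a three-step magnitude pruning procedure (train, prune low-weight connections, retrain). As a companion to the record, the following is a minimal sketch of that procedure in PyTorch; it is not the paper's released code, and the threshold value, the restriction to nn.Conv2d/nn.Linear layers, and the mask-based retraining helper are illustrative assumptions.

import torch
import torch.nn as nn

def threshold_prune(model: nn.Module, threshold: float) -> dict:
    """Step 2: zero every convolutional/fully-connected weight whose
    magnitude falls below `threshold`; return binary masks marking the
    surviving (strongly correlated) connections."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            mask = (module.weight.detach().abs() >= threshold).float()
            module.weight.data.mul_(mask)   # cut the low-weight connections
            masks[name] = mask
    return masks

def reapply_masks(model: nn.Module, masks: dict) -> None:
    """Step 3 helper: call after each optimizer step while retraining so
    pruned connections stay at zero instead of growing back."""
    with torch.no_grad():
        for name, module in model.named_modules():
            if name in masks:
                module.weight.mul_(masks[name])

# Usage sketch (the threshold 0.02 is an assumed value, not from the paper):
#   model = ...  # a trained GoogLeNet, e.g. torchvision.models.googlenet
#   masks = threshold_prune(model, threshold=0.02)
#   for x, y in retrain_loader:          # a few retraining iterations
#       loss = criterion(model(x), y); loss.backward()
#       optimizer.step(); optimizer.zero_grad()
#       reapply_masks(model, masks)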