Deep Convolutional Neural Network Learning Based on Deconvolution Feature Extraction

Authors:

Lyu Enhui, Wang Xuesong, Cheng Yuhu

Affiliation:

(School of Information and Control Engineering, China University of Mining and Technology, Xuzhou 221116, China)

About the authors:

Lyu Enhui (1989-), male, Ph.D. candidate, engaged in research on deep learning; Cheng Yuhu (1973-), male, professor and doctoral supervisor, engaged in research on machine learning, pattern recognition, and intelligent systems.

Corresponding author:

E-mail: chengyuhu@163.com

CLC number:

TP18

Fund program:

National Natural Science Foundation of China (61472424, 61772532).



    Abstract:

    During the learning process of a deep convolutional neural network (DCNN), the initial values of the convolution kernels are usually assigned at random. In addition, learning the network parameters by gradient descent often suffers from the vanishing-gradient phenomenon. To address these problems, a learning method for DCNNs based on deconvolution feature extraction is proposed. First, an unsupervised two-layer stacked deconvolutional neural network is used to learn feature-mapping matrices from the original images. Then, the learned feature-mapping matrices serve as the convolution kernels of the DCNN, which convolves and pools the original images layer by layer. Finally, the DCNN is fine-tuned with mini-batch stochastic gradient descent augmented by a momentum coefficient, which mitigates the vanishing-gradient problem. Experimental results on the MNIST, CIFAR-10 and CIFAR-100 datasets show that the proposed method effectively improves image classification accuracy.
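    To make the three-step pipeline above concrete, the sketch below restates it in PyTorch. Everything framework-specific is an assumption, since the paper gives no code: a plain transposed-convolution autoencoder stands in for its stacked deconvolutional network, and the layer sizes, kernel size, data, and hyperparameters are illustrative placeholders.

import torch
import torch.nn as nn

# Step 1: unsupervised feature learning. A transposed-convolution
# ("deconvolution") decoder trained to reconstruct its input stands in for the
# paper's stacked deconvolutional network; the encoder kernels play the role
# of the learned feature-mapping matrices.
class DeconvFeatureLearner(nn.Module):
    def __init__(self, in_ch, out_ch, k=5):
        super().__init__()
        self.encode = nn.Conv2d(in_ch, out_ch, k, padding=k // 2)
        self.decode = nn.ConvTranspose2d(out_ch, in_ch, k, padding=k // 2)

    def forward(self, x):
        return self.decode(torch.relu(self.encode(x)))

def learn_kernels(batches, in_ch, out_ch, epochs=5):
    model = DeconvFeatureLearner(in_ch, out_ch)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    mse = nn.MSELoss()
    for _ in range(epochs):
        for x in batches:                      # mini-batches of inputs
            opt.zero_grad()
            mse(model(x), x).backward()        # reconstruction objective
            opt.step()
    return model.encode.weight.detach()        # learned feature-mapping matrices

# Step 2: a small two-convolution CNN whose kernels are initialized from the
# learned features instead of random values (sizes chosen for 28x28 inputs).
cnn = nn.Sequential(
    nn.Conv2d(1, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(32 * 7 * 7, 10),
)

batches = [torch.randn(8, 1, 28, 28) for _ in range(4)]   # stand-in image data
cnn[0].weight.data.copy_(learn_kernels(batches, 1, 16))
with torch.no_grad():                          # the second layer learns from
    feats = [cnn[:3](x) for x in batches]      # pooled first-layer feature maps
cnn[3].weight.data.copy_(learn_kernels(feats, 16, 32))

# Step 3: supervised fine-tuning with mini-batch SGD plus a momentum coefficient.
opt = torch.optim.SGD(cnn.parameters(), lr=0.01, momentum=0.9)
ce = nn.CrossEntropyLoss()
for x in batches:
    y = torch.randint(0, 10, (x.size(0),))     # stand-in class labels
    opt.zero_grad()
    ce(cnn(x), y).backward()
    opt.step()

    The only departure from standard CNN training here is the initialization: the convolution kernels start from reconstruction-driven features rather than random values, and the supervised fine-tuning uses mini-batch SGD with a momentum term, as the abstract describes.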

Cite this article:

Lyu Enhui, Wang Xuesong, Cheng Yuhu. Deep convolutional neural network learning based on deconvolution feature extraction[J]. Control and Decision, 2018, 33(3): 447-454.

History:
  • Online publication date: 2018-03-06