A dynamic multi-layer semantics perceptron without attention mechanism
CSTR:
Author:
Affiliation:

1. School of Computer Science and Technology, Shandong Technology and Business University, Yantai 264005, China; 2. Co-Innovation Center of Shandong Colleges and Universities: Future Intelligent Computing, Yantai 264005, China; 3. Key Laboratory of Intelligent Information Processing in Universities of Shandong, Shandong Technology and Business University, Yantai 264005, China; 4. Information Science and Technology College, Dalian Maritime University, Dalian 116026, China

Author biography:

Corresponding author:

E-mail: thL01@163.com.

CLC number:

TP181

Fund project:

National Natural Science Foundation of China (61976124, 61976125, 62176140).

    Abstract:

    Transformer has achieved excellent results on large-scale datasets, but its use of multi-head attention (MHA) makes the model overly complex, and its performance on small-scale datasets is unsatisfactory. Research on replacing MHA has produced notable results in image processing but remains scarce in natural language processing. To address this, a multi-layer semantics perceptron (MSP) method without attention is first proposed; its core innovation is replacing the MHA in the encoder with a token sequence transformation function, which lowers model complexity and yields better semantic representations. Second, a dynamic depth control framework (DDCF) is proposed, which optimizes model depth automatically and further reduces complexity. Finally, building on MSP and DDCF, the dynamic multi-layer semantics perceptron (DMSP) model is proposed. Comparative experiments on multiple text datasets show that DMSP both improves classification accuracy and effectively reduces model complexity; compared with a Transformer of the same depth, DMSP achieves markedly higher classification accuracy with far fewer parameters.
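The abstract describes two ideas: replacing multi-head attention with a token sequence transformation, and choosing the network depth dynamically. Since the abstract gives no implementation details, the following NumPy sketch is purely illustrative: `TokenMixingBlock`, `dynamic_depth_forward`, and the convergence-based stopping rule are hypothetical stand-ins in the spirit of MSP and DDCF, not the paper's actual method.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Normalize each token vector over the feature dimension."""
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

class TokenMixingBlock:
    """Encoder block with multi-head attention replaced by a token
    sequence transformation: a learned linear map applied across the
    token axis (a hypothetical stand-in for the paper's function)."""

    def __init__(self, seq_len, d_model, rng):
        # Token-mixing weights act on the sequence axis, so this layer
        # costs seq_len^2 parameters instead of MHA's 4 * d_model^2
        # query/key/value/output projections.
        self.w_tok = rng.standard_normal((seq_len, seq_len)) / np.sqrt(seq_len)
        self.w_ff = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)

    def __call__(self, x):
        # x: (seq_len, d_model); mix information across tokens.
        y = x + self.w_tok @ layer_norm(x)
        # Position-wise feed-forward over features, with residual.
        return y + layer_norm(y) @ self.w_ff

def dynamic_depth_forward(blocks, x, tol=1e-2):
    """Illustrative depth control: stop stacking blocks once the output
    changes by less than `tol` relative norm (the abstract does not
    specify DDCF's actual stopping criterion)."""
    depth = 0
    for block in blocks:
        y = block(x)
        depth += 1
        if np.linalg.norm(y - x) / np.linalg.norm(x) < tol:
            break
        x = y
    return x, depth

rng = np.random.default_rng(0)
blocks = [TokenMixingBlock(seq_len=8, d_model=16, rng=rng) for _ in range(6)]
out, depth_used = dynamic_depth_forward(blocks, rng.standard_normal((8, 16)))
print(out.shape, depth_used)  # shape stays (8, 16); at most 6 layers are used
```

The parameter comparison in the comment is the intuition behind the abstract's claim of reduced complexity: when the sequence length is smaller than twice the model width, a token-mixing layer carries fewer weights than the four attention projections it replaces.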

Cite this article:

Liu X Y, Tang H L, Wang Y L, et al. A dynamic multi-layer semantics perceptron without attention mechanism[J]. Control and Decision, 2024, 39(2): 588-594.

History
  • Online publication date: 2024-01-18
  • Issue publication date: 2024-02-20