A bias-based graph attention neural network recommender algorithm

Author:
Affiliation: Henan Polytechnic University
Author biography:
Corresponding author:
CLC number: TP273
Fund project:



    Abstract:

    In recommender systems, neural networks based on knowledge graphs take graphs as input and, unlike traditional neural networks, can combine node information with topological structure for inference and recommendation. However, existing recommender algorithms based on graph neural networks face the problems of inaccurate knowledge representation and single-channel information fusion. Combining graph neural networks with an attention mechanism, this paper proposes a bias-based graph attention neural network recommender algorithm (BGANR). First, a translation model is used to embed the knowledge-graph information into a feature representation, obtaining the triple information of each node in the same projection space. Considering the error between the predicted and true values in a triple, as well as the differing weights of neighboring nodes during information propagation, a bias-based attention mechanism is used to better capture the high-order connectivity between nodes. Then, during the propagation training of the neural network, node and neighbor information is aggregated through a multi-channel fusion mechanism to improve the robustness of the model. Finally, comparison with state-of-the-art algorithms on two real datasets verifies the effectiveness of BGANR.
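    The pipeline summarized above can be sketched roughly as follows. The abstract does not give the paper's equations, so the specific choices here are assumptions: a TransR-style translation score (project entities into the relation space and measure how far the projected head plus the relation lands from the projected tail), a softmax over negative translation error as the "bias-based" attention weight, and sum plus element-wise-product fusion channels for aggregation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes (not from the paper).
    d_e, d_r = 8, 6              # entity / relation embedding dimensions
    n_neighbors = 4

    # TransR-style embeddings: entities live in a d_e-dim space and are
    # projected into a d_r-dim relation space by a relation-specific matrix.
    h = rng.normal(size=d_e)                     # target node (head entity)
    tails = rng.normal(size=(n_neighbors, d_e))  # neighboring tail entities
    r = rng.normal(size=d_r)                     # relation embedding
    W_r = rng.normal(size=(d_e, d_r))            # relation projection matrix

    def translation_bias(h, t, r, W_r):
        """Error between the predicted tail (h W_r + r) and the actual tail."""
        return np.linalg.norm(h @ W_r + r - t @ W_r)

    # Bias-based attention (assumed form): neighbors whose triples have a
    # smaller translation error receive a larger propagation weight.
    biases = np.array([translation_bias(h, t, r, W_r) for t in tails])
    weights = np.exp(-biases)
    weights /= weights.sum()                     # softmax over negative bias

    # Multi-channel fusion (assumed: additive and element-wise-product
    # channels) of the node and its attention-weighted neighborhood.
    neigh = weights @ tails                      # aggregated neighbor message
    fused = np.tanh(h + neigh) + np.tanh(h * neigh)

    print(weights.round(3))      # attention distribution over the 4 neighbors
    print(fused.shape)           # (8,)
    ```

    In a full model, `W_r`, `r`, and the embeddings would be learned jointly, and the fused representation would feed the next propagation layer; this sketch only illustrates how a triple's translation error can drive the attention weights.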

History
  • Received: 2020-11-24
  • Revised: 2021-12-26
  • Accepted: 2021-04-21
  • Online: 2021-05-15
  • Published: