DLSR-Based Inductive Transfer Learning Method
Author:
Affiliation:

Jiangnan University

Author biography:

Corresponding author:

CLC number:

TP181

Fund project:

National Natural Science Foundation of China (General Program, Key Program, Major Program)
Abstract:

The effectiveness of traditional machine learning methods depends on a large amount of valid training data, a requirement that is often hard to satisfy; transfer learning has therefore been widely studied and has become a hot research topic in recent years. To address the degradation of classification performance caused by a severe shortage of training data in multiclass classification scenarios, a DLSR-based inductive transfer learning method (TDLSR) is proposed. Built on the inductive transfer learning framework, the method uses a knowledge-leverage mechanism to transfer knowledge from the source domain and combine it with target-domain data during model learning, protecting the security of the source-domain data while maintaining performance. TDLSR inherits the property of DLSR of enlarging the margins between different classes, which makes it well suited to multiclass classification, and it additionally possesses the transfer ability that DLSR lacks. The learned model is therefore more reasonable and better suited to complex multiclass classification tasks. Experiments on 12 real UCI datasets verify that the proposed method performs well in the face of the above challenges.
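The abstract does not give the TDLSR objective, but the two ingredients it names are standard: discriminative least squares regression (DLSR) enlarges between-class margins via ε-dragging of the regression targets, and a knowledge-leverage term pulls the target-domain model toward one transferred from the source domain. The sketch below illustrates one plausible formulation under assumptions not stated in the abstract: the leverage term is taken to be a penalty μ‖W − W_src‖², where `W_src` is a hypothetical regression matrix learned on the source domain, and the solver alternates a closed-form update of W with a nonnegative update of the dragging matrix M. This is an illustration of the general technique, not the authors' exact algorithm.

```python
import numpy as np

def tdlsr_fit(X, y, W_src, lam=0.1, mu=0.1, n_iter=20):
    """Sketch: DLSR-style regression with epsilon-dragging plus a
    knowledge-leverage penalty mu * ||W - W_src||^2 (assumed form).

    X: (n, d) target-domain features; y: (n,) integer class labels.
    W_src: (d, c) regression matrix transferred from the source domain.
    Objective: ||X W - (Y + B*M)||^2 + lam*||W||^2 + mu*||W - W_src||^2,
    minimized over W and elementwise-nonnegative M.
    """
    n, d = X.shape
    c = W_src.shape[1]
    Y = np.zeros((n, c))
    Y[np.arange(n), y] = 1.0           # one-hot regression targets
    B = np.where(Y > 0, 1.0, -1.0)     # dragging directions: +1 true class, -1 others
    M = np.zeros((n, c))               # nonnegative dragging amounts
    # Precompute the inverse used by the closed-form W update.
    A = np.linalg.inv(X.T @ X + (lam + mu) * np.eye(d))
    for _ in range(n_iter):
        T = Y + B * M                          # relaxed (margin-enlarged) targets
        W = A @ (X.T @ T + mu * W_src)         # closed-form solve for W
        M = np.maximum(0.0, B * (X @ W - Y))   # best nonnegative dragging given W
    return W

def tdlsr_predict(X, W):
    """Assign each sample to the class with the largest regression score."""
    return np.argmax(X @ W, axis=1)
```

Setting `mu = 0` recovers plain DLSR on the target data alone; a larger `mu` leans more heavily on the source-domain model, which is the trade-off the knowledge-leverage mechanism is meant to control when target data are scarce.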

History
  • Received: 2020-06-04
  • Revised: 2021-07-20
  • Accepted: 2020-11-03
  • Published online: 2020-12-01
  • Publication date: