Citation: ZHAO Xiao-long, YANG Yan. Incremental attribute reduction algorithm based on neighborhood granulation conditional entropy[J]. Control and Decision, 2019, 34(10): 2061-2072.
Incremental attribute reduction algorithm based on neighborhood granulation conditional entropy
ZHAO Xiao-long1, YANG Yan2
(1. College of Computer and Art, Anhui Technical College of Industry and Economy, Hefei 230051, China; 2. School of Information Science & Technology, Southwest Jiaotong University, Chengdu 610031, China)
Abstract:
Incremental attribute reduction is an important data mining method for dynamic data. Most incremental attribute reduction algorithms proposed so far are built for discrete data, and little related work has addressed numerical data. To this end, an incremental attribute reduction algorithm is proposed for numerical information systems in which objects are added continuously. First, a hierarchical method for computing neighborhood granulation is established in numerical information systems, and incremental computation of neighborhood granulation is proposed on this basis. Then, building on this incremental computation, an incremental updating method for the neighborhood granulation conditional entropy is given, and a corresponding incremental attribute reduction algorithm is designed around this updating mechanism. Finally, experimental analysis shows that the proposed algorithm is more effective and advantageous for incremental attribute reduction on numerical data.
Keywords: incremental learning; granular computing; attribute reduction; numerical data; neighborhood granulation; conditional entropy
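The abstract names the main ingredients of the method without giving their definitions. The Python sketch below is a minimal, illustrative reading of them, assuming a Euclidean δ-neighborhood and one common form of neighborhood conditional entropy. The function names (neighborhood_granules, granulation_conditional_entropy, greedy_reduct, add_object), the radius delta and the stopping tolerance eps are assumptions made here for illustration only; the paper's hierarchical granulation and exact update formulas may differ.

import numpy as np

def neighborhood_granules(X, B, delta):
    """delta-neighborhood granule of every object under attribute subset B
    (Euclidean distance on the selected numerical attributes)."""
    XB = X[:, B]
    # Pairwise distances; quadratic in the number of objects, acceptable for a sketch.
    dist = np.linalg.norm(XB[:, None, :] - XB[None, :, :], axis=2)
    return [np.flatnonzero(dist[i] <= delta) for i in range(len(X))]

def granulation_conditional_entropy(X, y, B, delta):
    """Neighborhood granulation conditional entropy H(D | B).
    One common form is used here: the average of -log of the fraction of each
    object's neighborhood sharing its decision label; the paper's exact
    definition may differ."""
    granules = neighborhood_granules(X, B, delta)
    n = len(X)
    h = 0.0
    for i, g in enumerate(granules):
        same = np.count_nonzero(y[g] == y[i])   # |n_B(x_i) ∩ [x_i]_D|, always >= 1
        h -= np.log(same / len(g)) / n          # len(g) >= 1: the granule contains x_i
    return h

def greedy_reduct(X, y, delta, eps=1e-6):
    """Batch (non-incremental) forward greedy reduction: repeatedly add the
    attribute that lowers H(D | B) the most; stop when the decrease is negligible."""
    remaining = list(range(X.shape[1]))
    B = []
    best = granulation_conditional_entropy(X, y, B, delta)
    while remaining:
        scores = {a: granulation_conditional_entropy(X, y, B + [a], delta)
                  for a in remaining}
        a_star = min(scores, key=scores.get)
        if best - scores[a_star] < eps:
            break
        B.append(a_star)
        remaining.remove(a_star)
        best = scores[a_star]
    return B

def add_object(X, y, granules, x_new, y_new, B, delta):
    """Incremental step in the spirit of the abstract: when one object arrives,
    only its distances to the existing objects are computed; the affected
    granules are enlarged instead of rebuilding everything, and H(D | B) can
    then be updated from the changed granules only."""
    d = np.linalg.norm(X[:, B] - np.asarray(x_new)[B], axis=1)
    hit = np.flatnonzero(d <= delta)            # granules that gain the new object
    n = len(X)
    for i in hit:
        granules[i] = np.append(granules[i], n)
    granules.append(np.append(hit, n))          # granule of the new object itself
    X = np.vstack([X, x_new])
    y = np.append(y, y_new)
    return X, y, granules

A typical use under these assumptions would be to run greedy_reduct once on the initial data, then call add_object for each arriving sample and update H(D | B) from the changed granules only, re-checking whether the current reduct still suffices instead of re-running the batch algorithm from scratch.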
DOI:10.13195/j.kzyjc.2018.0138
CLC number: TP18
Foundation items: Key Project of Natural Science Research in Universities of Anhui Province (KJ2016A107, KJ2017A645); Quality Engineering Project of Universities of Anhui Province (2016JXTD019, 2015GXK123)