Rough equivalence class based attribute reduction algorithm with bilateral-pruning strategies and multiple Hashing
Author:

Zhao Jie, Zhang Kaihang, Dong Zhenning

Affiliation:

School of Management, Guangdong University of Technology, Guangzhou 510520, China.

CLC number:

TP311

Fund projects:

National Natural Science Foundation of China (71401045); Humanities and Social Sciences Foundation of the Ministry of Education of China (12YJCZH129).



    Abstract:

    A new attribute reduction algorithm is proposed. First, taking global equivalence classes as the smallest computational granularity, the concept of the rough equivalence class (REC) is introduced; its properties are studied in depth, and it is proved that computing the core and the reduct on RECs is equivalent to computing them on the original decision system. The intrinsic relationship between the three types of RECs and the positive region is then analyzed, and an incremental, equivalence-preserving method for computing the positive region under bilateral deletion of 1-RECs and -1-RECs is designed. On this basis, bidirectional pruning strategies and an attribute-incremental partitioning algorithm with multiple hashing are devised, yielding an efficient and complete reduction algorithm. Finally, the algorithm is validated from multiple angles on three kinds of data sets: 20 UCI decision sets, massive data sets, and ultra-high-dimensional data sets. The results show that the completeness and efficiency of the proposed algorithm are superior to those of existing algorithms in the vast majority of cases, and that it is especially suitable for massive and ultra-high-dimensional decision tables.
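    To make the hash-based, attribute-incremental partitioning idea concrete, the sketch below gives one plausible reading of it in Python; it is not the paper's exact procedure. Equivalence classes are refined one condition attribute at a time via a hash map, and classes that are already decision-consistent are counted into the positive region and pruned from further refinement (the analogue of removing 1-RECs); the handling of -1-RECs and the reduct search itself are omitted, and all names such as positive_region_size are illustrative.

```python
from collections import defaultdict


def partition(objects, rows, attrs):
    """Split `objects` into equivalence classes by hashing their value
    tuple on `attrs` (one dict-based hashing pass)."""
    classes = defaultdict(list)
    for i in objects:
        classes[tuple(rows[i][a] for a in attrs)].append(i)
    return list(classes.values())


def positive_region_size(rows, labels, cond_attrs):
    """Size of POS_C(D), computed by attribute-incremental refinement.

    Classes that become decision-consistent are added to the positive
    region immediately and pruned; only the still undecided classes are
    re-hashed with the next attribute.
    """
    pos = 0
    pending = [list(range(len(rows)))]          # start from the whole universe
    for a in cond_attrs:                        # add one attribute per round
        refined = []
        for cls in pending:
            for sub in partition(cls, rows, [a]):
                if len({labels[i] for i in sub}) == 1:
                    pos += len(sub)             # consistent: prune into POS
                else:
                    refined.append(sub)         # still mixed: keep refining
        pending = refined
        if not pending:                         # nothing left to decide
            break
    return pos


if __name__ == "__main__":
    # toy decision table: 3 condition attributes, one binary decision
    rows = [[1, 0, 1], [1, 0, 0], [0, 1, 1], [0, 1, 1]]
    labels = [1, 1, 0, 1]
    print(positive_region_size(rows, labels, cond_attrs=[0, 1, 2]))  # prints 2
```

    Because each refinement round is a single hashed pass over only the objects that are still undecided, the cost per added attribute stays close to linear in the number of remaining objects, which is in the spirit of the efficiency claim made in the abstract.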

Cite this article

Zhao Jie, Zhang Kaihang, Dong Zhenning. Rough equivalence class based attribute reduction algorithm with bilateral-pruning strategies and multiple Hashing[J]. Control and Decision, 2016, 31(11): 1921-1935.

History
  • Received: 2015-09-18
  • Revised: 2015-12-17
  • Published online: 2016-11-20