Adaptive greedy Gaussian segmentation algorithm based on multivariate time series

Author:

Affiliation:

School of Automation, University of Science and Technology Beijing

CLC Number:

TP273

Fund Project:

    Abstract:

    In most existing multivariate time series segmentation algorithms, the selection of breakpoints and the determination of the number of segments must be carried out as separate, independent steps, which greatly increases the computational complexity. To address this problem, this paper proposes an adaptive greedy Gaussian segmentation algorithm for multivariate time series. The algorithm interprets the data within each segment of the multivariate time series as independent samples drawn from a Gaussian distribution, so that the segmentation problem is transformed into a parameter estimation problem for a Gaussian segmentation model. To improve learning efficiency, a greedy search is used to maximize the likelihood of each segment and thereby locate near-optimal breakpoints; during the search, an information gain criterion adaptively determines the optimal number of segments. This avoids treating the determination of the number of segments and the selection of breakpoints as two independent procedures and thus reduces the computational complexity. Experimental results show that the proposed method is effective on several different data sets.
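    The algorithm itself is not spelled out on this abstract page, so the following is only a minimal Python sketch of the general greedy Gaussian segmentation idea the abstract describes: each segment is modeled as i.i.d. samples from one multivariate Gaussian, breakpoints are added greedily where they most increase the total log-likelihood, and the search stops adaptively when no split improves the fit enough. The function names (segment_loglik, best_split, greedy_gaussian_segmentation) and the fixed min_gain threshold are illustrative assumptions; the threshold merely stands in for the paper's information-gain stopping criterion, which is not detailed here.

```python
import numpy as np

def segment_loglik(X):
    """Maximized Gaussian log-likelihood of one segment (up to a constant),
    using the ML mean and (lightly regularized) covariance of its samples."""
    n, d = X.shape
    cov = np.cov(X, rowvar=False, bias=True) + 1e-6 * np.eye(d)  # ridge keeps it invertible
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * n * (logdet + d)

def best_split(X):
    """Split index that maximizes the summed log-likelihood of the two
    sub-segments, together with the resulting gain over no split."""
    n, d = X.shape
    base = segment_loglik(X)
    best_t, best_gain = None, -np.inf
    for t in range(d + 1, n - d):            # keep both halves long enough to fit a Gaussian
        gain = segment_loglik(X[:t]) + segment_loglik(X[t:]) - base
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain

def greedy_gaussian_segmentation(X, min_gain=30.0, max_segments=20):
    """Greedily insert breakpoints while the best available split still
    raises the total log-likelihood by more than `min_gain` (a hand-tuned
    stand-in for an adaptive information-gain stopping rule)."""
    breakpoints = [0, len(X)]
    while len(breakpoints) - 1 < max_segments:
        candidates = []
        for a, b in zip(breakpoints[:-1], breakpoints[1:]):
            seg = X[a:b]
            if len(seg) > 2 * X.shape[1] + 2:             # segment long enough to split
                t, gain = best_split(seg)
                if t is not None:
                    candidates.append((gain, a + t))
        if not candidates:
            break
        gain, t = max(candidates)                          # most beneficial split overall
        if gain < min_gain:                                # no split improves enough: stop
            break
        breakpoints = sorted(breakpoints + [t])
    return breakpoints

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 3-dimensional series with two regime changes in the mean.
    X = np.vstack([rng.normal(0, 1, (100, 3)),
                   rng.normal(3, 1, (120, 3)),
                   rng.normal(-2, 1, (80, 3))])
    print(greedy_gaussian_segmentation(X))
```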

History
  • Received: 2022-05-02
  • Revised: 2023-03-27
  • Accepted: 2022-10-10
  • Available online: 2022-10-22
  • Publication date: