Enhanced self-adaptive loss weight YOLOv5 model based on predicted bounding boxes in clusters
CSTR:
Author(s): NIE Peng, XIAO Huan, YU Cong
Affiliation:

School of Software and Internet of Things Engineering, Jiangxi University of Finance and Economics, Nanchang 330013, China

Author biography:

Corresponding author:

E-mail: niepeng@jxufe.edu.cn.

CLC number:

TP301

Fund project:

National Natural Science Foundation of China (61866014).


Abstract:

Detection precision is a key factor in computer vision recognition tasks. To address the detection precision limitations of the one-stage object detector YOLOv5, this paper proposes, from the perspective of multi-task loss optimization, an enhanced model with self-adaptive loss weights based on clusters of predicted bounding boxes that correspond to the same target across feature maps of different resolutions. The enhanced model consists of a GT (ground-truth) target bounding box UID distributor, a GT target bounding box UID matcher, and algorithms for the bounding box position loss weight and the classification loss weight; by improving both the localization precision and the classification precision of YOLOv5, it raises the overall precision of the model. Experimental results show that the enhanced model improves mean average precision (mAP) by a relative 5.23% over the standard YOLOv5.6 model, and achieves a relative improvement of 8.02% over the more complex standard YOLOv5x6 model.
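To make the clustering-and-weighting idea in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it groups predicted boxes by the UID of the GT target they were matched to across the multi-resolution feature maps and derives a per-cluster loss weight. All names (iou, cluster_predictions, adaptive_loss_weights) and the IoU-based weighting rule are illustrative assumptions, not taken from the paper.

```python
# Minimal, hypothetical sketch: group predicted boxes by the UID of the GT target
# they were matched to across the P3/P4/P5 feature-map scales, then derive a
# per-cluster loss weight from how well the cluster localizes its GT box.
from collections import defaultdict

def iou(box_a, box_b):
    """IoU of two boxes in (x1, y1, x2, y2) format."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def cluster_predictions(predictions):
    """UID matching step: bucket predictions that share the same GT target UID.
    Each prediction is a tuple (gt_uid, scale, box)."""
    clusters = defaultdict(list)
    for gt_uid, scale, box in predictions:
        clusters[gt_uid].append((scale, box))
    return clusters

def adaptive_loss_weights(clusters, gt_boxes, alpha=1.0):
    """Assumed weighting rule: clusters that localize their GT target poorly
    (low mean IoU) get a larger position-loss weight, focusing the gradient
    on the targets the model currently predicts worst."""
    weights = {}
    for gt_uid, members in clusters.items():
        mean_iou = sum(iou(box, gt_boxes[gt_uid]) for _, box in members) / len(members)
        weights[gt_uid] = 1.0 + alpha * (1.0 - mean_iou)
    return weights

if __name__ == "__main__":
    gt_boxes = {0: (10.0, 10.0, 50.0, 50.0)}
    predictions = [
        (0, "P3", (12.0, 11.0, 49.0, 52.0)),   # tight prediction
        (0, "P4", (8.0, 9.0, 55.0, 60.0)),     # looser prediction
        (0, "P5", (20.0, 20.0, 70.0, 70.0)),   # poor prediction
    ]
    clusters = cluster_predictions(predictions)
    print(adaptive_loss_weights(clusters, gt_boxes))
```

In a real training loop such per-cluster weights would scale the corresponding terms of the YOLOv5 multi-task loss; the exact weighting rules used in the paper are those of its bounding box position and classification loss weight algorithms.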

Cite this article

NIE Peng, XIAO Huan, YU Cong. Enhanced self-adaptive loss weight YOLOv5 model based on predicted bounding boxes in clusters[J]. Control and Decision, 2023, 38(3): 645-653.

History
  • Received:
  • Revised:
  • Accepted:
  • Online publication date: 2023-02-17
  • Publication date: 2023-03-20