Improved Obstacle Avoidance Target Detection with YOLO v5s and Millimeter Wave Radar Fusion
Funding:

Shandong Province Key Research and Development Program (2022SFGC0202); Guangzhou Basic and Applied Basic Research "Qihang" Program for Young Doctors (SL2023A04J00077); Major Project of the Department of Science and Technology of Hunan Province (2023NK1020)




    Abstract:

To improve the accuracy with which unmanned agricultural machinery perceives obstacles in farm environments, this paper addresses two problems: visual detection is easily degraded by illumination while millimeter-wave radar detection is easily degraded by vehicle bumping, and visual detection algorithms carry large parameter counts, computation loads, and model sizes in complex field conditions. An obstacle-avoidance target detection method for unmanned agricultural machinery is proposed that fuses visual and millimeter-wave radar information. First, part of the millimeter-wave radar target data is filtered out, and a target tracking algorithm based on an adaptive extended Kalman filter is proposed. Next, a farm-environment obstacle dataset is built and a target detection model based on an improved YOLO v5s is constructed. Radar points are then mapped into the image pixel coordinate system through timestamp alignment and a coordinate transformation obtained with the direct linear calibration method. Finally, an obstacle-detection information fusion model for the millimeter-wave radar and the vision sensor is constructed through decision-level fusion and a target matching strategy. Experimental results show that the improved YOLO v5s achieves a mean average precision of 97.0%, close to that of the original model, while its parameter count, computation load, and model memory footprint are only 40.2%, 39.2%, and 38.2% of the original YOLO v5s, respectively; compared with the YOLO v4-Tiny, YOLO v7-Tiny, YOLO v4, and YOLO v7 models, it strikes a better balance between precision and detection speed.
Multi-scene tests show that, in daytime trials, the proposed fusion method improves recognition accuracy by 2.67 and 15.07 percentage points over radar-only and vision-only detection, respectively; in nighttime trials, the fusion method effectively compensates for camera failure and is more robust and accurate than either single-sensor algorithm. Based on the fusion detection method, parking-based obstacle avoidance of the unmanned agricultural machine is also effectively realized.
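The tracking step described in the abstract uses an adaptive extended Kalman filter on millimeter-wave radar targets. The sketch below shows only a generic (non-adaptive) EKF predict/update cycle under a constant-velocity motion model with a polar range/azimuth measurement; the motion model, noise values, and function name are illustrative assumptions, not the authors' implementation, and the paper's adaptive noise tuning is omitted:

```python
import numpy as np

def ekf_radar_step(x, P, z, dt, Q, R):
    """One EKF predict/update cycle for a radar target.

    x : state [px, py, vx, vy] in the radar's Cartesian frame.
    z : measurement [range, azimuth] (polar, hence the nonlinearity).
    """
    # --- Predict (linear constant-velocity motion model) ---
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q

    # --- Update (linearize the polar measurement via its Jacobian H) ---
    px, py = x_pred[0], x_pred[1]
    r = np.hypot(px, py)
    h = np.array([r, np.arctan2(py, px)])          # predicted measurement
    H = np.array([[px / r,      py / r,     0, 0],
                  [-py / r**2,  px / r**2,  0, 0]])
    y = z - h                                      # innovation
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi    # wrap azimuth residual
    S = H @ P_pred @ H.T + R                       # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new
```

An adaptive variant would additionally rescale Q or R online from the innovation sequence; the scheme actually used in the paper is not reproduced here.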
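The radar-to-pixel mapping rests on a coordinate transformation calibrated with the direct linear method. A minimal sketch of the standard direct linear transformation (DLT) follows, assuming at least six non-coplanar world/pixel correspondences in homogeneous coordinates; the function names are hypothetical and the paper's actual calibration procedure is not reproduced:

```python
import numpy as np

def estimate_projection_dlt(world_pts, pixel_pts):
    """Estimate a 3x4 projection matrix with the direct linear transform,
    given >= 6 non-coplanar (X, Y, Z) <-> (u, v) correspondences."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # The smallest right singular vector of A is P, up to scale.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def radar_to_pixel(P, radar_xyz):
    """Project a radar point (in the calibrated world frame) to pixels."""
    u, v, w = P @ np.append(np.asarray(radar_xyz, dtype=float), 1.0)
    return u / w, v / w
```

In practice the correspondences would come from a calibration target visible to both sensors, and timestamp alignment would be applied before each projection.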
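Decision-level fusion hinges on a target matching strategy between radar detections projected into the image and the YOLO bounding boxes. One simple greedy scheme is sketched below purely for illustration (tolerance value, names, and the greedy policy are assumptions; the paper's actual matching strategy is not reproduced):

```python
def match_radar_to_boxes(radar_pixels, boxes, tol=20):
    """Greedily associate each projected radar point (u, v) with the
    first unused detection box (x1, y1, x2, y2) it falls inside of,
    after expanding the box by `tol` pixels on every side."""
    matches, unmatched_radar, used = [], [], set()
    for i, (u, v) in enumerate(radar_pixels):
        hit = None
        for j, (x1, y1, x2, y2) in enumerate(boxes):
            if j in used:
                continue
            if x1 - tol <= u <= x2 + tol and y1 - tol <= v <= y2 + tol:
                hit = j
                break
        if hit is None:
            unmatched_radar.append(i)     # radar-only obstacle
        else:
            used.add(hit)
            matches.append((i, hit))      # fused: radar range + vision class
    unmatched_boxes = [j for j in range(len(boxes)) if j not in used]
    return matches, unmatched_radar, unmatched_boxes
```

Matched pairs can combine the radar's range/velocity with the camera's class label, while unmatched detections from either sensor are retained so that one failing modality (e.g. the camera at night) does not suppress obstacles the other still sees.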

Cite this article

HU Lian, LIANG Chuqi, LUO Yaling, LIU Zhaodi, RUAN Qingqiang, WANG Pei, HUANG Peikui, WANG Pei, SUN Yitian. Improved Obstacle Avoidance Target Detection with YOLO v5s and Millimeter Wave Radar Fusion[J]. Transactions of the Chinese Society for Agricultural Machinery, 2025, 56(12): 634-644.

History
  • Received: 2025-03-12
  • Published online: 2025-12-10