Rice Panicle Recognition and Yield Estimation Based on UAV Remote Sensing Images

Fund Project: Science and Technology Program of the Ministry of Agriculture and Rural Affairs; Higher Education Discipline Innovation and Talent Introduction Program (111 Project) (D18019)

    Abstract:

    To address low detection accuracy across rice growth stages and missed detections of rice panicles in complex field environments, an improved rotated object detection model, AHF-YOLO 11, was proposed based on YOLO 11. The convolutional vision module AssemFormer was introduced into the YOLO 11 backbone to construct a C3k2_AssemFormer feature extraction module, significantly enhancing local feature expression in densely occluded scenes and global morphology learning across growth stages. Furthermore, the high-level screening feature pyramid network (HSFPN) replaced the original neck structure, effectively improving recognition of multi-scale rice panicles against varying backgrounds. Experiments showed that the accuracy of the improved AHF-YOLO 11 model reached 90.7%, which was 4.3 percentage points higher than that of the original YOLO 11 model and 4.4 to 39 percentage points higher than mainstream models. Cross-growth-stage tests further showed that the recognition accuracy of AHF-YOLO 11 at the booting, heading, filling, and maturity stages improved over the original model by 11.88, 7.3, 4.78, and 4.3 percentage points, respectively, with the highest accuracy of 94.41% achieved at the heading stage. Tracking and yield estimation experiments indicated that the period from late heading to early filling was optimal for yield estimation, with a minimum estimation error of only 4.66%. This study provides important technical support for rice panicle counting and yield estimation.
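The yield-estimation step described in the abstract can be illustrated with a minimal sketch: a panicle density counted from UAV images is converted to a per-hectare yield via the classic yield-component formula. The formula, the function names, and all parameter values below are illustrative assumptions, not the paper's actual method or data.

```python
# Hypothetical sketch: turning a UAV-derived panicle count into a yield
# estimate. Parameter values are illustrative assumptions only.

def estimate_yield_kg_per_ha(panicles_per_m2, grains_per_panicle,
                             thousand_grain_weight_g, filled_grain_ratio=0.9):
    """Yield-component estimate (assumed formula):
    g/m2 = panicles/m2 * grains/panicle * filled ratio * (1000-grain wt / 1000)
    then converted from g/m2 to kg/ha (x 10000 m2/ha, / 1000 g/kg)."""
    grams_per_m2 = (panicles_per_m2 * grains_per_panicle * filled_grain_ratio
                    * thousand_grain_weight_g / 1000.0)
    return grams_per_m2 * 10000.0 / 1000.0

def estimation_error_percent(estimated, measured):
    """Relative estimation error, as reported in percent."""
    return abs(estimated - measured) / measured * 100.0

# Example: 320 panicles/m2 counted at early filling, with assumed
# grains per panicle (110) and 1000-grain weight (26 g).
est = estimate_yield_kg_per_ha(320, 110, 26.0)   # ~8236.8 kg/ha
err = estimation_error_percent(est, 8000.0)      # vs. an assumed ground truth
```

In practice the per-image panicle count from the detector would be divided by the ground area covered by each UAV image to obtain `panicles_per_m2`; the remaining yield components would come from field sampling.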

Cite this article:

LONG Yongbing, LI Kun, LIU Dandan, LONG Teng, PAN Chaoyang, CHEN Wenhan, HE Min, GUO Weilun, LI Junjie, CHEN Rongkun, KANG Hui, LAN Yubin. Rice Panicle Recognition and Yield Estimation Based on UAV Remote Sensing Images[J]. Transactions of the Chinese Society for Agricultural Machinery, 2026, 57(3): 109-118.

History:
  • Received: 2025-09-28
  • Published online: 2026-02-01