Autonomous Navigation Method for Mobile Robots in Commercial Chicken Farming Houses Based on Near-infrared Camera and LiDAR

Fund Project: Shandong Provincial Key Research and Development Program (2022TZXD0015)


    Abstract:

    The increasing demand for inspection and precise operations on animals in livestock houses has placed higher requirements on navigation methods for autonomous mobile robots. However, existing navigation methods are often sensitive to lighting conditions, poorly tolerant of environmental disturbances, and expensive, which limits their applicability. To address these problems, a multimodal sensor fusion framework for robotic navigation was proposed that fused a near-infrared camera and a 2D LiDAR. By jointly optimizing the geometric features from the LiDAR and the semantic landmark information from the near-infrared camera, the framework overcame navigation bottlenecks in complex environments. The system leveraged the LiDAR for robust localization over large areas and used the near-infrared camera as a high-precision aid to achieve efficient navigation. Feeder rib plates were identified and extracted from near-infrared images as landmarks, and this landmark information was fused with data from wheel odometry, the LiDAR, and an inertial measurement unit (IMU). The Cartographer algorithm was employed to construct environmental maps, enabling high-precision localization. The pose information of target locations was recorded to support global path planning, while local path planning combined the Trajectory Rollout and Dynamic Window Approach algorithms to further ensure autonomous localization capability and navigation precision. Field tests were conducted in a stacked-cage poultry house. Results showed that the standard deviation of landmark position error was less than 2 cm. In the hall, at speeds below 0.5 m/s, the standard deviation of single-point position error was less than 2 cm and the standard deviation of yaw angle error was less than 1°. In the aisles, at speeds below 0.3 m/s, the standard deviation of target-point position error was no more than 5.04 cm and the standard deviation of yaw angle error was approximately 1°. These results demonstrate that the system achieved centimeter-level positioning accuracy in a complex livestock house environment, validating its applicability for inspection and precision operation tasks.
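The local planner named in the abstract samples candidate velocity commands, forward-simulates each one, and scores the resulting trajectories against the goal and nearby obstacles. The following is a minimal Python sketch of this Dynamic-Window-Approach-style sampling and scoring; all velocity limits, gains, time steps, and the obstacle layout are illustrative assumptions for a differential-drive robot, not parameters from the paper.

```python
import math

def rollout(x, y, yaw, v, w, dt=0.1, steps=10):
    """Forward-simulate a (v, w) command with Euler steps; return the trajectory."""
    traj = []
    for _ in range(steps):
        yaw += w * dt
        x += v * math.cos(yaw) * dt
        y += v * math.sin(yaw) * dt
        traj.append((x, y, yaw))
    return traj

def score(traj, goal, obstacles, w_goal=1.0, w_clear=0.5):
    """Higher is better: progress toward the goal plus obstacle clearance."""
    gx, gy = goal
    x, y, _ = traj[-1]
    goal_term = -math.hypot(gx - x, gy - y)  # closer end point scores higher
    clearance = min(
        (math.hypot(ox - px, oy - py)
         for ox, oy in obstacles for px, py, _ in traj),
        default=10.0,
    )
    return w_goal * goal_term + w_clear * min(clearance, 2.0)

def best_command(pose, goal, obstacles, v_max=0.5, w_max=1.0):
    """Search a coarse window of (v, w) samples and keep the best-scoring one."""
    x, y, yaw = pose
    best, best_s = (0.0, 0.0), -float("inf")
    for v in [0.1, 0.3, v_max]:
        for w in [-w_max, -0.5, 0.0, 0.5, w_max]:
            s = score(rollout(x, y, yaw, v, w), goal, obstacles)
            if s > best_s:
                best, best_s = (v, w), s
    return best

# Goal straight ahead, one obstacle offset to the left: the planner
# picks full speed while curving slightly away from the obstacle.
v, w = best_command((0.0, 0.0, 0.0), goal=(2.0, 0.0), obstacles=[(1.0, 0.5)])
```

The 0.5 m/s cap here merely echoes the hall speed range reported in the field tests; a real implementation would also restrict the sampled window to velocities reachable under the robot's acceleration limits within one control cycle.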

Citation: WANG Liangju, JIANG Feng, ZHANG Yalei, JIANG Wei, MENG Anqi, WANG Hongying, TENG Guanghui, TONG Qin. Autonomous Navigation Method for Mobile Robots in Commercial Chicken Farming Houses Based on Near-infrared Camera and LiDAR[J]. Transactions of the Chinese Society for Agricultural Machinery, 2026, 57(8): 375-385.

History
  • Received: 2025-01-15
  • Published online: 2026-04-15