Abstract: The increasing demand for inspection and precision operations in livestock housing places high requirements on navigation methods for autonomous mobile robots. However, existing navigation methods are often sensitive to lighting conditions, susceptible to environmental disturbances, and expensive, which limits their applicability. To address this challenge, a multimodal sensor fusion framework for robotic navigation was proposed that fused data from a near-infrared camera and a 2D LiDAR. By jointly optimizing the geometric features from the LiDAR and the semantic landmark information from the near-infrared camera, the framework overcame navigation bottlenecks in complex environments: the LiDAR provided robust localization over large-scale environments, while the near-infrared camera enhanced navigation accuracy. Feeder rib landmarks were identified and extracted from near-infrared images, and this landmark information was fused with data from wheel odometry, the LiDAR, and an inertial measurement unit (IMU). The Cartographer algorithm was employed to construct environmental maps, enabling high-precision localization. The pose information of target locations was systematically recorded to support global path planning, while local path planning incorporated the Trajectory Rollout and Dynamic Window Approach (DWA) algorithms to further ensure autonomous navigation precision. Field tests were conducted in stacked-cage poultry houses. Results demonstrated that the standard deviation of landmark positioning accuracy was less than 2 cm. In the hall, at speeds below 0.5 m/s, the standard deviation of single-point positioning error was less than 2 cm, and the standard deviation of yaw angle error was less than 1°.
In the aisles, at speeds below 0.3 m/s, the standard deviation of target-point positioning error was no more than 5.04 cm, and the standard deviation of yaw angle error was approximately 1°. Overall, the system achieved centimeter-level positioning accuracy in complex livestock housing environments, validating its applicability to inspection and precision operation tasks.
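The local planner mentioned above (the Dynamic Window Approach) works by sampling candidate velocity commands, forward-simulating each over a short horizon, and selecting the command whose predicted trajectory scores best. The sketch below is a minimal, hypothetical illustration of that sampling-and-scoring loop, not the paper's implementation: the cost function here considers only distance to the goal plus a small turn penalty, whereas a full planner (e.g. the one in the ROS navigation stack) also scores obstacle clearance and path alignment. All function names, limits, and weights are illustrative assumptions.

```python
import math

def simulate(x, y, th, v, w, dt=0.1, steps=10):
    # Forward-simulate a constant (v, w) command with a unicycle model
    # for a short horizon (here 1 s); returns the predicted pose.
    for _ in range(steps):
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
    return x, y, th

def dwa_select(pose, goal, v_max=0.5, w_max=1.0, nv=5, nw=11):
    """Pick the sampled (v, w) command whose rollout scores best.

    Hypothetical cost: Euclidean distance to the goal after the rollout
    plus a small penalty on angular speed. A real DWA cost also includes
    obstacle clearance, which is omitted here for brevity.
    """
    best, best_cost = (0.0, 0.0), float("inf")
    for i in range(nv):
        v = v_max * (i + 1) / nv                      # sample linear speeds
        for j in range(nw):
            w = -w_max + 2.0 * w_max * j / (nw - 1)   # sample angular speeds
            x, y, _ = simulate(*pose, v, w)
            cost = math.hypot(goal[0] - x, goal[1] - y) + 0.1 * abs(w)
            if cost < best_cost:
                best, best_cost = (v, w), cost
    return best

# For a goal straight ahead on the x-axis, the best command is to drive
# straight at the maximum sampled speed.
v, w = dwa_select((0.0, 0.0, 0.0), (2.0, 0.0))
print(v, w)
```

The 0.5 m/s cap on sampled speeds mirrors the hall speed limit reported in the abstract; in the narrower aisles a tighter window (e.g. 0.3 m/s) would be used.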