This paper presents a motion-coupled mapping algorithm for contour mapping of hybrid rice canopies, designed for Agricultural Unmanned Ground Vehicles (Agri-UGVs) navigating complex, unknown rice fields. Precise canopy mapping is essential for Agri-UGVs to plan efficient routes and avoid protected zones. The motion control of Agri-UGVs, tasked with impurity removal and other operations, depends heavily on accurate estimation of rice canopy height and structure. To this end, the proposed algorithm fuses real-time RGB-D sensor data with kinematic and inertial measurements, enabling efficient mapping and proprioceptive localization. It produces grid-based elevation maps that represent the probabilistic distribution of canopy contours while accounting for motion-induced uncertainties. The algorithm is implemented on a high-clearance Agri-UGV platform and tested in a range of environments, including both controlled settings and dynamic rice fields. This approach significantly enhances the mapping accuracy and operational reliability of Agri-UGVs, contributing to more efficient autonomous agricultural operations.
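The probabilistic grid elevation map described above can be illustrated with a minimal sketch. The snippet below performs a per-cell 1-D Kalman update of canopy height, where motion-induced uncertainty is modeled by inflating the measurement variance; this is an assumption for illustration only, not the paper's exact uncertainty-propagation scheme, and all names (`update_cell`, `motion_var`) are hypothetical.

```python
import numpy as np

def update_cell(mu, var, z, sensor_var, motion_var):
    """Fuse one height measurement z into a cell's (mean, variance)
    estimate with a 1-D Kalman update. motion_var inflates the
    measurement variance to reflect motion-induced uncertainty
    (illustrative assumption, not the paper's exact model)."""
    r = sensor_var + motion_var          # total measurement variance
    k = var / (var + r)                  # Kalman gain
    mu_new = mu + k * (z - mu)           # corrected cell elevation
    var_new = (1.0 - k) * var            # reduced cell uncertainty
    return mu_new, var_new

# Toy elevation grid: per-cell mean height and variance.
H, W = 4, 4
mean = np.zeros((H, W))
var = np.full((H, W), 1.0)               # high initial uncertainty

# Simulated RGB-D canopy-height return of 0.9 m at cell (1, 2);
# motion_var would grow with vehicle speed and vibration.
mean[1, 2], var[1, 2] = update_cell(
    mean[1, 2], var[1, 2], z=0.9, sensor_var=0.01, motion_var=0.04
)
```

With repeated measurements the variance shrinks and the mean converges toward the true canopy height, which is why the resulting map is a probabilistic contour distribution rather than a single raw height sample per cell.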