Recently, biped robot walking technology has developed significantly, though mainly in the context of blind walking schemes. To emulate human walking, robots need to step accurately onto the positions they perceive in unknown environments. In this paper, we present PolyMap, a perception-based locomotion planning framework for humanoid robots to climb stairs. Our core idea is to build a real-time polygonal staircase plane semantic map, followed by a footstep planner that operates on these polygonal plane segments. The plane segmentation and visual odometry are performed via multi-sensor fusion (LiDAR, RGB-D camera, and IMUs). The proposed framework is deployed on an NVIDIA Orin, producing whole-body motion planning output at 20-30 Hz. Both indoor and outdoor real-scene experiments show that our method is efficient and robust for humanoid robot stair climbing.
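To make the core representation concrete, below is a minimal illustrative sketch, not the paper's actual implementation: it assumes each detected stair plane is stored as a unit normal, an origin point, and a 2-D polygon boundary in the plane's local frame, so a footstep planner can validate candidate footholds with a point-in-polygon test. The class name PlaneSegment and its contains method are hypothetical, introduced here only for illustration.

```python
import numpy as np

class PlaneSegment:
    """Hypothetical polygonal plane segment: a unit normal, a point on the
    plane, and the polygon boundary expressed in a local 2-D frame."""
    def __init__(self, normal, origin, vertices_2d):
        self.normal = np.asarray(normal, dtype=float)
        self.normal /= np.linalg.norm(self.normal)   # normalize the plane normal
        self.origin = np.asarray(origin, dtype=float)
        self.vertices_2d = np.asarray(vertices_2d, dtype=float)  # (N, 2), CCW order

    def contains(self, point_2d):
        """Even-odd ray-casting test: is a projected footstep candidate
        inside the polygon boundary of this plane segment?"""
        x, y = point_2d
        inside = False
        v = self.vertices_2d
        j = len(v) - 1
        for i in range(len(v)):
            xi, yi = v[i]
            xj, yj = v[j]
            # Toggle 'inside' each time a ray from the point crosses an edge.
            if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
                inside = not inside
            j = i
        return inside

# Usage: check whether a candidate foothold lies on a stair tread
# (a 0.3 m deep, 1.0 m wide horizontal polygon at 0.15 m height).
tread = PlaneSegment(normal=[0, 0, 1], origin=[0, 0, 0.15],
                     vertices_2d=[(0, 0), (0.3, 0), (0.3, 1.0), (0, 1.0)])
print(tread.contains((0.15, 0.5)))  # True: candidate lies on the tread
```

A real planner would presumably also score candidates by kinematic reachability and clearance from the polygon edges, but a geometric membership test of this kind is the basic primitive for planning footsteps over polygonal plane segments.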