A fundamental challenge in robot perception is the coupling of sensor pose and robot pose. This coupling has motivated research in active vision, where the robot pose is changed to reorient the sensor toward areas of interest. Further, egomotion such as jitter, and external effects such as wind, degrade perception and require additional software effort such as image stabilization. The effect is particularly pronounced in micro-air vehicles and micro-robots, which are typically lighter and therefore subject to larger jitter, yet lack the computational capability to perform stabilization in real time. We present a novel microelectromechanical systems (MEMS) mirror LiDAR that changes the field of view of the LiDAR independently of the robot's motion. Our design has the potential for use on small, low-power systems, since the expensive components of the LiDAR can be placed external to the small robot. We show the utility of our approach in simulation and on prototype hardware mounted on a UAV. We believe that this LiDAR and its compact movable scanning design provide mechanisms to decouple robot and sensor geometry, simplifying robot perception. We also demonstrate motion compensation in hardware using IMU and external odometry feedback.
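The IMU-feedback motion compensation described above can be illustrated with a minimal sketch: the mirror is commanded in the direction opposite to the IMU-reported attitude deviation so that the beam stays fixed in the world frame despite robot jitter. The function names, the small-angle approximation, and the ±5° mirror deflection limit are illustrative assumptions, not details from the paper.

```python
import math

# Assumed mirror limit: MEMS scan mirrors typically deflect only a few degrees.
MIRROR_MAX_DEG = 5.0

def clamp(x, lo, hi):
    """Restrict x to the interval [lo, hi]."""
    return max(lo, min(hi, x))

def compensate(target_az_deg, target_el_deg, yaw_deg, pitch_deg):
    """Return mirror (azimuth, elevation) commands that hold the beam on a
    fixed world-frame direction despite robot rotation, using a small-angle
    approximation: subtract the IMU-reported attitude from the target angles,
    then clamp to the mirror's mechanical range."""
    az_cmd = clamp(target_az_deg - yaw_deg, -MIRROR_MAX_DEG, MIRROR_MAX_DEG)
    el_cmd = clamp(target_el_deg - pitch_deg, -MIRROR_MAX_DEG, MIRROR_MAX_DEG)
    return az_cmd, el_cmd

# A 2-degree pitch jitter is cancelled by tilting the mirror -2 degrees.
print(compensate(0.0, 0.0, 0.5, 2.0))  # -> (-0.5, -2.0)
```

In practice the same correction loop could also consume external odometry instead of raw IMU attitude; only the source of the attitude estimate changes, not the mirror command law.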