Vision-based object tracking is an essential precursor to autonomous aerial navigation and obstacle avoidance. Biologically inspired neuromorphic event cameras are emerging as a powerful alternative to frame-based cameras, owing to their asynchronous detection of intensity changes (even in poor lighting conditions), high dynamic range, and robustness to motion blur. Spiking neural networks (SNNs) have gained traction for processing events asynchronously in an energy-efficient manner. In parallel, physics-based artificial intelligence (AI) has gained prominence, as it enables embedding system knowledge via physical modeling inside traditional analog neural networks (ANNs). In this letter, we present an event-based physics-guided neuromorphic planner (EV-Planner) that performs obstacle avoidance using neuromorphic event cameras and physics-based AI. We consider the task of autonomous drone navigation, where the mission is to detect moving gates and fly through them while avoiding collisions. We use event cameras to perform object detection with a shallow spiking neural network in an unsupervised fashion. Utilizing the physical equations of the brushless DC motors driving the drone rotors, we train a lightweight, energy-aware physics-guided neural network (PgNN) with depth inputs. This network predicts the optimal flight time, which yields near-minimum-energy paths. We spawn the drone in the Gazebo simulator and implement a sensor-fused vision-to-planning neuro-symbolic framework using the Robot Operating System (ROS). Simulation results for safe, collision-free flight trajectories are presented, along with a performance analysis, an ablation study, and potential future research directions.
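The energy-aware planning step summarized above rests on a physical model of the rotors' brushless DC motors: shorter flights demand more thrust and thus more electrical power, while longer flights accumulate hover energy, so an interior optimum flight time exists. The following is a minimal sketch of that trade-off, not the letter's actual model; the motor constants, the bang-bang motion profile, and the simplified power equation are all illustrative assumptions. The PgNN's role is to learn a mapping from depth input to this near-minimum-energy flight time, which a grid search recovers here:

```python
import math

# Hypothetical motor and vehicle constants (illustrative only, not from the letter).
K_T = 1e-5   # thrust coefficient [N/(rad/s)^2]
K_D = 1e-7   # aerodynamic drag-torque coefficient [N*m/(rad/s)^2]
K_E = 0.01   # motor torque / back-EMF constant [N*m/A]
R_M = 0.2    # winding resistance [ohm]
MASS = 1.0   # drone mass [kg]
G = 9.81     # gravity [m/s^2]

def rotor_power(thrust_per_rotor):
    """Electrical power one BLDC rotor draws to hold a given thrust."""
    omega = math.sqrt(thrust_per_rotor / K_T)   # rotor speed from thrust model
    torque = K_D * omega ** 2                   # drag torque to overcome
    current = torque / K_E                      # motor current from torque
    return current ** 2 * R_M + torque * omega  # copper loss + mechanical power

def flight_energy(depth, t_flight, n_rotors=4):
    """Energy to cover `depth` metres in `t_flight` s with a bang-bang profile."""
    accel = 4.0 * depth / t_flight ** 2                 # accelerate half, brake half
    thrust_total = MASS * math.sqrt(accel ** 2 + G ** 2)  # tilted thrust + hover
    return n_rotors * rotor_power(thrust_total / n_rotors) * t_flight

# Grid-search the near-minimum-energy flight time for a given depth reading;
# this is the quantity the PgNN is trained to predict directly.
depth = 5.0  # distance to the gate [m] (example value)
times = [0.5 + 0.05 * i for i in range(191)]  # candidate flight times, 0.5..10 s
energies = [flight_energy(depth, t) for t in times]
t_opt = times[energies.index(min(energies))]
```

Flying as fast as possible is energetically punished by the quadratic copper losses, and flying too slowly accumulates hover power, so `t_opt` lands strictly inside the search interval; a learned predictor replaces this search at run time.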