NASA's forthcoming Lunar Gateway space station, which will be uncrewed most of the time, will need to operate with an unprecedented level of autonomy. Enhancing autonomy on the Gateway presents several unique challenges, one of which is to equip the Canadarm3, the Gateway's external robotic system, with the capability to perform worksite monitoring. Monitoring will involve using the arm's inspection cameras to detect any anomalies within the operating environment, a task complicated by the widely varying lighting conditions in space. In this paper, we introduce the visual anomaly detection and localization task for space applications and establish a benchmark with our novel synthetic dataset called ALLO (for Anomaly Localization in Lunar Orbit). We develop a complete data generation pipeline to create ALLO, which we use to evaluate the performance of state-of-the-art visual anomaly detection algorithms. Given the low tolerance for risk during space operations and the lack of relevant data, we emphasize the need for novel, robust, and accurate anomaly detection methods to handle the challenging visual conditions found in lunar orbit and beyond.