Ensuring neural network robustness is essential for the safe and reliable operation of robotic learning systems, especially in perception and decision-making tasks within real-world environments. This paper investigates the robustness of neural networks in perception systems, specifically examining their sensitivity to targeted, small-scale perturbations. We identify the Lipschitz constant as a key metric for quantifying and enhancing network robustness. We derive an analytical expression to compute the Lipschitz constant based on neural network architecture, providing a theoretical basis for estimating and improving robustness. Several experiments reveal the relationship between network design, the Lipschitz constant, and robustness, offering practical insights for developing safer, more robust robot learning systems.
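As a minimal sketch of the kind of architecture-based estimate the abstract describes: for a feed-forward network with 1-Lipschitz activations (e.g. ReLU), the product of the layers' spectral norms upper-bounds the network's Lipschitz constant. The weight shapes below are illustrative assumptions, not the paper's actual architecture, and this product bound is a standard (often loose) estimate rather than the paper's specific analytical expression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-layer MLP weights; shapes are illustrative assumptions.
weights = [
    rng.standard_normal((64, 32)),
    rng.standard_normal((32, 16)),
    rng.standard_normal((16, 4)),
]

def lipschitz_upper_bound(weights):
    """Upper-bound the Lipschitz constant of an MLP with 1-Lipschitz
    activations (e.g. ReLU) by the product of the layers' spectral norms."""
    bound = 1.0
    for W in weights:
        # The spectral norm (largest singular value) is the Lipschitz
        # constant of the linear map x -> W @ x.
        bound *= np.linalg.norm(W, ord=2)
    return bound

print(lipschitz_upper_bound(weights))
```

A smaller bound suggests the network amplifies small input perturbations less, which is the link between this quantity and robustness that the abstract draws.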