Open access to publications, software, and hardware is central to robotics: it lowers barriers to entry, supports reproducible science, and accelerates the development of reliable systems. However, openness also exacerbates the dual-use risks inherent in robotics research and innovation, lowering the barriers for state and non-state actors to develop and deploy robotic systems for military use and other harmful purposes. Compared with other engineering fields that carry dual-use risks, such as those underlying weapons of mass destruction (chemical, biological, radiological, and nuclear weapons), and even the field of AI, robotics has no specific regulation and little guidance on how research and innovation may be conducted and disseminated responsibly. While these other fields can serve as guides, robotics has its own needs and specificities that must be taken into account. The robotics community should therefore work toward its own sector-specific guidance and, possibly, regulation. To that end, we propose a roadmap focusing on four practices: a) education in responsible robotics; b) incentivizing risk assessment; c) moderating the diffusion of high-risk material; and d) developing red lines.