Critical systems, such as those used in healthcare, defence, and disaster management, demand rigorous requirements engineering (RE) to ensure safety and reliability. Yet much of this rigour has traditionally focused on technical assurance, often overlooking the human and social contexts in which these systems operate. This paper argues that human-centric aspects are an essential dimension of dependability, and presents a human-centred RE process designed to integrate social responsibility into critical system development. Drawing from a literature review, we identified a set of guidelines for designing software for vulnerable communities and translated these into sixty-two functional and non-functional requirements. We operationalised these requirements through the design of an adaptive early warning system prototype, which we then evaluated through six interviews and eight cognitive walkthroughs to validate the requirements' relevance and applicability. The findings demonstrate that human-centric requirements, when addressed early, enhance the usability and accessibility of systems for all users. The paper concludes by positioning human-centricity not as an ethical add-on but as a defining quality of safe and equitable critical systems.