Online communities serve as essential support channels for People Who Use Drugs (PWUD), providing access to peer support and harm reduction information. The moderation of these communities involves consequential decisions affecting member safety, yet existing sociotechnical systems provide insufficient support for moderators. Through interviews with experienced moderators of PWUD forums on Reddit, we examine the unique nature of this work and its implications for HCI and content moderation research. We demonstrate that this work constitutes a distinct form of public health intervention characterised by three challenges: (1) high-stakes risk evaluation requiring pharmacological expertise, (2) time-critical crisis intervention spanning platform content and external drug market surveillance, and (3) navigation of structural conflicts where platform policies designed to minimise legal liability directly oppose community harm reduction goals. Our findings extend existing HCI moderation frameworks by revealing how legal liability structures can systematically undermine expert moderators' protective work, with implications for other marginalised communities facing similar regulatory tensions, including abortion care and sex work contexts. We identify two necessary shifts in sociotechnical design: moving from binary classification to multi-dimensional approaches that externalise the competing factors moderators must balance, and shifting from low-level rule programming to high-level example-based instruction. However, we surface unresolved tensions around the sustainability of volunteer labour and the risks of incorporating automated systems into high-stakes health contexts, identifying open questions requiring HCI research attention. These findings inform the design of platforms that better accommodate vulnerable populations whose health needs conflict with regulatory frameworks.