Content moderation practices and technologies must evolve as requirements and community expectations shift. However, restructuring existing moderation practices can be difficult, especially on platforms that rely on their communities to conduct moderation, because changes can transform moderators' workflows and workloads as well as contributors' reward systems. Through a study of extensive archival discussions around Flagged Revisions, a prepublication moderation technology on Wikipedia, complemented by seven semi-structured interviews, we identify several challenges in restructuring community-based moderation practices. We find that while a new system may sound good in theory and perform well on quantitative metrics, it can conflict with existing social norms. Our findings also highlight how the intricate relationship between platforms and self-governed communities can hinder the assessment of any new system's performance and introduce considerable costs for maintaining, overhauling, or scrapping any piece of infrastructure.