Personal moderation tools on social media platforms allow users to control their feeds by configuring acceptable toxicity thresholds for feed content or by muting inappropriate accounts. This research examines how end-user configuration of these tools is shaped by four critical psychosocial factors: fear of missing out (FoMO), social media addiction, subjective norms, and trust in moderation systems. Findings from a nationally representative sample of 1,061 participants show that FoMO and social media addiction make Facebook users more vulnerable to content-based harms by reducing their likelihood of adopting personal moderation tools to hide inappropriate posts. In contrast, descriptive and injunctive norms positively influence the use of these tools. Further, trust in Facebook's moderation systems significantly affects users' engagement with personal moderation. This analysis highlights qualitatively distinct pathways through which FoMO and social media addiction leave affected users disproportionately unsafe, and it offers design and policy solutions to address this challenge.