This study presents the first large-scale comparison of persuasion techniques in crowd- versus professionally written debunks. Using extensive datasets from Community Notes (CNs), EUvsDisinfo, and the Database of Known Fakes (DBKF), we quantify the prevalence and types of persuasion techniques across these fact-checking ecosystems. Contrary to the prior hypothesis that community-produced debunks rely more heavily on subjective or persuasive wording, we find no evidence that CNs contain a higher average number of persuasion techniques than professional fact-checks. We additionally identify systematic rhetorical differences between CNs and professional debunking efforts, reflecting differences in institutional norms and topical coverage. Finally, we examine how the crowd evaluates persuasive language in CNs and show that, although notes with more persuasive elements receive slightly higher overall helpfulness ratings, crowd raters are effective at penalising the use of particular problematic rhetorical means.