At least since Francis Bacon, the slogan 'knowledge is power' has been used to capture the relationship between information and decision-making at the group level. Being able to shape a group's informational environment is a way to shape its decisions; it is, in effect, a way to make decisions on its behalf. This paper focuses on strategies that are designed to affect the decision-making capacities of groups by shaping their ability to take advantage of information in their environment. Among these, the best known are political rhetoric, propaganda, and misinformation. The phenomenon this paper singles out is a relatively new addition to that family, which we call slopaganda. According to The Guardian, News Corp Australia is currently churning out 3,000 'local' generative AI (GAI) stories each week. In the coming years, such 'generative AI slop' will present multiple knowledge-related (epistemic) challenges. We draw on contemporary research in cognitive science and artificial intelligence to diagnose the problem of slopaganda, describe some recent troubling cases, and then suggest several interventions that may help to counter it.