Though online platforms claim to amplify Indigenous voices, Indigenous communities are worried that these systems are instead eroding their language and culture. We conduct a community-informed algorithmic audit to explore whether online platforms sustain or endanger Indigenous cultural practice. First, we review ethnographic research pertaining to the cultural anxieties of a specific Indigenous community, as Indigenous peoples are not a monolith. We consider concerns from Kyrgyz communities who believe that platforms are expanding Russia's linguistic influence and threatening their language. Next, we construct and conduct an algorithmic audit in conversation with the community. Our audit investigates deep-seated fears among Kyrgyz caregivers that YouTube encourages their children to speak Russian instead of Kyrgyz, their heritage language. We measure how the YouTube recommendation algorithm prioritizes content across Indigenous and non-Indigenous languages for child users. Our results validate caregiver concerns, as we find that YouTube primarily recommends non-Kyrgyz content to Kyrgyz children, even when children signal clear preferences for Kyrgyz content. Thus, platform recommendations reinforce Kyrgyz children's offline uptake of colonial language ideologies. Finally, we evaluate strategies to align platform behavior with Indigenous values. We identify effective end-user practices for reducing the proportion of Russian-language YouTube recommendations, like cross-generational device sharing. Overall, our work uncovers how platforms can amplify colonial influence, rather than revitalizing Indigenous cultural heritage. We encourage researchers to consider how algorithmic systems can reimpose oppressive power structures that decolonial efforts have sought to dismantle.