Moderating Digital Societies

Wednesday, 9 July 2025: 15:45
Location: FSE018 (Faculty of Education Sciences (FSE))
Oral Presentation
Ronja TRISCHLER, TU Dortmund, Germany
Whether in work meetings, professional conferences, media broadcasts, panel discussions, or social media groups: across practices and media, moderation is an important social practice for mediating different positions, forms of access, and kinds of expertise. This makes it a powerful discursive method for collaborative work on societal challenges. As an evaluative practice, however, it also carries the risk of preventing issues, contributors, or positions from entering public (or private) discourse, or of excluding them from it. While moderation has long played this role, its scope has expanded in digital societies. The paper presents a praxeological approach inspired by Science and Technology Studies (STS) to examine how moderation enables, limits, and organizes the coordination of discursive cooperation under digital conditions.

Moderation, understood from its Latin origin as regulation and control, refers to the ordering of social discourses and interactions. Today it encompasses socio-technical processes that vary in practice between online and offline settings and in their degrees of automation and professionalization. Moderation, its infrastructures, and public discussion more broadly are increasingly technified through the use of algorithms and artificial intelligence, including semi-automated content moderation on platforms; on the human side, moderation includes both paid work and unpaid volunteer work. In social science and media research, moderation is usually understood as the skill and task of individual humans who are confronted with technological alternatives (such as bots or automated content moderation). In contrast, the proposed STS-inspired praxeological approach locates moderation as a distributed, temporalized socio-technical practice, i.e. as material-semiotic, interrelated, situationally unfolding activities between human and non-human participants. This makes it possible not only to trace the practical organization of moderation (and when it fails), but also to consider how the subject being moderated itself contributes to moderation.