Moderating Digital Societies
Moderation, understood in the sense of its Latin origin as regulation and control, refers to the ordering of social discourse and interaction. Today it encompasses socio-technical processes in their practical variations between online and offline settings, as well as in varying degrees of automation and professionalization of collaboration. Moderation, and public discussion more broadly, is undergoing an ongoing technification of its infrastructures through the use of algorithms and artificial intelligence, including semi-automated content moderation on platforms. On the human side, this involves both paid work and unpaid volunteer labour. In social science and media research, moderation is usually understood as the skill and task of individual humans who are confronted with technological alternatives (such as bots or automated content moderation). In contrast, the STS-inspired praxeological approach proposed here locates moderation as a distributed, temporalized socio-technical practice, i.e. as material-semiotic, interrelated, situationally unfolding activities between human and non-human participants. This makes it possible not only to trace the practical organisation of moderation (and where it fails), but also to consider how the subjects being moderated themselves contribute to moderation.