When Bots Evaluate Humans: Delegation to Bots and the Reshaping of Authority
Nadine Ostern et al.
Abstract
Information systems scholars typically frame the delegation of tasks to AI-based bots as a means of improving efficiency and supporting decision-making. Yet when evaluative tasks are delegated to supervisory bots, authority is transferred to them as well. This study examines how such authority, once conferred on bots, is contested and redistributed by the humans whose work is subject to the bot’s evaluation. We investigate this process in the context of Wikipedia, an online peer-production community in which an antivandalism bot was given the task of autonomously reverting vandalized (i.e., maliciously edited) contributions. Drawing on the concept of performative authority, we trace how community members negotiated the enactment of the bot’s newfound authority with the bot’s developers through authority-negotiation design moves. We found that these moves were the outcome of a recursive process marked by the redistribution of authority across actors in decisions on vandalism, the institutionalization of structures that shaped how authority was exercised, and the evolving actions of the bot itself, which triggered renewed contestation over time. Within the context of a peer-production community, our study offers fresh insight into how authority delegated to bots evolves as a dynamic cocreation process after delegation and how humans reclaim authority once it has been ceded to a supervisory bot. We shed light on how humans preserve authority, professional discretion, and the meaningfulness of their work in AI-mediated settings.