Ruhr-Uni-Bochum

Lay Perceptions of Algorithmic Discrimination in the Context of Systemic Injustice

2025

Authors

Yixin Zou, Markus Langer, Nina Grgić-Hlača, Gabriel Lima

Research Hub

Research Hub D: Usability

Research Challenges

RC 11: End-users and Usability

Abstract

Algorithmic fairness research often disregards concerns related to systemic injustice. We study how contextualizing algorithms within systemic injustice impacts lay perceptions of algorithmic discrimination. Using the hiring domain as a case study, we conduct a 2×3 between-participants experiment (N = 716), studying how people’s views of algorithmic fairness are influenced by information about (i) systemic injustice in historical hiring decisions and (ii) algorithms’ propensity to perpetuate biases learned from past human decisions. We find that shedding light on systemic injustice has heterogeneous effects: participants from historically advantaged groups became more negative about discriminatory algorithms, while those from disadvantaged groups reported more positive attitudes. Explaining that algorithms learn from past human decisions had null effects on people’s views, adding nuance to calls for improving public understanding of algorithms. Our findings reveal that contextualizing algorithms within systemic injustice can have unintended consequences and show how different ways of framing existing inequalities influence perceptions of injustice.

Tags

Education
Empirical Studies on the Perception of Security and Privacy
Behavior