Algoritmische Rechtvaardigheid [Algorithmic Justice]
| Authors | |
|---|---|
| Publication date | 12-2024 |
| Journal | Algemeen Nederlands Tijdschrift voor Wijsbegeerte |
| Volume | 116 |
| Issue number | 4 |
| Pages (from-to) | 369-387 |
| Organisations | |
| Abstract |
Algorithmic bias can lead to harmful forms of algorithmic discrimination. In this article, I argue that technology does not exist in a vacuum and is always part of power relations. I therefore criticize technological fixes that reduce social problems to a technical solution. Dominant solutions like ‘debiasing’, while important, avoid questions about deep-rooted injustices. They ‘accept’ and work with and within the frames of existing social (power) structures. Justice requires considering the structural dimensions of inequality. I draw attention to Langdon Winner’s call to ask whether a technology is ‘just’ rather than approaching the issue of algorithmic discrimination from a solutionist angle of optimization and functionality. I propose that we draw inspiration from the work of philosophers who approach justice from a structural or systemic perspective. This results in a philosophical approach that stretches the concept of ‘discrimination’ and exposes the relationships between inequalities. Moreover, it questions the structures and boundaries in which the technology is embedded. Finally, I criticize the current hype around AI that distracts us from the fact that we have had existential problems with AI for a long time and that these problems are deeply intertwined with (the history of) our social power relations.
|
| Document type | Article |
| Language | Dutch |
| Published at | https://doi.org/10.5117/ANtW2024.4.004.LANZ |
| Downloads | ANTW2024.4.004.LANZ (Final published version) |
