Symons, John and Alvarado, Ramón (2022) Epistemic Injustice and Data Science Technologies. [Preprint]
Abstract
Technologies that deploy data science methods are liable to result in epistemic harms involving the diminution of individuals with respect to their standing as knowers or their credibility as sources of testimony. Not all harms of this kind are unjust, but when they are, we ought to try to prevent or correct them. Epistemically unjust harms will typically intersect with other more familiar and well-studied kinds of harm that result from the design, development, and use of data science technologies. However, we argue that epistemic injustices can be distinguished conceptually from more familiar kinds of harm. We argue that epistemic harms are morally relevant even in cases where those who suffer them are unharmed in other ways. Via a series of examples from the criminal justice system, workplace hierarchies, and educational contexts, we explain the kinds of epistemic injustice that can result from common uses of data science technologies.
Item Type: Preprint
Creators: Symons, John; Alvarado, Ramón
Keywords: AI ethics, epistemic injustice, data science
Subjects: Specific Sciences > Artificial Intelligence > AI and Ethics; General Issues > Values in Science
Depositing User: John Symons
Date Deposited: 03 Mar 2022 17:49
Last Modified: 03 Mar 2022 17:49
Item ID: 20292
Date: 29 January 2022
URI: https://philsci-archive.pitt.edu/id/eprint/20292