Abstract

Over the past decade, feminist philosophers have gone a long way toward identifying and explaining the phenomenon that has come to be known as epistemic injustice. Epistemic injustice is injustice occurring within the domain of knowledge (e.g., knowledge production and transmission), and it typically impacts structurally marginalized social groups. In this paper, we argue that, as they currently operate, algorithms on social media exacerbate the problem of epistemic injustice and related problems of social distrust. In other words, we argue that algorithms on social media recreate and reify the conditions that lead to some groups being systematically denied the full status of knowers, thereby corrupting the epistemic terrain and, with it, systems of social trust and cooperation. We argue that algorithms do this in two ways—namely, via what we are calling algorithmic targeting and algorithmic sorting.
