Abstract
How should we think about the ways search engines can go wrong? Following the publication of Safiya Noble's Algorithms of Oppression (Noble, 2018), a view has emerged that racist, sexist, and other problematic results should be thought of as indicative of algorithmic bias. In this paper, I offer an alternative angle on these results, building on Noble's suggestion that search engines are complicit in a racial contract (Mills, 1997). I argue that racist and sexist results should be thought of as part of the workings of the social system of white ignorance. Along the way, I will argue that we should think about search engines not as sources of testimony, but as information-classification systems, and make a preliminary case for the importance of the social epistemology of technology.