Abstract

In 2020, the Office of the Privacy Commissioner of Canada (OPCC) led a joint federal-provincial investigation into privacy violations stemming from the use of facial recognition technologies. The investigation was prompted specifically by the mobilization of Clearview AI’s facial recognition software in law enforcement, including by regional police services and the Royal Canadian Mounted Police. Clearview AI’s technology is based on scraping social media images, a practice the investigation found to violate provincial and federal private-sector privacy legislation. In response, Clearview AI claimed that user consent for scraping social media images was not required because the information is already public. This common fallacy about social media privacy serves as a pivot point for integrating digital policy literacy into the OPCC’s digital literacy materials, so that the regulatory environment around digital media is considered alongside its political-economic and infrastructural components. Digital policy literacy is a model that expands what is typically an individual- or organization-level responsibility for privacy protection by attending to the wider socio-technical context in which a company like Clearview can emerge.
