Abstract

Evidence shows that the appropriate use of technology in education has the potential to increase the effectiveness of, e.g., teaching, learning and student support. There is also evidence that technology can introduce new problems and ethical issues, e.g., student privacy. This conceptual article maps some limitations of technological approaches to ensuring student data privacy in learning analytics from a critical data studies (CDS) perspective. We map the claims, grounds and warrants of technological solutions to maintaining student data privacy in learning analytics. Our findings suggest that many technological solutions rest on assumptions, such as that individuals have control over their data ('data as commodity'), which can be exchanged under agreed conditions, or that individuals embrace their personal data privacy as a human right to be respected and protected. Regulating student data privacy in the context of learning analytics through technology depends mostly on institutional data governance, consent, data security and accountability. We consider alternative approaches to viewing (student) data privacy, such as contextual integrity; data privacy as ontological; group privacy; and indigenous understandings of privacy. Such perspectives destabilise many assumptions informing technological solutions, including privacy-enhancing technologies (PETs).

Practitioner notes

What is already known about this topic
- Various actors (including those in higher education) have access to and collect, use and analyse greater volumes of personal (student) data, with finer granularity, increasingly from multiple platforms and data sources.
- There is growing awareness of, and concern about, individual (student) privacy.
- Privacy-enhancing technologies (PETs) offer individuals a range of solutions to protect their data privacy.

What this paper adds
- A review of the assumption that technology provides adequate or complete solutions for ensuring individual data privacy.
- A mapping of five alternative understandings of personal data privacy and their implications for technological solutions.
- Consideration of the implications for the protection of student privacy in learning analytics.

Implications for practice and/or policy
- Student data privacy is not only a technological problem to be solved but should also be understood as a social problem.
- The use of PETs offers some solutions for data privacy in learning analytics.
- Strategies to protect student data privacy should include student agency, literacy and a whole-system approach.
