Abstract

Recent legislative initiatives have made non-consensual pornography a crime in several countries, and social media providers have a duty to offer their users easy-to-use mechanisms for reporting abuse. In this paper, we analyse the state of the art of interfaces for reporting non-consensual pornography from the victim's perspective. Firstly, we analysed 45 content-sharing platforms where aggressors might post non-consensual pornography. The analysis identified three distinct interaction styles for reporting the crime: Scriptum (a text field where the user verbally describes the abuse), Bonam (a multilayered menu that includes a correct option), and Malam (a multilayered menu that does not include a correct option). Secondly, we conducted a within-subjects study to evaluate the experience elicited by these interaction styles. Participants (N = 39) were given a scenario and asked to report six blurred images as non-consensual pornography using a medium-fidelity prototype. The results exposed complex trade-offs between clarity, efficiency, and distress across the different interaction styles. These trade-offs open foundational research directions that transcend the boundaries between human-computer interaction and multimedia studies and interface computer science research with the law.
