Pornography platforms are increasingly required by their payment processor business partners to mitigate harm in their content management systems through algorithmic moderation. Demands that adult merchants incorporate these tools are not proportional to instances of harmful content, but rather a response to the widespread conflation of pornography with harm and risk online. This paper explores co-governance by payment processors calling for algorithmic tools through the case of Pornhub, asking: what standards are required by financial firms, how are these enforced on platforms, and what effects does this arrangement have on porn content? I open with key context regarding the deplatforming of sex, antiporn campaigning, and constructions of harm through 'reputational risk'. Following this, I detail financial firms' infrastructural influence in platform co-governance. Next, a close reading of adult merchant terms identifies specific clauses calling for algorithmic moderation. Concluding this issue mapping, I provide a taxonomy of moderation tools in place on Pornhub. I close with an issue discussion to consider AI's positioning as a regulatory solution, CSAM data ethics, moderator labour, and the many technical problems obscured by promises of safety through automated content management systems. The resulting review of algorithmic measures enforced by financial firms offers a detailed case of the opaque governance conditions imperilling sexual expression across porn platforms.