Abstract
Age verification is currently gaining traction among some Western democracies as a means to restrict minors’ access to online pornography. In this article we consider the ramifications of applying age estimation software to this task. We analyse a public dataset of 10,139 facial images processed through a commonly used high-performance convolutional neural network approach and find significant inconsistencies in classification performance. Notably, the software demonstrates racial bias, with the highest accuracy for the Caucasian category and the lowest accuracy for the African category. It also displays age and gender bias, with lower accuracy for young males than for young females. Beyond this underwhelming technical performance, we argue that the concept of employing automated processes to restrict access to pornography is not only problematic but fundamentally misconceived. The systems proposed to automate age verification create greater risks to user data privacy and divert resources that could be spent on strategies proven to support healthy sexual development. Ultimately, mandatory age verification systems create barriers for post-pubescent young people seeking information about sex online. Our study concludes that the underlying problem with age verification is therefore not only technical but more profoundly political: even if the system can be made to work, it should not be.