Abstract

In this contribution to a symposium on “Data and Democracy” hosted by the Knight First Amendment Institute, we explore the administrative state’s growing use of complex statistical models and the challenges this trend poses for accountable administrative governance. We document how agencies’ use of big data can obscure critical framing decisions underlying policies, hide subjectivity in the design and development of models, and undermine scientific integrity. Legal process requirements should, in theory, counteract these tendencies to sideline public deliberation and oversight. But in practice, the threat of judicial review, protracted comment processes, and other features of administrative law sometimes tacitly reward agencies for developing and using algorithmic tools that are inaccessible to the public. To address these challenges, we propose standardized, interdisciplinary processes that encourage agency staff to explain comprehensibly, following best practices, the framing, algorithm choices, and procedures used to ensure the integrity of their analyses. We also suggest rewards, such as increased judicial deference for accessible explanations, to promote the development of high-quality, transparent models.
