Abstract

The title of this essay announces its core ambition: to propose a model of reading literary texts that synthesizes familiar humanistic approaches with computational ones. In recent years, debates over the use of computers to interpret literature have been fierce. On one side, scholars such as Franco Moretti, Matthew Jockers, Matthew Wilkens, and Andrew Piper defend the deployment of sophisticated machine techniques, like topic modeling and network analysis, to expose macroscale patterns of language and form culled from massive digitized literary corpora.1 On the other side, scholars such as Alexander Galloway, David Golumbia, Tara McPherson, and Alan Liu, who work in the field of New Media Studies, have criticized machine techniques for reducing the complexity of literary texts to mere “data” or for being incommensurable with the goals of critical theory.2 Here we move beyond this impasse by modeling a form of literary analysis
