Differences that Make No Difference and Ambiguities that Do
- Research Article
- 10.1086/703888
- Aug 1, 2019
- Modern Philology
<i>Enumerations: Data and Literary Study</i>. Andrew Piper. Chicago: University of Chicago Press, 2018. Pp. xiii+243.
- Preprint Article
- 10.59348/5pgkm-pzh88
- Apr 7, 2020
In ultra-exciting news – thanks to my Leverhulme Prize – I am very pleased to be able to say that my book, <i>Close Reading with Computers: Textual Scholarship, Computational Formalism, and David Mitchell’s Cloud Atlas</i>, is now openly accessible (gold OA under a CC BY-SA 4.0 license) at Stanford University Press! It will soon be in the OAPEN Library and on the Stanford site, but for now it’s freely available in BIROn.
- Single Book
- 10.21627/9781503609372
- Jan 1, 2019
This book is the first full-length monograph to bring a range of computational methods to bear, in a sustained fashion, on a single novel at the micro-level. While most contemporary digital studies are interested in distant-reading paradigms for large-scale literary history – using their digital methods as a telescope – following calls by Alan Liu and Tanya E. Clement, Close Reading with Computers instead asks what happens when such techniques function as a microscope.
- Preprint Article
- 10.59348/2cny1-1t614
- Jun 4, 2019
Today marks the publication of my latest book, _Close Reading with Computers: Textual Scholarship, Computational Formalism, and David Mitchell's_ Cloud Atlas, at Stanford University Press.
- Research Article
- 10.1093/english/efz050
- Dec 11, 2019
- English: Journal of the English Association
Close Reading with Computers: Textual Scholarship, Computational Formalism, and David Mitchell’s Cloud Atlas. By Martin Paul Eve
- Research Article
- 10.3368/ss.46.3.76
- Jan 1, 2017
- SubStance
David Mitchell’s Cloud Atlas (2004) contains six different generic registers. This article is the first to explore computationally the linguistic mechanisms that create these genre effects. Authorship attribution techniques incorrectly cluster the chapters of Cloud Atlas as distinct ‘authors’ using anything above the nineteen most-common words. This has implications for understandings of literary style and authorship. The seafaring parts of Mitchell’s novel, however, do not correlate with the writings of Herman Melville using Burrows’s delta method. Part-of-speech trigram visualization and analysis reveals the unique present-tense linguistic phrasings (NNP NNP VBZ and NNP VBZ DT) that lend pace to the Luisa Rey section of the novel.
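The attribution result above turns on two standard techniques: ranking a corpus's most common words and comparing texts by Burrows's delta (the mean absolute difference of z-scored word frequencies). The following is a minimal, self-contained sketch of that measure – not the article's actual code, and the corpus is a placeholder – with the nineteen-word cutoff echoing the abstract:

```python
from collections import Counter
import statistics

def word_freqs(text):
    """Relative frequency of each word in a text."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    return {w: c / total for w, c in counts.items()}

def burrows_delta(corpus, text_a, text_b, n_words=19):
    """Burrows's delta between two texts, z-scored against a corpus.

    Features are the corpus's n_words most common words; delta is the
    mean absolute difference of the two texts' z-scores on them.
    """
    all_words = Counter()
    for t in corpus:
        all_words.update(t.lower().split())
    features = [w for w, _ in all_words.most_common(n_words)]

    freqs = [word_freqs(t) for t in corpus]
    means = {w: statistics.mean(f.get(w, 0.0) for f in freqs) for w in features}
    # Guard against zero spread so the division below is always defined.
    stdevs = {w: statistics.pstdev([f.get(w, 0.0) for f in freqs]) or 1.0
              for w in features}

    def z(f):
        return {w: (f.get(w, 0.0) - means[w]) / stdevs[w] for w in features}

    za, zb = z(word_freqs(text_a)), z(word_freqs(text_b))
    return sum(abs(za[w] - zb[w]) for w in features) / len(features)
```

A lower delta suggests closer stylistic kinship; the article's Melville comparison is a null result under exactly this kind of measure.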
- Research Article
- 10.1016/j.langsci.2021.101359
- Feb 2, 2021
- Language Sciences
Close reading and distance: between invariance and a rhetoric of embodiment
- Research Article
- 10.1353/sub.2017.0033
- Jan 1, 2017
- SubStance
Close Reading with Computers: Genre Signals, Parts of Speech, and David Mitchell’s Cloud Atlas
- Research Article
- 10.1215/00295132-9353971
- Nov 1, 2021
- Novel
Close Reading on a Global Scale
- Book Chapter
- 10.1017/9781009273497.011
- Dec 22, 2022
The book has argued – via keywords and close readings – that certain forms of literary representation derive from their author’s concerns about what we now call secularization. It posits that cultural concepts (faith, worlds, nostalgia), forms of mentation (indulgence, figuring), literary forms (novelistic narration, historical fiction), and even fiction and modes of reading themselves result from the conservative orientation of their authors. In so doing, it argues for treating the secular and the postsecular as relevant not just to politics or religion but also to literary forms and innovation, theories of mind, and conceptualizations of temporality and mentation more generally. In fact, a central insight of the book is that the postsecular is motivated not necessarily by political or religious opposition – or even by a renegotiation of the relationship between the religious and the secular – but rather by changes wrought by secularization across the spheres of cultural and social life, and it argues that the literary sphere provided both the site and the methods for that process. This study has also demonstrated that secularization and liberalism are not separate from postsecularization and conservatism – rather, they are interdependent, as this study’s keywords suggest: Faith and indulgence, transformed for a secular system, make possible belief and toleration; imagining worlds and reading literary history are embedded in secular spatiality and temporality, even as they reveal the offenses and limitations wrought by these secular categories; passivity and the revolution/nostalgia dynamic both keep alive and keep in check the liberal fiction of human agency. The conservative and the postsecular are thus constantly in tension with the liberal and secular. This is why it is no coincidence that the concerns of people not well served by liberalism – not just royals but also women and enslaved people – occur repeatedly in this study. 
This is also why the two characteristics that most distinguish these conservative writers – their opposition to revolution and their innovative literary formations – are connected.
- Research Article
- 10.1093/llc/fqac074
- Jan 9, 2023
- Digital Scholarship in the Humanities
This paper examines some of the issues surrounding this ‘computational turn’ and proposes a ‘human-focused’ approach. It begins by addressing questions concerning whether there is a need for empirical evidence in literary studies, and the roles played by human and technical agents in interpretative practices. It adopts Don Ihde’s postphenomenological ideas (especially the ‘embodiment’ and ‘hermeneutic’ human-technology relations) to expatiate on the nature of the relationship between a literary scholar and a computational tool during interpretative practices. On the one hand, it uses postphenomenology as a theoretical framework to provide rich conceptual terminology with which to interrogate the humanist–computer relationship within the practice of computational textual analysis. On the other hand, it uses the postphenomenological method to highlight the notion of subjectivity in contrast to the promise of observer-independent objectivity. The research appraises the impact of quantification and visualization in literary studies using the research output of Franco Moretti and his colleagues at the Stanford Lit Lab, and performs textual experimentation on several corpora using Stefan Sinclair and Geoffrey Rockwell’s Voyant tools and Python NLP packages. It refutes the idea that quantification and visualization of textual data with computational tools and methods could guarantee objectivity of textual interpretation in literary studies. The argument in this paper is divided across three sections. The first section discusses the goal of reading; it concerns ‘close reading’ and ‘distant reading’. The second section questions the possibility of objectivity in textual interpretation using quantifiable and visual evidence provided as the output of the computational analysis of humanistic texts.
The third section defends the following claims: a) the human person is the principal actor in the interpretation process, and all other forms of representation or visualisation of the text are meant to aid humans; b) data in the humanities are not limited to printed texts, but include digitised, born-digital, and electronic texts in various digital forms (images, sounds, videos, etc.); c) the humanities aim more at interpretive practices than at a quest for verifiable knowledge; d) interpretative practices in the humanities focus on humans and their experiences; e) attention must be drawn to human developers’ subjectivity whenever we use computational interpretative tools, on the grounds that this will help in bracketing our biases, prejudices, preconceptions, and theoretical frameworks. The paper concludes with the argument that the computational tools used for textual analysis in the humanities themselves need interpretation, as they are not neutral in hermeneutic practices. It argues that the humans involved in creating those tools are prone to errors, have preferences, and incorporate their subjective ideas into development processes. It then proposes ‘hermeneutical postphenomenology’ as an ideal lens through which the claim to objectivity can be debunked, and recommends that our productivity in textual scholarship can be enriched when we understand the true nature of the relationship between the human inquirer and technical agents within the cognitive assemblage.
- Research Article
- 10.16995/olh.82
- Aug 10, 2016
- Open Library of Humanities
In 2003, David Mitchell’s editorial contact at the US branch of Random House left the publisher, leaving the American edition of Cloud Atlas (2004) without an editor for approximately three months. Meanwhile, the UK edition of the manuscript was undergoing a series of editorial changes and rewrites that were never synchronised back into the US edition of the text. When the process was resumed at Random House under the editorial guidance of David Ebershoff, changes from New York were likewise not imported back into the UK edition. In the section entitled ‘An Orison of Sonmi ~451’ these desynchronised rewritings are nearly total at the level of linguistic expression between UK and US paperback/electronic editions, and a range of sub-episodes feature in only one or the other of the published editions. Within the constraints of copyright on contemporary fiction, this article sets out this textual variance and visually plots the re-ordering and re-writing of the Sonmi section of the novel across versions. Further to this, I also signal here a number of reasons why critics might need to consider the production processes of contemporary fiction in order to deal with the multiple and different editions of this text and other contemporary novels.
- Research Article
- 10.55016/ojs/jah.v2025y2025.80848
- Jan 23, 2025
- Journal of Applied Hermeneutics
This paper explores two important ways in which close reading differs from natural language processing, the use of computer programming to decode, process, and replicate messages within a human language. It does so in order to highlight distinctive features of close reading that are not replicated by natural language processing. The first point of distinction concerns the nature of the meaning generated in each case. While natural language processing proceeds on the principle that a text’s meaning can be deciphered by applying the rules governing the language in which the text is written, close reading is premised on the idea that this meaning lies in the interplay that the text prompts within readers. While the semantic theory of meaning upon which natural language processing programs are based is often taken for granted today, I draw from phenomenological and hermeneutic theories, particularly Wolfgang Iser and Hans-Georg Gadamer, to explain why a different theory of meaning is necessary for understanding the meaning generated by close reading. Second, while natural language processing programs are considered successful when they generate what epistemologists call true beliefs about a text, I argue that close reading aims first and foremost at the development, not of true belief, but of understanding. To develop this distinction, I draw from recent scholarship on the epistemology of education, including work by Duncan Pritchard, to explain how understanding differs from true belief and why attainment of the latter is less educationally significant than the former.
- Dissertation
- 10.25394/pgs.15057330.v1
- Jul 27, 2021
This project presents a case study of postmodern trauma, working at the boundaries of the humanities and computer science to produce an in-depth examination of trauma writing in David Foster Wallace’s novel Infinite Jest. The goal of this project is to examine the intricacies of syntax and language in postmodern trauma writing through an iterative process I refer to as broken reading, which combines traditional humanities methodologies (close reading) and distant, computational methodologies (Natural Language Processing). Broken reading begins with close reading, then ventures into the distant reading processes of sentiment analysis and entity analysis, and then returns again to close reading when the data must be analyzed and the broken computational elements must be corrected. While examining the syntactical structure of traumatic and non-traumatic passages through this broken reading methodology, I found that Wallace represents trauma as gendered. The male characters in the novel, when recollecting past traumata or undergoing traumatic events, maintain their subject status, recognize those around them as subjects, and are able to engage actively with the world around them. On the other hand, the female characters in the novel are depicted as lacking the same capacities for subjectivity and action. Through computational text analysis, it becomes clear that Wallace writes female trauma in a way that reflects their lack of agency and subjectivity while he writes male trauma in a way that maintains their agency and subjectivity. Through close reading, I was able to discover qualitative differences in Wallace’s representations of trauma and form initial observations about syntactical and linguistic patterns; through distant reading, I was able to quantify the differences I uncovered through close reading by conducting part of speech tagging, entity analysis, semantic analysis, and sentiment analysis. 
Distant reading led me to discover elements of the text that I had not noticed previously, despite the occasional flaw in computation. The analyses I produced through this broken reading process grew richer because of failure—when I failed as an interpreter, and when computational analysis failed, these failures gave me further insight into the trauma writing within the novel. Ultimately, there are marked syntactical and linguistic differences between the way that Wallace represents male and female trauma, which points toward the larger question of whether other white male postmodern authors gender trauma in their writings, too. This study has generated a prototype model for the broken reading methodology, which can be used to further examine postmodern trauma writing.
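The "broken reading" pipeline above alternates close reading with computational passes such as sentiment analysis. As a toy illustration of the distant-reading half of that loop – not the dissertation's actual tooling, and with a tiny placeholder lexicon rather than a real one – a lexicon-based sentiment score for a passage might look like this:

```python
# Hypothetical mini-lexicons; real sentiment analysis uses large curated
# lexicons or trained models rather than handfuls of words.
POSITIVE = {"joy", "calm", "hope", "safe", "love"}
NEGATIVE = {"pain", "fear", "broken", "loss", "hurt"}

def sentiment_score(passage):
    """Net sentiment of a passage: (positive hits - negative hits) / tokens.

    Tokens are whitespace-split words with surrounding punctuation stripped;
    the score is 0.0 for an empty passage.
    """
    tokens = [t.strip(".,;:!?\"'").lower() for t in passage.split()]
    if not tokens:
        return 0.0
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)
```

Scoring traumatic and non-traumatic passages separately, as the dissertation does at much greater sophistication, is then a matter of running such a function over each passage set and comparing the distributions.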
- Research Article
- 10.59075/p1qfq017
- May 25, 2025
- The Critical Review of Social Sciences Studies
The 1947 Partition of British India is among the most traumatic events in South Asian history, displacing over 14 million people and killing almost a million (Talbot & Singh, 2009). Literature born out of this break in history records the collective and psychological traumas suffered by individuals and groups. The work of trauma and memory in South Asian Partition fiction is critically examined in this study, with a focus on narrative meaning in the works of Saadat Hasan Manto, Khushwant Singh, Bapsi Sidhwa, Amrita Pritam, Intizar Hussain, and others. Drawing on trauma theory (LaCapra, 2001; Caruth, 1996) and memory studies (Assmann, 2011), the research investigates the ways in which novels write individual suffering onto shared memory and reconstruct violent histories. Its interests lie in fractured identities, suppressed histories, gendered trauma, and the intergenerational transmission of memory. Through close reading and thematic analysis, the essay explores Partition literature as a site of witnessing, grieving, and resistance to the erasures of official history. The paper also reveals the transnational reach of Partition memory in diaspora fiction. Bringing together literary analysis, historical fiction, and theories of trauma, this essay excavates the high-stakes interplay of narrative, memory, and self in the post-Partition literary imagination. The evidence presented illustrates how Partition literature is not merely a record of suffering but a site of testimonial history and moral intervention.