The American Comparative Literature Association (ACLA) Annual Conference, 2015
Session: Rethinking Text As "Process" in the Humanities, Digital and Non-Digital (Group 1)
Organized by: Sayan Bhattacharyya (University of Illinois, Urbana-Champaign)
The notion of a text as reified and stable artifact has long been put under interrogation in literary theory and textual studies. Philological reconstruction of lost texts from fragments, textual criticism and editorial theory, as well as actor-network theory and formalist, structuralist and poststructuralist critique, can all, in their own ways, be construed as affording the means of rethinking the text in terms of process rather than product. Recent years have witnessed developments in both theory and practice, such as posthumanism, speculative realism, new formalism, and object-oriented ontology, as well as the ever-expanding world of text expressed in multiple and new media technologies, readable distantly or closely by human or machine agency. This has created the ground for a productive dialogue between scholars and practitioners of digital humanities and non-digital humanities. This seminar intends to stage such an encounter, and it invites papers exploring how processes can be construed as underlying (and/or undermining) text, how text itself can be construed as a generative process, and how technologies, interfaces, media, modes of reading, and modes of critique may be useful for generating and/or challenging such construals. The seminar aims to be a conversation involving researchers both in the more traditional areas of the humanities and in digital humanities and new media.
Textual Intimacy in Wearable Technology
(Chelsea Adewunmi, Princeton University)
Whilst recent literature across the fields of literary studies, art history, and museology has examined clothing as a site of textual information, it has produced predominantly historicist readings that focus on clothes as fixed artefacts, to the exclusion of wearable technology - clothing that constitutes a text whose process is ever re-calibrating, anticipatory, and live. This paper focuses on the compression of text and digital information into quotidian sartorialism - wearable technology tailored to the wearer’s most intimate data (e.g. the fingerprint, the heartbeat) - in which the act of wearing becomes the act of reading. For these haptic interfaces, reading occurs reciprocally between the device and the body, an intimacy through which touch becomes text and digits are read digitally. Looking at the integration of text into clothing via modes of 1) non-digital textiles, inclusive of narrative textiles and manuscripts and letters sewn into medieval dresses, 2) wearable haptic interfaces like Ravijour’s “True Love Tester” bra, and 3) book form, from the medieval girdle book to the 21st-century wearable book “Sensory Fiction” developed at MIT and inspired by science fiction, I argue that these textual-textiles engender an intimacy that is incorporative - text taken in by the body to transformational effect. By questioning assumptions about the human body and gender inherent in the wearables’ designs, this paper ultimately examines the limits of intimacy, the dangers of exclusively quantitative reading, and the encroaching ubiquity of surveillance in the 21st century.
Free Necessity and Real Complexity: Causality in Pierre Macherey's A Theory of Literary Production
(Peter Libbey, Duquesne University)
Pierre Macherey’s early work of literary theory, A Theory of Literary Production, is typically regarded as an extension of the Althusserian project of the mid-1960s. It is not usually seen as containing any real developments of its own. Against this view, my paper argues that Macherey’s text is a theoretical endeavor in its own right. I argue that it can be plausibly read as offering, albeit in an inchoate form, a theory of causality that does not pertain only to literary texts. To articulate this theory, I examine three of Macherey’s core concepts: “free necessity,” “real complexity,” and “determinate conditions of production.” Though he discusses these concepts in terms of their roles in literature, a specific ideological domain, the theory they together seem to imply calls into question any strict separation of ideological superstructure and economic base. Consequently, the text invites us to apply the internal dynamics of the literary text to the relationship between superstructures and their “base.”
The Book as an Interface: Materiality and Process in Mexican Contemporary Poetry
(Roberto Cruz Arzabal, Universidad Nacional Autónoma de México)
The term "interface" commonly refers to the network of instructions that allows human-computer interaction; it can, however, be shifted to literary studies by considering the interaction between other elements and media. As Hayles (“Print is flat”) has argued, materiality is the interplay between the form of the text, its meaning, and reading practices. If we understand the interface as “the point of transition between different mediatic layers within any nested system” (Galloway, qtd. in Emerson, Reading Writing Interfaces x), some books can be analyzed through interface criticism: the system of organization and production of meaning is bounded by the book as a physical object, and the layers correspond to the textual, visual, and material elements that come into relationship within the literary text. In this sense, I propose to study some Mexican poetry books through their interfaces, i.e., by analyzing the interplay between these elements and reading practices: Mónica Nepote’s Hechos diversos, Jessica Díaz and Meir Lobatón’s Monografías, and Myriam Moscona’s De par en par.
Tender Buttons and Regular Expressions
(Jordan Buysse, University of Virginia)
In the introduction to a recent issue of New Literary History, Rita Felski describes the nature of current “method wars” in literary scholarship. With the advent of alternative methodologies including distant reading, algorithmic criticism, deep/surface reading, affective criticism, and even the possibility of a new “postcritical standpoint” in literary studies, critics are puzzling over what constitutes the interpretive act at this late date. I take a cue from Stephen Ramsay’s work on "algorithmic criticism" as an effort to “channel the heightened objectivity made possible by the machine into the cultivation of those heightened subjectivities necessary for critical work.” My project is an exercise in automating interpretation, in the process repositioning interpretive subjectivity along lines more amenable to Actor-Network Theory, a critical practice not engaged in critique but rather constructive (and descriptive) repositioning. My project unfolds according to two reconstitutions of Gertrude Stein’s Tender Buttons. The first is a self-conscious attempt to regularize Stein’s agrammatical style into parsable syntax. I do this using the Stanford parser, a natural-language processing tool designed to help computers identify grammatical elements in sentences. The machine parser thus creates ‘data’, which is then interrogated alongside an analysis of the ‘logic’ of prepositional phrases in Tender Buttons. Regular expressions (pattern-matching scripts that ‘read’ the text according to specific instructions) work to form an alternative engagement with the untamable topology of Stein’s text.
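To make the idea of a regular expression 'reading' a text concrete, the following is a minimal sketch (not Buysse's actual script) that matches a preposition followed by one or two words - a deliberately crude stand-in for a prepositional phrase - against the public-domain opening sentence of Tender Buttons:

```python
import re

# Opening sentence of "A Carafe, That Is a Blind Glass" (Tender Buttons,
# 1914, public domain). The pattern below is a hypothetical illustration:
# it "reads" by matching a preposition plus one or two following words.
TEXT = ("A kind in glass and a cousin, a spectacle and nothing strange "
        "a single hurt color and an arrangement in a system to pointing.")

PREPOSITIONS = r"(?:in|on|of|to|with|from)"
phrase = re.compile(rf"\b{PREPOSITIONS}\b(?:\s+\w+){{1,2}}")

matches = phrase.findall(TEXT)
for m in matches:
    print(m)
```

The crudeness is the point: the pattern happily captures non-phrases like "in glass and", which is precisely the kind of friction between mechanical instruction and Stein's syntax that the abstract describes.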
The Temporal Aspects of Processual Text
(Rita Raley, University of California, Santa Barbara)
This presentation will consider the temporality of text as process, particularly the aesthetics of degeneration, the immediacy of live mediatized presentation, and the production of works for epochal time scales (the ‘long now’). Works discussed will include Eugenio Tisselli’s codework, Degenerative, a website that is partially corrupted with each unique page visit and prompts questions about archiving, technological obsolescence, and reader responsibility; Jaromil & Jodi’s Time-Based Text, portable source code that visualizes the composition of a text and records performance time as metadata; and Christian Bök’s Xenotext, a “chemical alphabet” used to encode, in a bacterium capable of surviving extreme climates and radiation exposure, instructions that cause the microbe to excrete a protein that, once translated, forms another poem (a life form as “durable archive”). Some questions: how might we consider time in contemporary media arts in terms other than the durational? What are the relations between the experiential and the speculative? How might the discourse on the processual be thought in relation to the discourses on the artifactual and archival?
Generative vs. Reconstructive Philology: Folklore Study and Textual Criticism
(Jessica Merrill, Stanford University)
Nineteenth-century philology was a broad field which included study of the classics, the Bible, medieval texts, folklore, comparative grammar, and historical linguistics. Philologists employed similar methods across these different subjects—they sought to trace genealogical lines of development using the techniques of comparison and classification. Philologists thus approached verbal creativity as an evolutionary process; they collected variants of a text and sought to establish their relationships to each other. Within this paradigm, we can identify two diverging branches of philological study. The first, associated with the textual criticism of Karl Lachmann, focused on ancient, venerated texts, seeking to reconstruct an authorial version. The second branch can be traced to Jacob Grimm’s work on folktales and German mythology. Folklorists who deemed oral tradition to be authorless replaced the concept of an original, authorial version with an abstract, generative schema, which structured each performance of a tale. Jumping forward to recent scholarship in the digital humanities, we can find these two philological models still employed today. The original split between textual criticism and folklore study was motivated by differences between the media of print and oral transmission. Today, these methods are applied to a third, new medium of the digitized text. How, in the case of digital philology, do we account for a researcher’s decision to appeal to one model over another?
Geometries of Desire: René Girard’s Mimetic Theory as a Narrative Generation Mechanism
(Graham Sack, Columbia University)
Over the past several years, literary critics have begun researching the relationship between social networks and narrative structure, including numerous efforts to extract character networks from literary works (Elson, Dames, & McKeown, 2010; Moretti, 2011; Sack, 2011). The guiding principle behind literary network analysis is that narratives are not merely depictions of individual experience in language but are also artificial societies whose imaginary social forms can be quantified and analyzed. While such networks are illuminating, they offer at best an impoverished perspective: they are merely a trace of the underlying literary phenomenon—a forensic tool, like an X-ray, for describing plot and character after the fact. They are static snapshots of a dynamic process. My concern in this paper is to invert and, thereby, elevate the use of literary networks for understanding narrative: to utilize networks not as a descriptive tool, but rather as a generative mechanism. Instead of making networks out of narratives—by, for example, time-slicing texts and extracting dialogue—my concern is to make narratives out of networks—by simulating their dynamics. This paper focuses particularly on narrative generation based on Girard’s theory of “triangular desire,” first proposed in Deceit, Desire, and the Novel (1961). The Girardian mechanism is particularly amenable to narrative generation: (1) nodes are interpretable as characters, (2) links are interpretable as character relationships, (3) network transitions are interpretable as narrative events, (4) cascade effects produce sequences of causal events that lead from instability to stability, satisfying the requirements of narrative specified by Aristotle, Todorov, and Forster.
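The generative mechanism described above can be sketched in a few lines of code. The following toy model (a hypothetical illustration, not Sack's actual implementation; the character names and imitation rule are invented) treats characters as nodes, desires as directed links, and each mimetic transition as an emitted narrative event, iterating until the cascade stabilizes:

```python
# Toy sketch of Girardian "triangular desire" as a generative network:
# each character who has a mediator imitates that mediator's desire,
# and every imitation step is emitted as a narrative event.

def generate(desires, models, max_steps=10):
    """desires: {subject: desired object}; models: {subject: mediator
    whose desire the subject imitates}. Returns narrative events
    produced as imitation propagates until no link changes."""
    events = []
    for _ in range(max_steps):
        changed = False
        for subject, mediator in models.items():
            target = desires.get(mediator)
            if target is not None and target != subject \
                    and desires.get(subject) != target:
                desires[subject] = target  # mimetic network transition
                events.append(
                    f"{subject}, imitating {mediator}, now desires {target}.")
                changed = True
        if not changed:  # cascade has reached stability
            break
    return events

# Hypothetical cast: Bertha imitates Anna; Clara imitates Bertha.
story = generate(desires={"Anna": "the estate"},
                 models={"Bertha": "Anna", "Clara": "Bertha"})
for event in story:
    print(event)
```

Even this minimal rule exhibits the cascade structure the abstract names: a single initial desire propagates through the mediation links, and the sequence of transitions terminates in a stable (here, uniformly rivalrous) configuration.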
Towards posthumanist reading? How to do things with (mere) words from text
(Sayan Bhattacharyya, University of Illinois, Urbana-Champaign)
Distant reading typically involves computational analysis of large text corpora, but does not, by itself, preclude the possibility of close reading supplementing it. Non-consumptive reading, on the other hand, is a recent concept in the information sciences, in which consumption of the digital text as a human-readable artifact is prohibited, thereby foreclosing the possibility of close reading. Such prohibitions typically arise from restrictions on intellectual property, such as copyright laws. Functionalities for non-consumptive reading, such as those being developed at the HathiTrust Research Center (HTRC) associated with the HathiTrust Digital Library, encourage rethinking the nature and limits of humanistic inquiry. We argue that non-consumptive reading of digitized text corpora, while dictated by practical necessity, has created a fertile opportunity for theoretical reflection related to such contemporary theoretical issues as the posthuman. Theoretical discussions will be grounded in a tutorial demonstration of the practical uses of the analytical and computational services being developed by the HTRC.
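The non-consumptive idiom can be illustrated with a small sketch: the analyst receives only aggregate, page-level token counts - in the spirit of HTRC's extracted-features datasets, though the tiny counts below are invented for illustration - and computes corpus statistics without ever reconstructing a human-readable text:

```python
from collections import Counter

# Hypothetical page-level token counts: the analyst sees only these
# frequencies, never the running text, so close reading is foreclosed
# while quantitative analysis remains possible.
pages = [
    {"whale": 3, "sea": 2, "the": 12},
    {"whale": 1, "ship": 4, "the": 9},
]

totals = Counter()
for page in pages:
    totals.update(page)

corpus_size = sum(totals.values())
for token, count in totals.most_common(3):
    print(f"{token}: {count} ({count / corpus_size:.1%})")
```

Because word order is never recoverable from such counts, the format itself enforces the legal prohibition on consumption while still supporting the distant, machine-mediated modes of reading the abstract takes as its theoretical occasion.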