Algorithmic Processes: Analysis and Creation

The term “algorithm,” in most individuals’ minds, likely conjures up images of computers, programming languages, or perhaps database queries. As I addressed in class, however, a more common understanding of the term is, simply put, a set of instructions. The OED defines an algorithm as “a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer” (emphasis on “especially” mine). While the OED incorporates the word ‘computer’ into its definition, it states only that this process is ‘especially’ true of computers, not that it excludes other problem-solving apparatus, like the human brain. Although a computer may be superior at mindless calculation and at locating specific data, a human may be superior at drawing connections between abstract ideas through sheer familiarity with a literary body, or even through “eureka!” moments. Both computers and humans, then, are capable of following a process in order to analyze, or even to create something novel and thought-provoking. If these assertions hold true, then there is an obvious reason to utilize all the resources available to us, both computers and our own problem-solving capabilities, in any analysis and, most relevantly to this discussion, in literary criticism.

Statistics, although a branch of mathematics, can be applied in any number of creative and imaginative ways in the humanities and in literary criticism; for example, by investigating the frequencies of certain words used by an author to reveal trends. While the incorporation of statistical mathematics for the purposes of enhancing one’s argumentation may not seem altogether unreasonable, there is a division between those who utilize “algorithmic” analyses and those who do not. More broadly speaking, there is a division between a scientific approach and an artistic approach to any analysis. Ramsay’s goal in Reading Machines is to incorporate these two disparate styles of analysis into one discipline. In his discussion at the beginning of Part II, “Potential Literature,” Ramsay weighs the two sides of the argument and presents a compromise between them. This compromise becomes evident in statements such as, “The leap from frequencies to meanings must always be a risky one,” and “Lower-level features are easy to count but impossible to interpret in terms of style; counting images or other high-level structures brings problems of excessive intervention at the categorization stage, and thus unreliability and circularity” (Ramsay 19). Interpreting “lower-level” as, for example, the frequencies of words, and “high-level” as the discourse on the literary effects of those words, an evident struggle emerges between remaining grounded in one or the other. The scientific or algorithmic approach, while better suited to the “lower-level” style of analysis, provides a foundation upon which the artistic or discourse-based approach may be built. To use one to the exclusion of the other seems only to detract from the potential of an analysis, whereas utilizing both ought to enhance it.
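To make this “lower-level” counting concrete, a minimal sketch of a word-frequency tally might look like the following; the sample sentence is a placeholder of my own, not a passage from Ramsay or from any particular author.

```python
from collections import Counter
import re

def word_frequencies(text: str) -> Counter:
    """Count word occurrences, ignoring case and punctuation."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

# Placeholder text; any author's corpus could stand in here.
sample = "The sea was the sea, and the sea was all he ever wrote about."

# The most frequent words: raw 'lower-level' data that the critic must still interpret.
for word, count in word_frequencies(sample).most_common(5):
    print(f"{word}: {count}")
```

The counts themselves are trivially easy to produce; the “risky leap” Ramsay describes lies entirely in what the critic then says about them.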

Ramsay, continuing from these comments by Van Peer, summarizes this struggle by stating that “Without grounding in the language game of denotation, we risk the ‘circular reasoning’ of a discourse that grounds itself in further discourse – the ‘unreliability’ of a claim that has nothing to recommend it but its rhetorical power to persuade” (19). Apparently taking the side of the “lower-level” analysis, Ramsay explains that the scientific and artistic approaches each have their own necessity in literary criticism. Denotation, however, is rarely the primary focus of any literary critique; the literal interpretation typically only underscores the literary interpretations to be extracted or created. While it is difficult to find the middle ground between discussing primary and secondary sources, the “artistic” approach, if it may be stated so, tends to fall on the side of discourse about discourse, or discourse about discourse about discourse, and so on, eventually straying from the original text altogether. This comment, while not meant to enrage or belittle, rings true for many discussions in which a literary critic creates an analysis of a text based on the analysis of another critic, who in turn perhaps analyzed the work according to Barthes, Lacan, Derrida, or any other prominent literary theorist of your choosing. At what point does the discourse become so far removed from the content and denotations of the original text that it becomes “circular reasoning,” exactly as Ramsay states? This extreme position on the “arts” approach, if it may be stated as such, should not be held as accurately describing the entirety of literary criticism, only as a position against which we may pit ourselves if we stand on the other side of the fence.

Ramsay presents instances where computers are outright better suited to the reading task than human beings. Consider, for example, his discussion of Queneau’s Cent Mille Milliards de poèmes (100,000,000,000,000 Poems) (25). Given the calculations provided, which state that “a person reading the book for twenty-four hours a day would take 190,258,751 years to finish it” (26), no single human could possibly complete such a reading task. Using this example, Ramsay implies, without directly stating, that a computer would be better suited to the task than a human. Assuming this implication is not totally unreasonable, we are able to continue this line of reasoning to a more controversial topic than computer-assisted analysis – computer poetry.

The Mathews algorithm, which “remaps the data structure of a set of linguistic units (letters of words, lines of poems, paragraphs of novels) into a two-dimensional tabular array” (29), exemplifies the possibilities of creative algorithmic application to language and literature. Ramsay’s comment on this algorithm when applied to a 4×4 letter array, that “These maneuvers thus create a serendipitous morphology – an instantiation of the phonemic potentiality of ordinary words” (30), suggests that linguistic creativity is not a capacity unique to the human mind, but can be performed (or at least emulated) by computers following the instructions of algorithms as well. The way in which Ramsay discusses this “morphology” and these “phonemic possibilities” implies that algorithms are able to rearrange abstract representations of sound into word structures, which is perhaps very valuable in understanding human cognition through language manipulation. The way in which human minds reorganize sounds into permissible word-forms, according to the phonological constraints of their language, seems not unlike the algorithm discussed here. The argument, simply stated, is that computers and human minds are both algorithmically driven, that their capabilities are similar, and that in analysis as in creation, both are well suited to the task.
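Both of these examples can be sketched in a few lines of code. The arithmetic below assumes, as Queneau’s own preface does, one minute to read each sonnet and a 365-day year; the four seed words in the Mathews-style remapping are hypothetical placeholders of my own, not Ramsay’s, and the row-rotation is a simplified version of the tabular maneuver he describes.

```python
# Part 1: the combinatorics of Queneau's Cent Mille Milliards de poèmes.
# Ten base sonnets, fourteen lines each, and every line freely interchangeable.
poems = 10 ** 14  # 10 choices per line, across 14 lines

# Assumption (following Queneau's preface): one minute per sonnet,
# read twenty-four hours a day, 365 days a year.
minutes_per_year = 60 * 24 * 365
print(f"{int(poems / minutes_per_year):,} years")  # -> 190,258,751 years (cf. Ramsay 26)

# Part 2: a simplified Mathews-style remapping of a 4x4 letter array.
# Hypothetical four-letter seed words form the rows; row i is rotated left
# by i positions, and the columns are then read downward to produce new,
# 'serendipitous' letter sequences.
words = ["tale", "ring", "sand", "mope"]  # placeholder seeds, not Ramsay's
rows = [w[i:] + w[:i] for i, w in enumerate(words)]            # rotate row i by i
columns = ["".join(row[j] for row in rows) for j in range(4)]  # read columns down
print(columns)  # -> ['tine', 'andm', 'lgso', 'erap']
```

The column readings (“tine,” “andm,” and so on) are exactly the sort of serendipitous, sometimes-pronounceable sequences Ramsay has in mind, though the actual Mathews algorithm admits more elaborate shifts and readings than this simplified rotation.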


Discussion Questions:

1) Is computer poetry, which follows an algorithmic process, like or unlike the kind of poetry a human being might write?

2) I have made reference to the idea that human cognition can be understood through the ways in which we interpret, control, and manipulate language. Can you think of any words, phrases, sentences, or other examples that might support this assertion?

3) I have made the claim that human brains and computers are alike in that both are algorithmically driven. Do you agree? Why or why not?

4) Should literal or literary interpretations be the more prominent focus of English discussion and discourse?


One Response to Algorithmic Processes: Analysis and Creation

  1. hertz says:

    Hi Kirk:

    I find your comparison of human and computer-generated texts compelling, and feel it articulates many of the issues raised in Ramsay’s book. Your discussion of Queneau’s “Cent Mille Milliards de poèmes” caught my attention, as it parallels my examination of the printed page as a unit of measurement and container for meaning–which, if you’re curious, you can read about here (artistic photos of Queneau’s book included):

    http://amillionatomsofsoftblue.wordpress.com/2013/02/15/pages/

    In this sense, I wonder if the disjunction between human and computer-generated texts has less to do with the verbal elements themselves, than with the material texts in which they are embedded. Interestingly, Queneau’s book contains an epigraph from Alan Mathison Turing which reads, “Seule une machine peut apprécier un sonnet écrit par une autre machine” or “only a machine can appreciate a sonnet written by another machine.” This relates to your first question (and some of Ramsay’s own poetry on 26-7), in that computer-generated poetry demands a redefinition of ‘writing’: when we read Queneau’s book, aren’t we pretty much re-writing his text? This, of course, relates to Ramsay’s concept of reading as deformation–which Jonathan discussed in his topical presentation.

    Further, I wonder at Ramsay’s division of language into lower-level and higher-level linguistic phenomena: doesn’t this place literary criticism as parasitic upon “brute facts” or scientific discourse? Again, there is a hierarchy of disciplines pervading this book that–try as he might–Ramsay cannot avoid. Is literary criticism really just “discourse about discourse,” or can the same be said of scientific theories, which build–like cells to tissues to organs to systems–one upon the other? Doesn’t literary criticism always necessitate a ‘return to the text’ just as scientific methodology must be based on data? What this really comes down to, I think, is the distinction between words and numbers: where do language and mathematics overlap?

    I am especially intrigued by your second question, and wonder if you might provide an example. Are you referring here to assertive, directive, commissive, expressive, and performative utterances–or some other type of linguistic phenomena? I particularly like your discussion of cognition, and the ways in which humans are capable of reordering language into permissible word-forms according to certain phonological constraints. In that manner, I would certainly agree that cognition is algorithmic.
