Ripples on Water: ‘primacy of pattern’ in Woolf’s novel, “The Waves”

In this (traditional) presentation I will discuss Ramsay’s algorithmic analysis of Virginia Woolf’s novel, The Waves. In so doing, I will argue that traditional literary criticism is not so different from scientific inquiry. While critics like Jonathan Gottschall assert that “literary studies should become more like the sciences”, both disciplines already revolve around the ‘primacy of pattern’: what Stephen Ramsay identifies as “the basic hermeneutical function that unites art, science, and criticism” (4; xi). When we analyze a text, we search for frequencies: whether of themes, tropes, or individual words. In a critical paper, we marshal those frequencies as evidence of overarching trends or patterns—extrapolating about the use of words in a text, and moving from individual citations to “the grander rhetorical formations that constitute critical reading” (Ramsay 17). Pattern-making, in other words, is where paper-based literary criticism and computational analysis overlap, since “critical reading practices already contain elements of the algorithmic” (Ramsay 16).

As literary critics, our brains are finely honed text analysis tools: “We read out of order, we translate and paraphrase, we look only at certain words or certain constellations” (Ramsay 48). By adopting a theoretical standpoint, we select a set of meanings from a field of possible meanings, just like an algorithm. As such, literary criticism is algorithmic in that it is a process by which we apply certain limits or constraints to our analyses. Where the paper-based literary critic searches manually for water imagery in The Waves, the computer-based critic uses an algorithm capable of identifying every instance of the word “water” in Woolf’s novel. With the application of text analysis tools, tables of frequencies, charts, and graphs generated by the algorithm replace marginal notes generated by the paper-based literary critic.
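To make the contrast concrete, here is a minimal sketch of what “identifying every instance of the word ‘water’” amounts to computationally. This is a generic illustration, not the implementation of TAPoR or any particular tool, and the passage below is a made-up stand-in for Woolf’s actual text:

```python
import re
from collections import Counter

def word_frequencies(text):
    """Lowercase the text, split it into words, and tally each one."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

# A short invented passage standing in for the full text of The Waves.
passage = (
    "The waves broke on the shore. The water rose and the water fell, "
    "and the light of the waves moved across the water."
)

freq = word_frequencies(passage)
print(freq["water"])         # → 3: every instance of "water" in the passage
print(freq.most_common(3))   # the algorithm's version of marginal notes
```

The table of frequencies this produces is exactly the kind of output that replaces the paper-based critic’s marginal notes: exhaustive, but still awaiting interpretation.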

Thus computer-based criticism is not so different from paper-based criticism in that both involve the identification of patterns and the application of rules or constraints. For all intents and purposes, the computer just speeds up the process: it “revolutionizes, not because it proposes an alternative to the basic hermeneutic procedure, but because it reimagines that procedure at new scales, with new speeds, and among new sets of conditions” (Ramsay 31). Using TAPoR to find water imagery in The Waves is basically the same thing as using a concordance, but “at a different scale with expanded powers of observation” (Ramsay 17).

Yet if computers can answer all of our questions, what makes literary studies ‘special’ or distinct from scientific inquiry? If criticism is, by nature, ambiguous, how do we quantify our qualitative evidence? Should we? The problem is with analyses like Miriam Wallace’s:

There is no experiment that can verify the idea that Woolf’s ‘playful formal style’ reformulates subjectivity or that her ‘elision of corporeal materiality’ exceeds the dominant Western subject. There is no control group that can contain ‘current feminist reconfigurations.’ (Ramsay 7)

Algorithmic criticism, then, cannot turn literary criticism into a science—nor should it seek to do so. Instead, it advocates the use of scientific method to unleash the latent potentialities of a text, and takes already existing hermeneutic practices to new heights and speeds. Text analysis tools like Voyeur, HyperPo, WordHoard, MONK, and TAPoR are no different from psychoanalytic or Marxist approaches: they allow for alternate perspectives on canonical works, and enable us to test old theories and new hypotheses (they can also reveal a lot about your own writing; try plugging your blog summary into HyperPo). By running Woolf’s novel through TAPoR, we are not trying to solve Woolf: we are “trying to ensure that discussion of The Waves continues” (Ramsay 15).

The Discussion, continued:

❧  Is art quantifiable? Does algorithmic criticism implicitly favour quantitative over qualitative evidence and quantity over quality?

❧  Is there a risk in non-reading and will this trigger a “slow-reading” movement?

❧  If you could design a text analysis tool to answer any question about any literary work, what would it be?

About Sarah Hertz

Alone and young and wilful and wildhearted, alone amid a waste of wild air and brackish waters and the seaharvest of shells and tangle and veiled grey sunlight.

One Response to Ripples on Water: ‘primacy of pattern’ in Woolf’s novel, “The Waves”

  1. grantkanigan says:

    As Sarah stated in her presentation summary, the algorithmic approach to literature that Ramsay suggests merely “speeds up the process” of criticizing texts. In my own reading of Ramsay’s text, it also seems as though the computational tools one can apply to texts get at the “brute facts” of a text. When Ramsay shows that one of the top words for a character in The Waves is “Nile,” that reveals the character’s preoccupation with the Egyptian river; this is a brute fact within Woolf’s text. However, there doesn’t seem to be any tool or algorithm that can suggest why Woolf’s character is so obsessed with the Nile. That is for the seasoned critic to determine using the brute facts of the text: essentially, critics must ‘connect the dots.’ Ramsay acknowledges as much: “few readers of The Waves would fail to see some emergence of pattern on this list. Many have noted that Louis seems obsessed with the Nile” (Ramsay 18). Ramsay is aware that some facts of the text are fairly obvious, but he then modifies the tf-idf equation to find out what a common theme for the men in The Waves is, and has an answer immediately. This is a revolutionary tool: in seconds one can uncover the facts of a text, so long as one can decide which aspect of the text to feed into the tf-idf equation.
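For readers curious what tf-idf actually computes, here is a minimal sketch of the standard textbook formulation (not Ramsay’s modified version): a word scores highly for one speaker when that speaker uses it often while the other speakers rarely do, which is why a word like “Nile” surfaces for Louis. The word lists below are invented stand-ins, not Woolf’s text:

```python
import math
from collections import Counter

def tf_idf(word, doc_words, all_docs):
    """Term frequency in one document times inverse document frequency
    across all documents (standard textbook tf-idf)."""
    tf = Counter(doc_words)[word] / len(doc_words)
    docs_with_word = sum(1 for d in all_docs if word in d)
    idf = math.log(len(all_docs) / docs_with_word)
    return tf * idf

# Toy stand-ins for each character's monologues.
louis = "the nile the nile the desert the nile".split()
bernard = "the story the phrase the story".split()
neville = "the poem the line the poem".split()
docs = [louis, bernard, neville]

# "nile" appears only in Louis's words, so it scores high for him;
# "the" appears in everyone's words, so its idf (and score) is zero.
print(tf_idf("nile", louis, docs))
print(tf_idf("the", louis, docs))  # → 0.0
```

The weighting automatically discounts words everyone shares, leaving behind exactly the speaker-specific “brute facts” the critic must then interpret.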
    Overall, the ability to use algorithmic tools to analyze a text, together with the scholarly knowledge to connect the dots of the resulting data, are both invaluable. Many state that ‘art is subjective’ and can therefore be interpreted in virtually any way. This is somewhat true, but as we discussed in class, some interpretations are more valid than others. Ramsay’s algorithmic criticism gives evidence to support certain interpretations of the text, and his methods give the critic 100% of the (quantitative) text data. Applied carefully, an algorithm can also pick up every occurrence of a pattern, reducing the possibility of a misreading.

    Is art quantifiable? Does algorithmic criticism implicitly favour quantitative over qualitative evidence and quantity over quality?

    Yes, art is quantifiable; the patterns that emerge from repetition, or from particular uses of a word, illuminate how that word is meant to be interpreted. I think algorithmic criticism does favour quantitative evidence, and relying on it alone would mean missing many aspects of a text. A series of similar words with the same meaning, or a single word used only once at the end of a text, could sum up the ideas of the entire work, and algorithmic criticism might fail to see that. For a text to be fully explored, one must look at both its qualitative and quantitative aspects. I’m not sure algorithmic criticism is yet at a point where it can find all of the meanings behind a text; that is where the subjective eye of an informed reader is important, working in unison with the textual data/brute facts.

    Is there a risk in non-reading and will this trigger a “slow-reading” movement?

    Yes, there is a risk. As I stated above, for a text to be fully explored, one has to use one’s own scholarly knowledge in unison with algorithmic data in order to read it adequately.
    As for a slow-reading movement, I think it would do just the opposite: if people can easily find patterns within a text, they might be tempted to speed-read or skim it, then use algorithms to check whether they missed anything.

    If you could design a text analysis tool to answer any question about any literary work, what would it be?

    Something that could pick up themes or a series of ideas within a text, not just individual words: for example, something that would explain why Louis in Woolf’s novel is obsessed with Egypt.
