Derrida’s Margins

Derrida’s Margins is a remarkable project: a digital unpacking of Jacques Derrida’s library, starting with De la grammatologie.

[Screenshots of the Derrida’s Margins site]

It’s an astonishing resource — I’m going to spend hours and hours here. Ninety years ago, John Livingston Lowes published a book called The Road to Xanadu: A Study in the Ways of the Imagination, which purported to track down all the reading that had gone into the writing of Coleridge’s great poems “The Rime of the Ancient Mariner” and “Kubla Khan.” It was, and remains, a remarkable work of scholarship, though necessarily highly (and sometimes dubiously) speculative. What we have in Derrida’s Margins is a kind of digital extension of Lowes’s project — and that’s fitting, because Derrida is in some ways the Coleridge of modern literary theory: enormously learned, quirkily imaginative, sensitive to strange connectors and attractors.

And I am especially pleased that the technical lead on this project is Rebecca Sutton Koeser, a former student of mine! Even as an undergrad Rebecca was equally interested in literary study and computer science; she went on to get her PhD in English from Emory, and has continued to straddle these two worlds. Here’s her account of what she learned in working on this project. 

Computational Humanities and Critical Tact

Natalie Kane writes,

When we talk about algorithms, we never talk about the person, we talk about the systems they operate on, the systems they facilitate, and then the eventual consequences that fall when they supposedly ‘malfunction’. Throwing this out into twitter, a friend and talented creative technologist Dan Williams succinctly reminded me that we should always be replacing the word ‘algorithms’ with ‘a set of instructions written by someone.’ They are made, and written, by human beings, just like all technology is (a general statement, I know, but it all began with us). By removing that element when talking about them we cease to have any understanding of the culture and social systems behind it, which arguably are the reasons why algorithms are having such a huge impact on our lives. If we’re going to have constructive conversations about the cultural and societal impact of algorithmically mediated culture, we have to always remember that there are humans in it, and not just at the other end of the black box feeling the vibrations. As Matthew Plummer-Fernandez pointed out in a panel on Data Doubles this year, the problems with confronting complex computation systems demands a socio-technical, not a purely technical solution.

This reminds me of a recent post I wrote about Andrew Piper’s current project and other projects in the digital (or computational) humanities. I’ve been thinking a lot about how to evaluate projects such as Andrew’s, and Kane’s post made me realize the extent to which I default to thinking about outcomes, results, answers — and not so much about the originating questions, the “set of instructions written by someone.”

Twenty years ago, Mark Edmundson commented that “Paul de Man is a more difficult critic to read and understand than a responsive, rather impressionistic essayist like Virginia Woolf. But one can teach intelligent students to do what de Man does; it is probably impossible to teach anyone to respond like Woolf if he has little aptitude for it.” Edmundson goes on to ask, “And how could we sustain the academic study of literature if we were compelled to say that a central aspect of criticism is probably not teachable?” — but while that’s an important question it’s not the one I want to pursue right now.

Rather, I’d like to focus on the indefinable skill he’s referring to here, which is, I think, what some people used to call critical tact — the knack for perceiving what themes or questions will, upon pursuit, yield fruitful thoughts. It seems to me that this tact is no less valuable in computational humanities work than in any other kind of critical work, and that we would do well to think about what the best questions look like, what shape or form they tend to take.

Edmundson is probably right to suggest that critical tact can’t be taught, but it can nevertheless be learned (no one is born with it). And maybe one way people in DH could learn it is by spending a lot of time thinking about and debating this: Who asks the best questions? Who writes the most provocative and fruitful instructions?

Two kinds of critics in the world….

I’m very interested in Andrew Piper’s new work on the “conversional novel”:

My approach consisted of creating measures that tried to identify the degree of lexical binariness within a given text across two different, yet related dimensions. Drawing on the archetype of narrative conversion in Augustine’s Confessions, my belief was that “conversion” was something performed through lexical change — profound personal transformation required new ways of speaking. So my first measure looks at the degree of difference between the language of the first and second halves of a text and the second looks at the relative difference within those halves to each other (how much more heterogenous one half is than another). As I show, this does very well at accounting for Augustine’s source text and it interestingly also highlights the ways in which novels appear to be far more binarily structured than autobiographies over the same time period. Counter to my initial assumptions, the ostensibly true narrative of a life exhibits a greater degree of narrative continuity than its fictional counterpart (even when we take into account factors such as point of view, length, time period, and gender).
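
Piper’s two measures are concrete enough to paraphrase in code. Here is a minimal sketch in Python of what such a “lexical binariness” profile might look like: bag-of-words halves compared by cosine distance, with within-half heterogeneity estimated over equal-sized chunks. The tokenization, the distance metric, and the chunking scheme are all my assumptions for the sketch, not Piper’s actual implementation.

```python
from collections import Counter
from itertools import combinations
import math

def bag_of_words(text):
    """Lowercased token counts; a crude stand-in for real preprocessing."""
    return Counter(text.lower().split())

def cosine_distance(a, b):
    """1 minus cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 1.0
    return 1.0 - dot / (norm_a * norm_b)

def within_half_heterogeneity(words, n_chunks=10):
    """Mean pairwise distance among equal-sized chunks of one half."""
    size = max(1, len(words) // n_chunks)
    chunks = [bag_of_words(" ".join(words[i:i + size]))
              for i in range(0, len(words), size)]
    pairs = list(combinations(chunks, 2))
    if not pairs:
        return 0.0
    return sum(cosine_distance(a, b) for a, b in pairs) / len(pairs)

def binariness_profile(text):
    """Piper-style measures: cross-half distance plus within-half variety."""
    words = text.split()
    mid = len(words) // 2
    first, second = words[:mid], words[mid:]
    return {
        "cross_half_distance": cosine_distance(
            bag_of_words(" ".join(first)), bag_of_words(" ".join(second))),
        "first_half_heterogeneity": within_half_heterogeneity(first),
        "second_half_heterogeneity": within_half_heterogeneity(second),
    }
```

On this toy scheme, a strongly “conversional” text would show a large cross-half distance, and comparing the two heterogeneity figures captures Piper’s second measure, how much more varied one half is than the other.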

Now — and here’s the really noteworthy part — Piper defines “conversional” largely in terms of lexical binaries. So a novel in which a religious conversion takes place — e.g., Heidi — is no more conversional than a novel whose pivot is not religious but rather involves a move from nature to culture or vice versa — e.g., White Fang.

As I say, I’m interested, but I wonder whether Piper isn’t operating here at a level of generality too high to be genuinely useful. Computational humanities in this vein strikes me as a continuation — a very close continuation — of the Russian formalist tradition: Vladimir Propp’s Morphology of the Folktale — especially the section on the 31 “functions” of the folk tale — seems to be a clear predecessor here, though perhaps the more radical simplifications of Greimas’s semiotic square are an even more direct model.

Two thoughts come to my mind at this point:

  1. I would love to see what would happen if some intrepid computational humanists got to work on filtering some large existing corpus of texts through the categories posited by Propp, Greimas, et al. (a toy sketch of what this might look like follows this list). Were the ideas that emerged from that tradition arbitrary and factitious, or did they draw on genuine, though empirically unsupported, insights into how stories tend to work?
  2. A good many people these days emphasize the differences between traditional and computational humanists, but the linkage I have just traced between 20th-century formalists and Piper’s current work makes me wonder if the more important distinction isn’t Darwin’s opposition of lumpers and splitters — those who, like Piper in this project, are primarily interested in what links many works to one another versus those who prefer to emphasize the haecceity of an individual thing … which of course takes us back to the ancient universals/particulars debate, but still….
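
To make the first suggestion concrete: one naive way to begin would be to encode each Propp function as a detector and reduce every tale to an ordered sequence of function labels, which could then be compared across a corpus. A rough Python sketch follows; the cue phrases are invented placeholders (a serious project would need hand annotation or trained classifiers), and only a handful of the 31 functions are stubbed in.

```python
import re

# Toy stand-ins for a few of Propp's 31 functions. The cue phrases are
# invented placeholders, just to show the shape of the experiment.
PROPP_FUNCTIONS = {
    "absentation": ["left home", "went away", "departed"],
    "interdiction": ["do not open", "forbade", "must never"],
    "violation": ["disobeyed", "opened the door"],
    "villainy": ["stole", "kidnapped", "cursed"],
    "return": ["returned home", "came back"],
}

def function_sequence(text):
    """Reduce a tale to the ordered list of Propp functions it triggers."""
    hits = []
    lowered = text.lower()
    for label, cues in PROPP_FUNCTIONS.items():
        for cue in cues:
            for match in re.finditer(re.escape(cue), lowered):
                hits.append((match.start(), label))
    sequence = []
    for _, label in sorted(hits):  # order by position in the text
        if not sequence or sequence[-1] != label:  # drop repeats
            sequence.append(label)
    return sequence

def function_overlap(seq_a, seq_b):
    """Crude Jaccard similarity on the sets of functions two tales share."""
    set_a, set_b = set(seq_a), set(seq_b)
    if not set_a and not set_b:
        return 1.0
    return len(set_a & set_b) / len(set_a | set_b)
```

Run over a large corpus, the distribution of these sequences would give at least a first, falsifiable test of whether the functions recur in the order Propp’s morphology predicts.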

UPDATE: via Ted Underwood on Twitter, I see some people are already moving towards Propp-inspired ways of doing DH: see here and here.

Four years after the original Nature paper was published, Nature News had sad tidings to convey: the latest flu outbreak had claimed an unexpected victim: Google Flu Trends. After reliably providing a swift and accurate account of flu outbreaks for several winters, the theory-free, data-rich model had lost its nose for where flu was going. Google’s model pointed to a severe outbreak but when the slow-and-steady data from the CDC arrived, they showed that Google’s estimates of the spread of flu-like illnesses were overstated by almost a factor of two.

The problem was that Google did not know – could not begin to know – what linked the search terms with the spread of flu. Google’s engineers weren’t trying to figure out what caused what. They were merely finding statistical patterns in the data. They cared about correlation rather than causation. This is common in big data analysis. Figuring out what causes what is hard (impossible, some say). Figuring out what is correlated with what is much cheaper and easier. That is why, according to Viktor Mayer-Schönberger and Kenneth Cukier’s book, Big Data, “causality won’t be discarded, but it is being knocked off its pedestal as the primary fountain of meaning”.

But a theory-free analysis of mere correlations is inevitably fragile. If you have no idea what is behind a correlation, you have no idea what might cause that correlation to break down. One explanation of the Flu Trends failure is that the news was full of scary stories about flu in December 2012 and that these stories provoked internet searches by people who were healthy. Another possible explanation is that Google’s own search algorithm moved the goalposts when it began automatically suggesting diagnoses when people entered medical symptoms.

Google Flu Trends will bounce back, recalibrated with fresh data – and rightly so. There are many reasons to be excited about the broader opportunities offered to us by the ease with which we can gather and analyse vast data sets. But unless we learn the lessons of this episode, we will find ourselves repeating it.

Big data: are we making a big mistake? – FT.com. Fantastic essay by Tim Harford. I have a feeling that again and again the big data firms will insist that the data can “speak for itself” — simply because data gathering is what they can do.
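
Harford’s point about fragility is easy to make concrete. The toy simulation below (my own illustration, not the actual Flu Trends model) fits a purely correlational predictor on data where a hidden confounder, media hype, links searches to flu; when the hype later spikes on its own, the prediction overshoots by roughly the factor of two that Harford describes.

```python
import random

random.seed(0)

def weekly_searches(flu, media_hype, noise=0.1):
    """Search volume driven partly by flu itself, partly by flu news coverage."""
    return 1.0 * flu + 0.8 * media_hype + random.gauss(0, noise)

# Training period: coverage tracks actual flu, so searches correlate
# almost perfectly with flu and a one-variable fit looks excellent.
train_flu = [random.uniform(1, 5) for _ in range(200)]
train_searches = [weekly_searches(f, media_hype=0.5 * f) for f in train_flu]

# Ordinary least squares through the origin: flu ~ searches.
beta = (sum(f * s for f, s in zip(train_flu, train_searches))
        / sum(s * s for s in train_searches))

# Test week: a news scare sends healthy people to the search box.
actual_flu = 2.0
searches = weekly_searches(actual_flu, media_hype=4.0)
print(f"actual flu level:    {actual_flu:.2f}")
print(f"predicted flu level: {beta * searches:.2f}")  # nearly twice too high
```

Nothing in the fitted model knows why searches tracked flu, so nothing in it can notice when the reason changes.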

There should be a lesson here also for those who believe that Franco Moretti–style “distant reading” can transform the humanities. The success of that endeavor too will be determined not by text-mining power but by the incisiveness of the questions asked of the data and the shrewdness of the conclusions drawn from it. What T. S. Eliot said long ago — “The only method is to be very intelligent” — remains just as true in an age of Big Data as it was when Eliot uttered it.

So Philology: to preserve, monitor, investigate, and augment our cultural inheritance, including the various material means by which it has been realized and transmitted. The scholar Aleida Assmann recently said of the philologian’s archive (as opposed to the critic’s and scholar’s canon) that it is populated by materials that “have lost their immediate addresses; they are decontextualized and disconnected from the former frames which had authorized them or determined their meaning. As part of the archive, they are open to new contexts and lend themselves to new interpretations.” So we may arrive at “counter-memories” and “history against the grain.” But in order to lie so open, they must be held at a level of attention that is consciously marked as useless and nonsignificant. There need be no perceived use or exchange value in the philological move to preserve beyond the act of preservation itself. Value is not only beyond present conception, it is understood that it may never again acquire perceived value. “Never again” is crucial. For the philologian, materials are preserved because their simple existence testifies that they once had value, though what that was we may not — may never — know. As the little girl in Wordsworth’s “We are Seven” understood, the dead are precious as such. If we are living human lives, they are not even dead.

I’ve increasingly felt that digital journalism and digital humanities are kindred spirits, and that more commerce between the two could be mutually beneficial. That sentiment was confirmed by the extremely positive reaction on Twitter to a brief comment I made on the launch of Knight-Mozilla OpenNews, including from Jon Christensen (of the Bill Lane Center for the American West at Stanford, and formerly a journalist), Shana Kimball (MPublishing, University of Michigan), Tim Carmody (Wired), and Jenna Wortham (New York Times).

Here’s an outline of some of the main areas where digital journalism and digital humanities could profitably collaborate. It’s remarkable, upon reflection, how much overlap there now is, and I suspect these areas will only grow in common importance.

Dan Cohen’s Digital Humanities Blog: Digital Journalism and Digital Humanities. Dan’s list of points of convergence is extremely smart and extremely provocative.

I was just struck by this in recent discussions about instituting a no laptop policy in the classroom. It was so self-evident that maybe it’s just obvious, but I hadn’t thought about it in quite this way before. If you want to teach and run your class as you did in the nineties (or as others did in the nineties), then it makes sense to institute a no laptop policy (and obviously a no phone policy) in your classroom. Basically, in order to adopt those practices one has to recreate the technological-material conditions of the period. Now, as it turns out, those conditions in the nineties were much like the conditions throughout the 20th century, at least for humanities classes. The key materials of desks, chairs, lights, books, paper, pens, chalk, etc. had been functionally the same for decades. It’s a little naive, but still understandable, that one would misrecognize these historical conditions for absolute ones.

digital digs: the no laptop policy and the 1990s classroom. Wrong, wrong, wrong. If I tell my students to put away their laptops while we’re in class, I am not telling them to repudiate laptops, or other technologies, altogether. My policy in recent years has been to press my students to experiment with a wide range of digital technologies in their research and writing, and in many cases to encourage them to use those technologies collaboratively — but when we’re in class, that’s book time. During the handful of hours that we’re together each week, the best use of that time, I think, is to work diligently with the technology of the book. Usually that means codices, but if it’s Kindles or Nooks, that’s fine too. But class time is book time; other times are for other technologies. Why assume so narrow a pedagogy that you only allow yourself and your students to use instruments that are just a few years old?