...

Tag: humanities

reasons for decline

Alex Reid

From a national perspective, the number of people earning communications degrees (which was negligible in the heyday of English majors 50-60 years ago) surpassed the number getting English degrees around 20 years ago. Since then Communications has held a fairly steady share of graduates as the college population grew, while English has lost its share and in recent years even shrank in total number, as this NCES table records. In short, students voted with their feet and, for the most part, they aren’t interested in the curricular experience English has to offer (i.e. read books, talk about books, write essays about books).

Scott Alexander

Peterson is very conscious of his role as just another backwater stop on the railroad line of Western Culture. His favorite citations are Jung and Nietzsche, but he also likes name-dropping Dostoevsky, Plato, Solzhenitsyn, Milton, and Goethe. He interprets all of them as part of this grand project of determining how to live well, how to deal with the misery of existence and transmute it into something holy.

And on the one hand, of course they are. This is what every humanities scholar has been saying for centuries when asked to defend their intellectual turf. “The arts and humanities are there to teach you the meaning of life and how to live.” On the other hand, I’ve been in humanities classes. Dozens of them, really. They were never about that. They were about “explain how the depiction of whaling in Moby Dick sheds light on the economic transformations of the 19th century, giving three examples from the text. Ten pages, single spaced.” 

So maybe — just maybe — it’s not “read books, talk about books, write essays about books” that’s the problem. 

(Cross-posted at Text Patterns) 

speaking truth to the professoriate

I’m probably going to write more about Scott Alexander’s take on Jordan Peterson, but for now I just want to note that he’s absolutely right to say that Peterson is playing a role that (a) most of us who teach the humanities say we already fill but (b) few of us care about at all:

Peterson is very conscious of his role as just another backwater stop on the railroad line of Western Culture. His favorite citations are Jung and Nietzsche, but he also likes name-dropping Dostoevsky, Plato, Solzhenitsyn, Milton, and Goethe. He interprets all of them as part of this grand project of determining how to live well, how to deal with the misery of existence and transmute it into something holy.

And on the one hand, of course they are. This is what every humanities scholar has been saying for centuries when asked to defend their intellectual turf. “The arts and humanities are there to teach you the meaning of life and how to live.” On the other hand, I’ve been in humanities classes. Dozens of them, really. They were never about that. They were about “explain how the depiction of whaling in Moby Dick sheds light on the economic transformations of the 19th century, giving three examples from the text. Ten pages, single spaced.” And maybe this isn’t totally disconnected from the question of how to live. Maybe being able to understand this kind of thing is a necessary part of being able to get anything out of the books at all.

But just like all the other cliches, somehow Peterson does this better than anyone else. When he talks about the Great Works, you understand, on a deep level, that they really are about how to live. You feel grateful and even humbled to be the recipient of several thousand years of brilliant minds working on this problem and writing down their results. You understand why this is all such a Big Deal.

“Shakespeareomics”

Towards the end of last year I saw Benedict Cumberbatch playing Hamlet and then Justin Kurzel’s fantastic film version of Macbeth. In chatting about the performances with literary friends afterwards, I was struck by the similarities in research methodology employed in the humanities and science. Literary scholars dissect texts seeking clues which they reinforce with meta-data coming from biographical details of the author and historical facts about contemporary knowledge and beliefs.

“Shakespeareomics” – how scientists are unlocking the secrets of the Bard. Ah. So that’s what we do.

in which I sum up my posts on the recent controversies in academia

I have been trying for a while now, and in multiple locations, to articulate an argument about recent modes of student disaffection in American universities. I think there is a bright, strong thread linking the “trigger warning” debates of last year with the student protests of this year. In an ideal world I’d turn these thoughts into a short book, or at least a very long article, but for now I’m just going to have to link the posts together into a virtual unity.

I began by discussing the way the upbringing of today’s students may have encouraged them to think that the core function of adults, including their teachers and university administrators, is to protect them from discomfort.

I then argued that when these expectations are thwarted, or seem to be thwarted, students can become frustrated very quickly if they do not have good reason to trust their teachers; this is a primary cause of the demand for trigger warnings.

And that mistrust is exacerbated by the fact that, in general, American universities do not present themselves as places where one goes to seek wisdom, but as places where one goes to get credentials for future career success — a message students have received very clearly.

So when the universities seem not to be living up to their neoliberal promises, angry students don’t think of this as a situation that calls for political protests of the Sixties variety; rather, they are consumers upset about the product they have purchased, so they bypass the lower-level staff and complain to the managers.

And the managers (i.e. administrators) respond the way managers always respond when the customers complain.

But this is not an adequate response. Administrators and professors alike need to recall that one of their key tasks is to organize the university as a kind of mediating or transitional space between the Home and the Wide World that encourages students to develop a genuine public individuality.

This developmental process is not and cannot be perfectly safe: many of students’ core beliefs about self and world will come under challenge. But it can be done in a healthy way, as long as fears are properly acknowledged and dealt with; however, to return to an earlier theme, fear of harm can only be overcome when students have good reason to trust those who teach them.

As long as fear is greater than trust, it will be extremely difficult, if not impossible, to convince students that disagreement about foundational social and moral issues is not only acceptable, it is invaluable to individual and society alike. But to insist on this truth is the sine qua non of the current academic moment.

For this task, this insistence that there is something more and better than policing disagreement and building walls of separation between us and those who don’t see things our way, the humanities are invaluable: but they must recover some of their old moral robustness and commitment to the sovereign virtue of compassion.

If we want to get past this impasse of hostility and suspicion, we must remind ourselves, and then teach our students, that together we can travel better paths than that of neoliberal contractualism, which leads inevitably to code fetishism. We need not be such Baconian rationalists, such Weberian bureaucrats; and if we insist on living like that, if we forget that “there’s got to be a better way for people to live,” then all we have to look forward to is the academic equivalent of the shootout at the end of High Noon. But here in the real world there’s no way to tell who might win — if anyone does.

the last word on the humanities

The late Robert Nisbet used to tell a story about the academic scientist who was angrily accosted by his humanist colleague for “speaking against the humanities” at the previous day’s faculty meeting. Au contraire, said the other; he was doing nothing of the kind. “I love the humanities! I would die for the humanities! All I asked was — what the hell are the humanities?”

– “Defining the Humanities Up” by Wilfred M. McClay, First Things

But persistent or not, the myth of the unemployed humanities major is just that: a myth, and an easily disproven one at that. Georgetown University’s Center on Education and the Workforce has been tracking differences in the employment of graduates from various disciplines for years, demonstrating that all graduates see spikes and troughs in their employment prospects with the changing economy. And AAC&U’s employer surveys confirm, year after year, that the skills employers value most in the new graduates they hire are not technical, job-specific skills, but written and oral communication, problem solving, and critical thinking—exactly the sort of “soft skills” humanities majors tend to excel in.

Pax Scientia: Thanks, But I’ll Pass

Armand Marie Leroi is an evolutionary biologist — and also a scientific imperialist. No, that’s not an insult: it’s his own account of the matter.

Now, to be sure, Leroi says that in the conflict between science and the humanities “Hard words such as ‘imperialism,’ ‘scientism,’ and ‘vaulting ambition’ will be flung about,” because such words belong to “the vocabulary of anti-science.” But in the very same paragraph he claims that the only choices for the humanities are to pursue “a new kulturkampf” that they cannot win — because they are “weakened” by internal conflicts — or to “gratefully accept the peace imposed by science.” The really interesting word there is “imposed”: science is not offering peace, it is imposing it. Looks like for us humanists it’s Hobson’s choice.

And lest we think that that talk of “imposing” was an infelicitous turn of phrase, Leroi immediately extends it: “Under the Pax Scientia criticism will continue, but be tamed.” The imperium of science, or perhaps I should write Science, is today’s successor to that of Rome.

In Virgil’s Aeneid, the hero Aeneas descends into the underworld and meets the ghost of his father, who prophesies to him about the future of Rome. The “arts” of the Romans will be pacisque imponere morem, parcere subiectis, et debellare superbos — as Allen Mandelbaum renders it, “to teach the ways of peace to those you conquer, / to spare defeated peoples, tame the proud.” The language of taming in Leroi’s essay seems scarcely accidental.

So imperialism it is, then. I suppose I am supposed to be thankful that Leroi, in his great magnanimity, allows a barbarian, or perhaps a slave, like me to continue to do my work under the minatory tutelage of Science — especially since the alternative, I guess, is to end up like Spartacus and his fellow rebels. (That anti-Roman kulturkampf wasn’t such a great idea, guys.) After all, to offer any resistance whatsoever to the new imperium is to be “anti-science.”

[image: Spartacus on the cross]

the last humanist

Now, to Leroi’s credit, he understands, at least in a rudimentary way, that the kind of criticism often practiced by humanists differs pretty strongly from what can be revealed by running the numbers: “When Edmund Wilson tells us that Sophocles’s ‘Philoctetes’ is a parable on the association between deformity and genius; or when Arthur Danto says that Mark Rothko’s ‘Untitled (1960)’ is simply about beauty, then we are, it seems, in a realm of understanding where numbers, and the algorithms that produce them, have no dominion.” (Though even here he seems to forget that algorithms don’t emerge ex nihilo but are written by people.)

But Leroi doesn’t seem to grasp that much criticism — and much of the criticism that has mattered the most — isn’t concerned with assigning a one-phrase summary of the “meaning” of an entire work of art, but is rather intensely focused on the details that are too small and too distinctive for algorithmic attention. When Keats writes, “Now more than ever seems it rich to die,” what does “rich” connote? Might it be ironic? (After all, the ironic use of “rich” — “Oh, that’s rich” — goes back to the seventeenth century.) No algorithm can ever tell, because algorithms aggregate, and the question here is about a single unrepeatable instance of a word. Nor can any aggregated information tell us anything about the torn cloth at the elbow of the disciple in Caravaggio’s Supper at Emmaus, or the bizarre alternations of the madly driven rhythms and ethereal voices in the Confutatis of Mozart’s Requiem.

All this is not to say that “distant reading” isn’t valuable — it is, and I have defended the work of digital humanists who work algorithmically against know-nothing critiques — but rather that it’s not the only kind of humanistic work that’s valuable, and that critics who attend to the specific and unrepeatable are doing, and will continue to do, intellectually serious work.

Maybe they’ll be paid for that work in the future; maybe not. People who care about such things will still continue to attend to it, whether their overlords like it or not. An essay like Leroi’s is written by people who have access to money that humanists can’t dream of, who expect to have access to that money forever, and who think it gives them imperial powers.

In a famous essay George Orwell wrote about the headmaster of his old prep school who would say to charity students like Orwell, “You are living on my bounty!” — that seems to be Leroi’s attitude toward humanists. But sorry, I’m not accepting the terms of peace Leroi would dictate — and I don’t think he can impose them after all. The war between Apollo and Hermes will continue.

And one more thing: that Roman imperium, that Pax Romana? They thought that would last forever too.

Computational Humanities and Critical Tact

Natalie Kane writes,

When we talk about algorithms, we never talk about the person, we talk about the systems they operate on, the systems they facilitate, and then the eventual consequences that fall when they supposedly ‘malfunction’. Throwing this out into twitter, a friend and talented creative technologist Dan Williams succinctly reminded me that we should always be replacing the word ‘algorithms’ with ‘a set of instructions written by someone.’ They are made, and written, by human beings, just like all technology is (a general statement, I know, but it all began with us). By removing that element when talking about them we cease to have any understanding of the culture and social systems behind it, which arguably are the reasons why algorithms are having such a huge impact on our lives. If we’re going to have constructive conversations about the cultural and societal impact of algorithmically mediated culture, we have to always remember that there are humans in it, and not just at the other end of the black box feeling the vibrations. As Matthew Plummer-Fernandez pointed out in a panel on Data Doubles this year, the problems with confronting complex computation systems demands a socio-technical, not a purely technical solution.

This reminds me of a recent post I wrote about Andrew Piper’s current project and other projects in the digital (or computational) humanities. I’ve been thinking a lot about how to evaluate projects such as Andrew’s, and Kane’s post made me realize the extent to which I default to thinking about outcomes, results, answers — and not so much about the originating questions, the “set of instructions written by someone.”

Twenty years ago, Mark Edmundson commented that “Paul de Man is a more difficult critic to read and understand than a responsive, rather impressionistic essayist like Virginia Woolf. But one can teach intelligent students to do what de Man does; it is probably impossible to teach anyone to respond like Woolf if he has little aptitude for it.” Edmundson goes on to ask, “And how could we sustain the academic study of literature if we were compelled to say that a central aspect of criticism is probably not teachable?” — but while that’s an important question it’s not the one I want to pursue right now.

Rather, I’d like to focus on the indefinable skill he’s referring to here, which is, I think, what some people used to call critical tact — the knack for perceiving what themes or questions will, upon pursuit, yield fruitful thoughts. It seems to me that this tact is no less valuable in computational humanities work than in any other kind of critical work, and that we would do well to think about what the best questions look like, what shape or form they tend to take.

Edmundson is probably right to suggest that critical tact can’t be taught, but it can nevertheless be learned (no one is born with it). And maybe one way people in DH could learn it is by spending a lot of time thinking about and debating this: Who asks the best questions? Who writes the most provocative and fruitful instructions?

On Not Defending the Humanities

I don’t think that the humanities or the liberal arts can be defended, at least not in the sense that most people give to “defended.” Here’s why, starting with three texts on which I will build my explanation.

The English theologian Austin Farrer used to say that some Christian doctrines — he was thinking especially of the hypostatic union — cannot be defended, but can be homiletically expounded.

Similarly, in Whose Justice? Which Rationality? Alasdair MacIntyre writes,

In systematizing and ordering the truths they take themselves to have discovered, the adherents of a tradition may well assign a primary place in the structures of their theorizing to certain truths and treat them as first metaphysical or practical principles. But such principles will have had to vindicate themselves in the historical process of dialectical justification… Such first principles themselves, and indeed the whole body of theory of which they are a part, themselves will be understood to require justification. The kind of rational justification which they receive is at once dialectical and historical. They are justified insofar as in the history of this tradition they have, by surviving the process of dialectical questioning, vindicated themselves as superior to their historical predecessors.

And in The Abolition of Man C. S. Lewis writes,

Those who understand the spirit of the Tao and who have been led by that spirit can modify it in directions which that spirit itself demands. Only they can know what those directions are. The outsider knows nothing about the matter. His attempts at alteration, as we have seen, contradict themselves. So far from being able to harmonize discrepancies in its letter by penetration to its spirit, he merely snatches at some one precept, on which the accidents of time and place happen to have riveted his attention, and then rides it to death — for no reason that he can give. From within the Tao itself comes the only authority to modify the Tao. This is what Confucius meant when he said ‘With those who follow a different Way it is useless to take counsel’. This is why Aristotle said that only those who have been well brought up can usefully study ethics: to the corrupted man, the man who stands outside the Tao, the very starting point of this science is invisible. He may be hostile, but he cannot be critical: he does not know what is being discussed…. Outside the Tao there is no ground for criticizing either the Tao or anything else.

I think that these passages, read rightly, suggest a few things. First, that it’s probably impossible to defend the artes liberales, or the studia humanitatis, to people who are firmly outside their Tao — who simply do not acknowledge the value of the foundational commitments that have shaped the tradition. The person who relentlessly demands to know what kind of job a liberal-arts education will get him “may be hostile, but he cannot be critical.” And this is not a temporary or trivial impediment that can be maneuvered around; it’s an immoveable object.

But apologetics is not the only mode of suasion. In some cases the more appropriate rhetoric relies on narration and exposition: perhaps something as simple as telling the story of what we do. There are many places around the country where older models of the humanities are flourishing, even as those who have rejected this Tao are floundering — those older models having, as it were, vindicated themselves as superior to their historical successors, or would-be successors. But what happens in these institutions, in these classrooms, is simply invisible to people who question the value of the humanities. Rendering the invisible visible might be one of the best services those of us following those models could perform in “defense” of our practices.

I’m teaching a class called Philosophy Versus Literature and right now we’re working through Lucretius. Yesterday we talked about philosophy as therapy — drawing on Martha Nussbaum, among others — and the odd but, in the end, strong logic that leads Lucretius (following Epicurus) to believe that physics is first philosophy, that understanding the constitution of the world is the necessary first step towards being liberated from fear and unnecessary pain. Next time I’m going to tell the students about Stephen Greenblatt’s (very bad) book The Swerve, and ask them why a book about the early modern recovery of Lucretius became a bestseller and award-winner in 21st-century America. These conversations concern matters ancient and permanent and are also about as relevant as relevant can be, if that happens to be one of your criteria for educational value. Moreover, De Rerum Natura is a book that runs pretty strongly against the grain of all that my students believe and hope — but that does not deter them nor me from treating it with the utmost attentiveness.

I can’t defend the value of this kind of exercise in some kind of abstract, characterless intellectual space; nor am I inclined to. It makes sense within the Tao; and if you want to see what kind of sense it makes you have to think and act within the Tao. You have to take it on as a living option, not a desiccated proposition. To those who doubt the value of what I do, I probably have little more to say than: Taste and see. (But I’ll use more words to say it.)

Two kinds of critics in the world….

I’m very interested in Andrew Piper’s new work on the “conversional novel”:

My approach consisted of creating measures that tried to identify the degree of lexical binariness within a given text across two different, yet related dimensions. Drawing on the archetype of narrative conversion in Augustine’s Confessions, my belief was that “conversion” was something performed through lexical change — profound personal transformation required new ways of speaking. So my first measure looks at the degree of difference between the language of the first and second halves of a text and the second looks at the relative difference within those halves to each other (how much more heterogenous one half is than another). As I show, this does very well at accounting for Augustine’s source text and it interestingly also highlights the ways in which novels appear to be far more binarily structured than autobiographies over the same time period. Counter to my initial assumptions, the ostensibly true narrative of a life exhibits a greater degree of narrative continuity than its fictional counterpart (even when we take into account factors such as point of view, length, time period, and gender).
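Piper’s description stays at the level of prose, so here is a minimal sketch (mine, not his) of what such measures could look like in code. It assumes a plain bag-of-words representation, cosine distance, and an arbitrary split of each half into ten chunks for the within-half heterogeneity comparison; Piper’s actual implementation surely differs in its details.

```python
# A toy illustration of the two measures described above; my own sketch,
# not Piper's code. Assumptions: bag-of-words vectors, cosine distance,
# and halves subdivided into equal chunks to gauge internal heterogeneity.
from collections import Counter
from itertools import combinations
import math


def cosine_distance(a, b):
    """1 minus the cosine similarity of two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 1.0
    return 1.0 - dot / (norm_a * norm_b)


def chunk_counts(words, n_chunks):
    """Split a list of words into roughly equal bags of words."""
    size = max(1, len(words) // n_chunks)
    return [Counter(words[i:i + size]) for i in range(0, len(words), size)][:n_chunks]


def conversional_measures(text, n_chunks=10):
    words = text.lower().split()
    half = len(words) // 2
    first, second = words[:half], words[half:]

    # Measure 1: how different is the language of the first half from the second?
    cross_half = cosine_distance(Counter(first), Counter(second))

    # Measure 2: how internally heterogeneous is each half
    # (average pairwise distance among its chunks)?
    def heterogeneity(half_words):
        chunks = chunk_counts(half_words, n_chunks)
        pairs = list(combinations(chunks, 2))
        if not pairs:
            return 0.0
        return sum(cosine_distance(a, b) for a, b in pairs) / len(pairs)

    return {
        "cross_half_distance": cross_half,
        "first_half_heterogeneity": heterogeneity(first),
        "second_half_heterogeneity": heterogeneity(second),
    }
```

On this toy definition, a strongly “conversional” text would show a large cross-half distance relative to the heterogeneity within each half.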

Now — and here’s the really noteworthy part — Piper defines “conversional” largely in terms of lexical binaries. So a novel in which a religious conversion takes place — e.g., Heidi — is no more conversional than a novel whose pivot is not religious but rather involves a move from nature to culture or vice versa — e.g., White Fang.

As I say, I’m interested, but I wonder whether Piper isn’t operating here at a level of generality too high to be genuinely useful. Computational humanities in this vein strikes me as a continuation — a very close continuation — of the Russian formalist tradition: Vladimir Propp’s Morphology of the Folk Tale — especially the section on the 31 “functions” of the folk tale — seems to be a clear predecessor here, though perhaps the more radical simplifications of Greimas’s semiotic square are an even more direct model.

Two thoughts come to my mind at this point:

  1. I would love to see what would happen if some intrepid computational humanists got to work on filtering some large existing corpus of texts through the categories posited by Propp, Greimas, et al. Were the ideas that emerged from that tradition arbitrary and factitious, or did they draw on genuine, though empirically unsupported, insights into how stories tend to work?
  2. A good many people these days emphasize the differences between traditional and computational humanists, but the linkage I have just traced between 20th-century formalists and Piper’s current work makes me wonder if the more important distinction isn’t Darwin’s opposition of lumpers and splitters — those who, like Piper in this project, are primarily interested in what links many works to one another versus those who prefer to emphasize the haecceity of an individual thing … which of course takes us back to the ancient universals/particulars debate, but still….

UPDATE: via Ted Underwood on Twitter, I see some people are already moving towards Propp-inspired ways of doing DH: see here and here.

The University of Chicago Press is pleased to announce the launch of History of Humanities, a new journal devoted to the historical and comparative study of the humanities. The first issue will be published in the spring of 2016.

History of Humanities, along with the newly formed Society for the History of the Humanities, takes as its subject the evolution of a wide variety of disciplines including archaeology, art history, historiography, linguistics, literary studies, musicology, philology, and media studies, tracing these fields from their earliest developments, through their formalization into university disciplines, and to the modern day. By exploring these subjects across time and civilizations and along with their socio-political and epistemic implications, the journal takes a critical look at the concept of humanities itself.

Chicago to Publish New Journal: History of Humanities. I’m quite interested in this journal and look forward to reading it, but NB: of the 49 (!) Editors and Associate Editors, there is only one scholar of religion — a professor of Islamic Intellectual History — and no one in biblical studies or theology. And yet those disciplines have had some role to play in the history of the humanities, I dare say.

No matter the text that he’s ostensibly engaged with, Mr. Keating, like Hamlet in Stéphane Mallarmé’s wonderful description, is forever “reading in the book of himself.” This is what Keating’s namesake John Keats (referencing Wordsworth) called the “egotistical sublime.” Recently, some pioneering work in neuroscience has begun to suggest what English teachers have long known: that the power of literature is the power of alterity, creating the possibility of encountering the other in a form not easily recuperable, not easily assimilable to the self. “Imaginative sympathy,” we used to call it. To read literature well is to be challenged, and to emerge changed.

But for Keating, it’s the text (like Frost’s poem) that is changed, not the reader. He does the same thing to the Whitman poem “O Me! O Life!” that he recites to his students. Used as the voiceover for a recent iPad ad, Mr. Keating’s pep talk quotes the opening and closing lines of the poem, silently eliding the middle: “Oh me! Oh life! / of the questions of these recurring, / Of the endless trains of the faithless, of cities fill’d with the foolish, /…/ What good amid these, O me, O life? // Answer. // That you are here—that life exists and identity, / That the powerful play goes on, and you may contribute a verse.” He’s quoting from Whitman, he says, but the first line he omits is telling: “Of myself forever reproaching myself, (for who more foolish than I, and who more faithless?).” Go back and add that line to the quotation and see how it alters the whole. For Keating—and one fears, examining the scant evidence the film provides, for his students—every poem is a Song of Myself. This, then, is what’s at stake in Keating’s misreadings—I’m not interested simply in catching a fictional teacher out in an error. But he misreads both Frost and Whitman in such a way that he avoids precisely that encounter with the other, finding in poetry only an echo of what he already knows—what he’s oft thought, but ne’er so well expressed.

Dead Poets Society Is a Terrible Defense of the Humanities. The funny thing is, I think the movie actually demonstrates some of what’s wrong with Keating’s approach, but I don’t think I’ve ever met a fan of the film who notices that.

my course on the “two cultures”

FOTB (Friends Of This Blog), I have a request for you. This fall I’m teaching a first-year seminar for incoming Honors College students, and our topic is the Two Cultures of the sciences and the humanities. We’ll begin by exploring the lecture by C. P. Snow that kicked off the whole debate — or rather, highlighted and intensified a debate that had already been going on for some time — and the key responses Snow generated (F. R. Leavis, Lionel Trilling, Loren Eiseley). We’ll also read the too-neglected book that raised many of the same issues in more forceful ways a few years before Snow: Jacob Bronowski’s Science and Human Values.

Then we’ll go back to try to understand the history of the controversy before moving forward to consider the forms it is taking today. Most of the essays I’ll assign may be found by checking out the “twocultures” tag of my Pinboard bookmarks, but we’ll also be taking a detour into science/religion issues by considering Stephen Jay Gould’s idea of non-overlapping magisteria and some of the responses to it.

What other readings should I consider? I am a bit concerned that I am presenting this whole debate as one conducted by white Western men — are there ways of approaching these questions, from women or from people in other parts of the world, that might put the issues in a different light? Please make your recommendations in the comments below or on Twitter.

Thanks!

Reposted from Text Patterns

Four years after the original Nature paper was published, Nature News had sad tidings to convey: the latest flu outbreak had claimed an unexpected victim: Google Flu Trends. After reliably providing a swift and accurate account of flu outbreaks for several winters, the theory-free, data-rich model had lost its nose for where flu was going. Google’s model pointed to a severe outbreak but when the slow-and-steady data from the CDC arrived, they showed that Google’s estimates of the spread of flu-like illnesses were overstated by almost a factor of two.

The problem was that Google did not know – could not begin to know – what linked the search terms with the spread of flu. Google’s engineers weren’t trying to figure out what caused what. They were merely finding statistical patterns in the data. They cared about correlation rather than causation. This is common in big data analysis. Figuring out what causes what is hard (impossible, some say). Figuring out what is correlated with what is much cheaper and easier. That is why, according to Viktor Mayer-Schönberger and Kenneth Cukier’s book, Big Data, “causality won’t be discarded, but it is being knocked off its pedestal as the primary fountain of meaning”.

But a theory-free analysis of mere correlations is inevitably fragile. If you have no idea what is behind a correlation, you have no idea what might cause that correlation to break down. One explanation of the Flu Trends failure is that the news was full of scary stories about flu in December 2012 and that these stories provoked internet searches by people who were healthy. Another possible explanation is that Google’s own search algorithm moved the goalposts when it began automatically suggesting diagnoses when people entered medical symptoms.

Google Flu Trends will bounce back, recalibrated with fresh data – and rightly so. There are many reasons to be excited about the broader opportunities offered to us by the ease with which we can gather and analyse vast data sets. But unless we learn the lessons of this episode, we will find ourselves repeating it.

Big data: are we making a big mistake? – FT.com Fantastic essay by Tim Harford. I have a feeling that again and again the big data firms will insist that the data can “speak for itself” — simply because data gathering is what they can do.

There should be a lesson here also for those who believe that Franco Moretti-style “distant reading” can transform the humanities. The success of that endeavor too will be determined not by text-mining power but by the incisiveness of the questions asked of that data and the shrewdness of the conclusions drawn about it. What T. S. Eliot said long ago — “The only method is to be very intelligent” — remains just as true in an age of Big Data as it was when Eliot uttered it.

Defenders of the humanities claim a special role in training citizens for a democratic society and often have deeply felt convictions about democratizing knowledge and including new voices. The mainstream of humanities research has, however, focused upon virtuoso scholarship, published in subscription publications to which only academics have access, and composed for small networks of specialists who write letters for tenure and promotion and who meet each other for lectures and professional gatherings. Students in the humanities remain, to a very large degree, subjects of a bureaucracy of information, where they have no independent voice and where they never move beyond achieving goals narrowly defined by others. The newly re-engineered sciences have reorganized themselves to give students a substantive voice in the development of knowledge and to become citizens in a republic of learning. The STEM disciplines certainly have not fully realized these lofty ideals but they are far ahead of most of their colleagues in the humanities and in Greek and Latin studies.

For Thackeray and his contemporaries, literature is a public matter, a matter to be lectured upon before large audiences, a matter to be given importance because of its impact upon morals and emotions. For the present-day academic critic, literature no longer is a public matter but rather is a professional matter, even more narrowly, a departmental matter. The study of literature has become a special and separate discipline — housed in colleges of arts and sciences along with other special and separate disciplines. The public has narrowed to a group of frequently recalcitrant students whose need for instruction in English composition — not in English literature — justifies the existence of the English department… . People who want to become English professors do so because, at one point in their lives, they found reading a story, poem, or play to be an emotionally rewarding experience. They somehow, someway were touched by what they read. Yet it is precisely this emotional response that the would-be professor must give up. Of course, the professor can and should have those feelings in private, but publicly, as a teacher or publisher, the professor must talk about the text in nonemotional, largely technical terms. No one ever won a National Endowment for the Humanities grant by weeping copiously for Little Nell, and no one will get tenure in a major department by sharing his powerful feelings about Housman’s Shropshire Lad with the full professors.

— Brian McRae, from Addison and Steele Are Dead (1991)

The other thing for which I am grateful to philosophy is that, at least in the world in which I first sought to make a name for myself, one was required to write clearly, concisely, and logically. Wittgenstein said that whatever can be said can be said clearly, and that became something of a mantra for my generation. At one time, the British journal Analysis sponsored regular competitions: some senior philosopher propounded a problem, which one was required to solve in 600 words or less, the winner receiving as a prize a year’s subscription to the magazine. Here is an example of the kind of problem, propounded by J. L. Austin, that engaged Analysis’s subscribers: “What kind of ‘if’ is the ‘if’ in ‘I can if I choose?’” (Hint: it cannot be the truth-conditional “if” of material implication, as in, “If p, then q.”)

I tried answering all the problems, and never won a prize. But the exercise taught me how to write. The great virtues of clarity, concision, and coherence, insisted upon throughout the Anglo-American philosophical community, have immunized the profession against the stylistic barbarity of Continental philosophy, which, taken up as it has been since the early 1970s by the humanistic disciplines—by literary theory, anthropology, art history, and many others—has had a disastrous effect, especially on academic culture, severely limiting the ability of those with advanced education to contribute to the intellectual needs of our society. It is true that analytical philosophers, reinforced by the demands of their profession to work within their constricting horizons, have not directly served society by applying their tools to the densely knotted problems of men, to use Dewey’s term for where the energies of philosophy should be directed. At one point it became recognized that “clarity is not enough.” It is not enough. But the fact that it remains a stylistic imperative in most Anglo-American philosophy departments means that these virtues are being kept alive against the time when the humanities need to recover them.

— Arthur C. Danto (available to subscribers only, I think)

So Philology: to preserve, monitor, investigate, and augment our cultural inheritance, including the various material means by which it has been realized and transmitted. The scholar Aleida Assmann recently said of the philologian’s archive (as opposed to the critic’s and scholar’s canon) that it is populated by materials that “have lost their immediate addresses; they are decontextualized and disconnected from the former frames which had authorized them or determined their meaning. As part of the archive, they are open to new contexts and lend themselves to new interpretations.” So we may arrive at “counter-memories” and “history against the grain.” But in order to lie so open, they must be held at a level of attention that is consciously marked as useless and nonsignificant. There need be no perceived use or exchange value in the philological move to preserve beyond the act of preservation itself. Value is not only beyond present conception, it is understood that it may never again acquire perceived value. “Never again” is crucial. For the philologian, materials are preserved because their simple existence testifies that they once had value, though what that was we may not — may never — know. As the little girl in Wordsworth’s “We are Seven” understood, the dead are precious as such. If we are living human lives, they are not even dead.

Space colonies. That’s the latest thing you hear, from the heralds of the future. President Gingrich is going to set up a state on the moon. The Dutch company Mars One intends to establish a settlement on the Red Planet by 2023. We’re heading towards a “multi-planetary civilization,” says Elon Musk, the CEO of SpaceX. Our future lies in the stars, we’re even told.

As a species of megalomania, this is hard to top. As an image of technological salvation, it is more plausible than the one where we upload our brains onto our computers, surviving forever in a paradise of circuitry. But not a lot more plausible. The resources required to maintain a colony in space would be, well, astronomical. People would have to be kept alive, indefinitely, in incredibly inhospitable conditions. There may be planets with earthlike conditions, but the nearest ones we know about, as of now, are 20 light years away. That means that a round trip at 10 percent the speed of light, an inconceivable rate (it is several hundred times faster than anything we’ve yet achieved), would take 400 years.

But never mind the logistics. If we live long enough as a species, we might overcome them, or at least some of them: energy from fusion (which always seems to be about 50 years away) and so forth. Think about what life in a space colony would be like: a hermetically sealed, climate-controlled little nothing of a place. Refrigerated air, synthetic materials, and no exit. It would be like living in an airport. An airport in Antarctica. Forever. When I hear someone talking about space colonies, I think, that’s a person who has never studied the humanities. That’s a person who has never stopped to think about what it feels like to go through an average day—what life is about, what makes it worth living, what makes it endurable. A person blessed with a technological imagination and the absence of any other kind.

I had been interviewing her for the book I wrote about the Holocaust, and by the time I had the brief conversation I am about to tell you about, I knew of some of the things she had suffered, and I can tell you that they were the kind of things that strip all sentimentality away, that do not leave you with any mushy illusions about the value of Western culture, of “civilization” and its traditions. I just want you to know that before I tell you what she told me, which was this:

We had spent a long day together, talking about sad things — talking, in fact, about the erasure of a culture, the unmaking of a civilization, as total and final as what Vergil described in Book 2 of the Aeneid; there are, after all, as many Jews left today in the city this lady had lived in as there are Trojans in Troy. We were tired, night was falling. Because I didn’t want to leave her with sad thoughts that night, any more than I want to leave you with sad thoughts today, I tried to lead the conversation to a happier time.

“So what happened when the war was over?” I asked softly. “What was the first thing that happened, once things started to be normal again?”

The old lady, whose real name had disappeared in the war along with her parents, her house, and nearly everything else she had known, was now called Mrs. Begley. When I asked her this question Mrs. Begley looked at me; her weary expression had kindled, ever so slightly. “You know, it’s a funny thing,” she told me. “When the Germans first came, in ’41, the first thing they did was close the theaters.”

“The theaters?,” I echoed, a bit confused, not dreaming where this could be leading. I didn’t know which theaters she meant; I thought, briefly, of the great Beaux Arts Pantheon of an opera house in Lwów, with its Muse-drawn chariots and gilded victories, a mirage of civilization, the 19th century’s dream of itself. She had told me, once, that she had seen Carmen there, when she was a newlywed. But she wasn’t talking, now, about operas in Lwów, or about the 1930s; she was talking about theaters in Kraków, where she and her husband and child ended up once the war was over.

“Yes,” she said, sharply, as if it ought to have been obvious to me that the first thing you’d do, if you were about to end a civilization, would be to put an end to playgoing, “the theaters. The first thing they did was close the theaters. And I’ll tell you something, because I remember it quite clearly: the first thing that happened, after the war was over and things got a little normal — the first thing was that the actors and theater people who were still alive got together and put on, in Polish, a production of Sophocles’ Antigone.”

— UC Berkeley Classics Department: 2009 Commencement Address by Daniel Mendelsohn. Please, please read the whole thing, if you have any interest in what makes the humanities human.

Well, I’m basically a literary and philosophical humanist myself, not a journalist or scholar or expert of any kind, so I do personally regret that people like me don’t have and never again will have the cultural authority that the New York intellectuals had. But history has moved on, and there’s still a place, after all, for us humanists to practice the honorable activity of applying the really matchless moral resources of literature and the philosophical tradition to criticizing society and culture. Still, the work on the front lines now needs to be done by others, the investigative journalists and maverick scholars, people who can do deep reporting or work in the archives. Those are activities that classical public intellectuals—Bourne, Russell, Camus, Sartre, Silone, Nicola Chiaromonte—didn’t have the time or the temperament for. So even though I personally will never be a Glenn Greenwald or a Noam Chomsky, I’m supremely grateful to them. They’re doing what I think needs above all to be done. Cultivating and expounding the humanities will always be essential to the moral health of society. But politically speaking, the literary intellectual is now a kind of auxiliary.

George Scialabba. Someday I’ll find time to explain why I don’t agree.

In English we speak about science in the singular, but both French and German wisely retain the plural. The enterprises that we lump together are remarkably various in their methods, and also in the extent of their successes. The achievements of molecular engineering or of measurements derived from quantum theory do not hold across all of biology, or chemistry, or even physics. Geophysicists struggle to arrive at precise predictions of the risks of earthquakes in particular localities and regions. The difficulties of intervention and prediction are even more vivid in the case of contemporary climate science: although it should be uncontroversial that the Earth’s mean temperature is increasing, and that the warming trend is caused by human activities, and that a lower bound for the rise in temperature by 2200 (even if immediate action is taken) is two degrees Celsius, and that the frequency of extreme weather events will continue to rise, climatology can still issue no accurate predictions about the full range of effects on the various regions of the world. Numerous factors influence the interaction of the modifications of climate with patterns of wind and weather, and this complicates enormously the prediction of which regions will suffer drought, which agricultural sites will be disrupted, what new patterns of disease transmission will emerge, and a lot of other potential consequences about which we might want advance knowledge. (The most successful sciences are those lucky enough to study systems that are relatively simple and orderly. James Clerk Maxwell rightly commented that Galileo would not have redirected the physics of motion if he had begun with turbulence rather than with free fall in a vacuum.)

The emphasis on generality inspires scientific imperialism, conjuring a vision of a completely unified future science, encapsulated in a “theory of everything.” Organisms are aggregates of cells, cells are dynamic molecular systems, the molecules are composed of atoms, which in their turn decompose into fermions and bosons (or maybe into quarks or even strings). From these facts it is tempting to infer that all phenomena—including human actions and interaction—can “in principle” be understood ultimately in the language of physics, although for the moment we might settle for biology or neuroscience. This is a great temptation. We should resist it. Even if a process is constituted by the movements of a large number of constituent parts, this does not mean that it can be adequately explained by tracing those motions.

The Trouble with Scientism. An incredibly important and provocative essay by Philip Kitcher.

I’ve increasingly felt that digital journalism and digital humanities are kindred spirits, and that more commerce between the two could be mutually beneficial. That sentiment was confirmed by the extremely positive reaction on Twitter to a brief comment I made on the launch of Knight-Mozilla OpenNews, including from Jon Christensen (of the Bill Lane Center for the American West at Stanford, and formerly a journalist), Shana Kimball (MPublishing, University of Michigan), Tim Carmody (Wired), and Jenna Wortham (New York Times).

Here’s an outline of some of the main areas where digital journalism and digital humanities could profitably collaborate. It’s remarkable, upon reflection, how much overlap there now is, and I suspect these areas will only grow in common importance.

Dan Cohen’s Digital Humanities Blog: Digital Journalism and Digital Humanities. Dan’s list of points of convergence is extremely smart and extremely provocative.

Humanists can be private educators and public spies. But the latter role is far too rare, because humanist intellectuals do not see themselves as practitioners of daily life. Their disparagement comes largely from their own isolation within the institutions that reproduce them, a fate many humanists despise out of one side of their mouths while endorsing it with the other. The humanist corner of the university becomes, in Palmquist’s words, “just a safe haven for half-witted thinkers to make a comfortable living.”

The humanities needs more courage and more contact with the world. It needs to extend the practice of humanism into that world, rather than to invite the world in for tea and talk of novels, only to pat itself on the collective back for having injected some small measure of abstract critical thinking into the otherwise empty puppets of industry. As far as indispensability goes, we are not meant to be superheroes nor wizards, but secret agents among the citizens, among the scrap metal, among the coriander, among the parking meters. We earn respect by calling in worldly secrets, by making them public. The worldly spy is the opposite of the elbow-patched humanist, the one never out of place no matter the place. The traveler at home everywhere, with the luxury to look.

Only when the humanities can earn their own keep will they be respected in modern America. And that will only happen when you convince the majority of people to be interested, of their own volition, rather than begging or guilting them into giving you that money to translate your obscure French poem on vague grounds of “caring about culture.” So either figure something out, or shut up and accept that the humanities are an inherently elite activity that will rely on feudal patronage. Just like they always have. (If you think of Maslow’s hierarchy, it’s obvious why the leisure class, which generally has money, sex, food, and security taken care of, has been in charge of learning.)

You have no idea how much it pains me to say this, but speaking from experience I now believe that private industry is doing a better job of communicating, persuading, innovating, of everything the university has stopped doing. I do not take this as indicator of how well capitalism works, I take it as an indicator of how badly universities have failed, while still somehow aping the worst aspects of corporate capitalism.

Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.

— Views: Sorry – Inside Higher Ed. I like much of this, but it is very, very wrong to suggest that there is a Western tradition — there is not even a Christian tradition. Western culture is largely a series of arguments conducted by proponents of different traditions that had their origin in or near the Western orbit. Christianity likewise has, intellectually speaking, been an extended debate about what it means to follow Jesus Christ and to belong to Him.

The great thing about art is that it’s there whether the academic humanities are or not. And if you make the study of literature and the arts your own, which is incredibly easy to do with the internet being what it is, it means so much more than if you passively imbibe received wisdom in classrooms for grades. The academic humanities may have committed slow suicide – but the art is alive and well, much of it many centuries old. It’s freely available to us. It’s ours.

The American corporate model looks a little battered at the moment, while American universities have become paragons of learning to which all the world aspires. Does it really make sense to refashion Harvard in the image of GM or BP? For all the problems tenure causes, it has proved its value over time—and not only, or mainly, as a way of protecting free speech. Sometimes, basic research in humanities, social science and natural science pays off quickly in real-world results. More often, though, it takes a generation or so for practical implications to become clear. That’s how long it took, for example, for new research (most of it done in universities) which showed how central slavery was to both the life of the South and the outbreak of the Civil War, to transform the way public historians present the American past at historical sites. That’s how long it will probably take for the genomic research that is currently exploding to have a practical impact on medical treatment. Basic research doesn’t immediately fatten the bottom line, even in the fiscal quarter when results are announced. Many corporations have cut or withdrawn their support for it, on strictly economic grounds. In earlier decades, AT&T (later Lucent Technologies), RCA, Xerox and other industrial companies did a vast amount of basic research. AT&T’s Bell Labs, for one, created the transistor and the photovoltaic cell, and mounted the first TV and fax transmissions. But funding fell and corporate priorities changed—and they have shrunk in every sense ever since. Just one thousand employees walk the darkened corridors of Bell Labs, down from a staff of thirty thousand in 2001. We need universities, and tenured professors, to carry on the basic research that most corporations have abandoned. What we don’t need is for universities to adopt the style of management that wrecked the corporate research centers.

As for the argument that the humanities don’t pay their own way, well, I guess that’s true, but it seems to me that there’s a fallacy in assuming that a university should be run like a business. I’m not saying it shouldn’t be managed prudently, but the notion that every part of it needs to be self-supporting is simply at variance with what a university is all about. You seem to value entrepreneurial programs and practical subjects that might generate intellectual property more than you do ‘old-fashioned’ courses of study. But universities aren’t just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment. There is good reason for it: what seems to be archaic today can become vital in the future. I’ll give you two examples of that. The first is the science of virology, which in the 1970s was dying out because people felt that infectious diseases were no longer a serious health problem in the developed world and other subjects, such as molecular biology, were much sexier. Then, in the early 1990s, a little problem called AIDS became the world’s number 1 health concern. The virus that causes AIDS was first isolated and characterized at the National Institutes of Health in the USA and the Institute Pasteur in France, because these were among the few institutions that still had thriving virology programs. My second example you will probably be more familiar with. Middle Eastern Studies, including the study of foreign languages such as Arabic and Persian, was hardly a hot subject on most campuses in the 1990s. Then came September 11, 2001. Suddenly we realized that we needed a lot more people who understood something about that part of the world, especially its Muslim culture. Those universities that had preserved their Middle Eastern Studies departments, even in the face of declining enrollment, suddenly became very important places. Those that hadn’t – well, I’m sure you get the picture.

The rigid scripting of childhood and adolescence has made young Americans risk- and failure-averse. Shying away from endeavors at which they might not do well, they consider pointless anything without a clear application or defined goal. Consequently, growing numbers of college students focus on higher education’s vocational value at the expense of meaningful personal, experiential, and intellectual exploration. Too many students arrive at college committed to a pre-professional program or a major that they believe will lead directly to employment after graduation; often they are reluctant to investigate the unfamiliar or the “impractical”, a pejorative typically used to refer to the liberal arts. National education statistics reflect this trend. Only 137 of the 212 liberal arts colleges identified by economist David Breneman in his 1990 article “Are we losing our liberal arts colleges?” remain, and an American Academy of Arts and Sciences study reported that between 1966 and 2004, the number of college graduates majoring in the humanities had dwindled from 18 percent to 8 percent.

Ironically, in the rush to study fields with clear career applications, students may be shortchanging themselves. Change now occurs more rapidly than ever before and the boundaries separating professional and academic disciplines constantly shift, making the flexibility and creativity of thought that a liberal arts education fosters a tremendous asset. More importantly, liberal arts classes encourage students to investigate life’s most important questions before responsibilities intervene and make such exploration unfeasible. More time spent in college learning about the self and personal values means less floundering after graduation. Despite the financial or, in some cases, immigration-status reasons for acquiring undergraduate vocational training, college still should be a time for students to broaden themselves, to examine unfamiliar ideas and interests, and to take intellectual risks. Otherwise, students graduate with (now dubious) career qualifications but little idea of who they are or who they want to be.

