...

Stagger onward rejoicing

Tag: humanities

The Decline of Liberal Arts and Humanities – WSJ:

The liberal arts are dead. The number of students majoring in liberal arts has fallen precipitously with data from the National Center for Education Statistics showing the number of graduates in the humanities declined by 29.6% from 2012 to 2020. This decline has worsened in the years since. Notre Dame has seen 50% fewer graduates in the humanities over the same period, while other schools have made headlines recently for cutting liberal-arts majors and minors including Marymount University and St. John’s University. The shuttering of liberal-arts programs has even led to Catholic colleges and universities ending theology programs. 

That’s Danielle Zito, one of several participants in this conversation. I’ll just say once more what I always say: The liberal arts, and the humanities, do not live only or even primarily in universities. They can, and they do, flourish elsewhere, among people who ain’t got time for academic bullshit. 

in brief

If I were a journalist and given the task that Nathan Heller had, here’s the primary (though not the only) thing I’d have done: 

First, I’d have approached professors of English and related humanities fields and said this: “Enrollments in your discipline have been declining for quite some time, and declining dramatically — a decline I assume you’d like to reverse. What do you do in your classes? That is, how do you teach the books you assign? What are your key concerns, your primary educational purposes? What do you most want for your students? And why should they want for themselves what you want for them?” 

Then, having gathered and sorted through answers, I’d have approached students and told them what the professors had said in reply to my questions, and I’d have asked: “Does that sound like something you’d want to pursue?” 

Nathan Heller didn’t do any of this, as far as I can tell. 

burn after reading

Dear colleagues, 

I must congratulate you all on what is, so far, a perfect execution of our Plan. You will recall that when we first met, more than a decade ago, we found ourselves confronted with a dramatic decline in enrollment in university humanities courses — throughout the Western world, but especially in the U.S.A. The self-declared radicals who dominated teaching in the humanistic disciplines seemed determined to alienate students as thoroughly as possible from literature, philosophy, and the arts; meanwhile, parents were frantically pushing their offspring towards courses in business and computer science. Very few young readers and thinkers could resist this double discouragement, especially since the forces doing the discouraging seemed in other respects to stand for opposing visions of what the world should be.

We quickly came to agreement on two points: first, that our chances of restoring the university humanities to their proper calling were so small that we could scarcely justify expending any effort in that direction; and second, that in any case what matters in the long term is not the university disciplines but rather the cultural achievements that those disciplines once cared for: the novels and plays and poems, the treatises and dialogues, the sonatas and symphonies, the paintings and sculptures and beautifully designed buildings.

The key moment in our deliberations, as I recall, came when one of you reminded us of a (probably apocryphal) statement by the novelist Stendhal, who upon eating ice cream for the first time declared, “This is perfectly delicious. What a pity it isn’t forbidden.” 

What a pity it isn’t forbidden. With that thought our Plan was born. The key, we realized, was to transform the works we love from objects of praise to objects of suspicion: things that required “trigger warnings” and deserved skeptical critique — perhaps utter denunciation for racism or homophobia or ableism or … anything else we could think of. 

Of course, we had to be careful — we had to work by suggestion and implication. We thought that if we made these accusations directly and explicitly we would be laughed at. Looking back, we can see that our caution was in one sense unnecessary: in this environment, no charge against great works of art could possibly be too outrageous. Still, our caution has served us well: We whispered the quiet part, and our colleagues eagerly said the quiet part out loud. Soon enough they were pronouncing their fatwas day in and day out. 

What a pity it isn’t forbidden — the universal human desire for what we are told to hate and despise is our greatest ally. If we persist in our efforts, perhaps one day even Bach will be wholly excluded from concerts, even Shakespeare from theaters, even Homer and Dante from literature classes … and then the Renewal can at last begin. 

Yours in the Great Cause, 

Comrade Gamma 

education, more or less

Andrew Delbanco:

There is a third — and perhaps the deepest — problem with the futuristic vision of education advanced by “technologically enabled delivery”: the debilitating fact that it rests on a narrow, positivistic conception of knowledge. In this view, all teaching is training, and all learning is a quest for competence: the mastery of some field whose practitioners can expect compensation for their proficiency or expertise. No one should dispute that colleges have a vital responsibility to prepare students for the world of work — to provide them with what the political scientist Benjamin Ginsberg calls “more or less sophisticated forms of vocational training to meet the needs of other established institutions in the public and private sectors.” In fact, preparation for economic productivity has been the main aim of universities since the decline of prescribed curricula in the 19th century, when the introduction of electives and, later, majors aligned what students chose to study in college with the work they planned to do after. Over the past 50 years, as students from economically insecure families entered college in growing numbers, this alignment has only become tighter, including at elite institutions that serve predominantly affluent students. “It is a shame,” Ginsberg writes, “when that is all that the university offers.” “All” is an exaggeration, but at more and more institutions it’s a fair approximation.

What’s increasingly rare in higher education, and almost entirely missing from writings about its future, is a more than nominal commitment to the value of learning undertaken in the hope of expanding the sympathetic imagination by opening the mind to contesting ideas about nature and history, the power of literature and art, and the value of dialectic in the pursuit of truth. These aspirations — traditionally gathered under the term “liberal education” — are in desperate need of revival. To advance them requires teachers and institutions committed to a more capacious vision of education than the prevailing idea of workforce training and economic self-advancement. 

There will always be many people who want more from their education than “workforce training and economic self-advancement,” but they may not want it from universities. They may perceive — and surely one could not blame them for coming to this conclusion — that the modern Western university is incapable of providing anything else. And in that case they’ll continue to seek credentials from universities but look to private instruction or para-academic organizations for education. 

William Davies:

Before March 2020, I was unfamiliar with the phenomenon of ‘guided reading’. My daughter (aged eight during the school closures that year) was sometimes required to read the same short passage five days in a row and to perform different tasks in relation to it. Presumably the idea was for her to learn how specific sentence constructions work, in the hope that she would be able to apply that knowledge elsewhere – but the invitation to write autonomously, beyond a sentence or two, never arrived. It wasn’t merely the emphasis on obscure grammatical concepts that worried me, but the treatment of language in wholly syntactical terms, with the aim of distinguishing correct from incorrect usage. This is the way a computer treats language, as a set of symbols that generates commands to be executed, and which either succeeds or fails in that task.

This vision of language as code may already have been a significant feature of the curriculum, but it appears to have been exacerbated by the switch to online teaching. In a journal article from August 2020, ‘Learning under Lockdown: English Teaching in the Time of Covid-19’, John Yandell notes that online classes create wholly closed worlds, where context and intertextuality disappear in favour of constant instruction. 

Almost every structural element of Western education, on all levels, militates against humane learning. 

the whys and wherefores of humanistic study

Louis Menand:

Reading Weinstein and Montás, you might conclude that English professors, having spent their entire lives reading and discussing works of literature, must be the wisest and most humane people on earth. Take my word for it, we are not. We are not better or worse than anyone else. I have read and taught hundreds of books, including most of the books in the Columbia Core. I teach a great-books course now. I like my job, and I think I understand many things that are important to me much better than I did when I was seventeen. But I don’t think I’m a better person.

Sure, but have you tried to be?

I don’t want to be too flippant here. I have myself, many times, argued against the claim that reading great books is intrinsically ennobling. My argument has been that how we read — with whom, and with what purposes in mind — matters more than what we read. But I think it’s both possible and desirable for students and teachers to read and study together in search of wisdom. And that’s largely what Montás and Weinstein argue for. I do think they are rather too reverent about the books officially designated as Great, rather too confident in the value of mere exposure to such books. Still, the purposes of education, of teaching and learning, are their chief theme.

They also — Menand complains about this at length — say that in the modern university the practices of reading associated with those true purposes of education are either marginalized or forbidden. Menand actually quotes Montás’s argument that “Too often professional practitioners of liberal education — professors and college administrators — have corrupted their activity by subordinating the fundamental goals of education to specialized academic pursuits that only have meaning within their own institutional and career aspirations.” But isn’t it pretty obvious that someone who thinks this will not, indeed cannot, also think “that English professors, having spent their entire lives reading and discussing works of literature, must be the wisest and most humane people on earth”?

Menand is so transparently impatient with the arguments of Montás and Weinstein that he gets similarly confused at several points in his essay. For instance, to Montás’s claim that Nietzsche is “Satan’s most acute theologian,” Menand replies that “Nietzsche wanted to free people to embrace life, not to send them to Hell. He didn’t believe in Hell. Or theology” — or, presumably, Satan. But maybe Montás believes in Satan, which would, surely, be the point.

Let’s try to focus on the key issue here: Both authors under review couldn’t be clearer that their main argument is not that we should all be reading great books but that we should be reading great books in light of what they believe to be “the fundamental goals of education.” So the proper question to ask is not, “Will reading great literature make me a better person?” The proper question is, “Will reading great literature in company with others who are in search of self-knowledge, meaning, and wisdom make me a better person?” (Answer: Maybe not. And not automatically. But it does give you certain opportunities for becoming wiser if in a disciplined way you choose to take them.)

Why does Menand teach in a university? I read his essay three times looking for evidence, and the only thing I find is this: “The university is a secular institution, and scientific research — more broadly, the production of new knowledge — is what it was designed for.” (He sets this in opposition to the argument of the writers he reviews that it’s vital to read for self-knowledge.) He says that “Humanists … should not be fighting a war against science. They should be defending their role in the knowledge business.”

With this point I’m in complete agreement. Wissenschaft should not be simply dismissed in favor of Bildung, and there can be real Wissenschaft in the humanities (though I’m not sure many of the critics and theorists Menand defends are interested in either educational tradition). I agree with Tom Stoppard’s A. E. Housman, in The Invention of Love:

A scholar’s business is to add to what is known. That is all. But it is capable of giving the very greatest satisfaction, because knowledge is good. It does not have to look good or even sound good or even do good. It is good just by being knowledge. And the only thing that makes it knowledge is that it is true. You can’t have too much of it and there is no little too little to be worth having. There is truth and falsehood in a comma.

Among my own writings, the ones I am proudest of are the ones — like my Auden editions — in which I am clearly if in a small way adding to our store of knowledge.

So two cheers, at least, for Wissenschaft. The chief point I’m wanting to make here is simply this: There’s something rather peculiar about a scholar who proudly disavows using professional teaching and study for personal moral formation and then says, as though he’s clinching a point, that his professional teaching and study have not contributed to his personal moral formation.

indices

Lots of enjoyable coverage of Dennis Duncan’s new and wonderfully titled Index, A History of the. (You may find a brief excerpt from it as a work-in-progress here.)

The best thing I’ve seen is (unsurprisingly) by Anthony Grafton in the LRB. Here’s a noteworthy passage:

Indexing flowered at a time when readers saw books as great mosaics of useful tags and examples that they could collect and organise for themselves. The chief tool they used to do this job was not the index but the commonplace book: a notebook, preferably organised by subject headings known as loci communes (common places), which provided both a material space in which to store material and a set of designators to help retrieve it. Like indexes, commonplace books were wildly popular. John Foxe, a corrector of the press during his exile years in Switzerland, printed two editions of a blank commonplace book designed to be filled in by users under Foxe’s various headings. One such user, Sir Julius Caesar, crammed his copy so full of notes and excerpts that the British Library classifies it as a manuscript. In his preface to the first edition, Foxe made clear that he knew he was riding a wave. He worked for Johannes Oporinus, the Basel printer who brought out the Magdeburg Centuries, a vast Protestant history built on commonplace books compiled by students. Foxe pointed out, with relish, that everyone who was anyone was making commonplace books and telling others how to do so. Humanists commonplaced to ready themselves for effective speech and writing, medical men, lawyers and theologians to prepare for their professions. Foxe argued that the commonplace book was vastly superior to the index. The index might point you to the materials you need, but the commonplace book actually provided them, handily organised. Holmes, like his contemporary Aby Warburg (another dab hand with scissors and paste), was a late master of the commonplace book, even if he called it an index. If printing made indexes easy to compile, commonplacing made them indispensable to printers and readers alike. 

In response to the TLS’s review, my dear friend Tim Larsen wrote in with the following: 

I can confirm that one does indeed need a human indexer with a decent knowledge of the topic at hand. I thought I could safely ignore the index for my Oxford Handbook of Christmas (2020), but when it came back with an entry on “Joseph, father of Jesus,” I knew I had to step in. Mercifully, it went to press as “Joseph, husband of Mary.” As to humour in indexes, one of my favourites is E. P. Sanders’s Paul and Palestinian Judaism (1977). Its index has an entry for “Truth, ultimate.” The three pages listed under it lead one to the blank pages that divide the book’s parts. 

Let me register a vote for My Favorite Indexer: Hugh Kenner. In some of his books he not only tabulated references to key figures, he offered memorable single-word descriptions of many of them. A few examples from his book on Irish writers, A Colder Eye:

[two images: photographed pages from Kenner’s index to A Colder Eye]

Of course, the delight here is turning to the appropriate pages to find out just how Kenner settled on the identifications he chose.  

Permanent Crisis

Paul Reitter and Chad Wellmon’s Permanent Crisis: The Humanities in a Disenchanted Age will be on sale tomorrow, and it’s an absolutely essential book for anyone who cares about the humanities — or even just thinks about the humanities. You may read the introduction (as a PDF) here, and also read an interview with Paul and Chad here. I expect to have more to say about the book later, but just for now I want to make a couple of introductory points about what I consider to be the two chief themes of the book — themes woven together skillfully in the development of the argument. 

The first key theme is obvious from the title: It is the nature of the modern humanities to be always in crisis. From the Introduction: 

One of our chief claims is that the self-understanding of the modern humanities didn’t merely take shape in response to a perceived crisis; it also made crisis a core part of the project of the humanities. The humanities came into their own in late nineteenth-century Germany by being framed as, in effect, a privileged resource for resolving perceived crises of meaning and value that threatened other cultural or social goods as well. The perception of crisis, whether or not widely shared, can focus attention and provide purpose. In the case of the humanities, the sense of crisis has afforded coherence amid shifts in methods and theories and social and institutional transformations. Whether or not they are fully aware of it, for politically progressive and conservative scholars alike, crisis has played a crucial role in grounding the idea that the humanities have a special mission. Part of the story of why the modern humanities are always in crisis is that we have needed them to be. 

I don’t think you could read this book with any care and come away doubting the truth of this claim. The second major claim I want to call attention to may seem at first to be rather different, but in fact is closely related to the first. From Chapter Seven:

For decades, the humanities have arrogated to themselves critique and critical thinking, and thus they asserted a privileged capacity to demystify, unmask, reveal, and, ultimately, liberate the human from history, nature, or other humans. Whether as Judith Butler’s high-theory posthumanism or Stephen Greenblatt’s historicist communion with the dead, the humanities have claimed sole possession of critique and cast themselves as custodians of human value. In order to legitimize such claims and such a self-understanding, the modern humanities needed the “disenchantment of the world” and needed as well to hold the sciences responsible for this moral catastrophe. Only then could their defenders position themselves as the final guardians of meaning, value, and human being.

The success of the sciences — and more particularly, I would say, of the technocracy that arose from the explosion of scientific knowledge over the past two hundred years — provides the entire context for understanding the character of the modern humanities: the account of the virtues and goods the humanities claim to be the unique guardians of, and the account of their guardianship as being under constant existential threat. 

But this raises a crucial and uncomfortable question: Can the humanities relinquish their crisis narrative without also relinquishing their unique guardianship of humane values such as “critical thinking”? We could scarcely envy a model of humanistic learning that wasn’t in crisis only because, like Othello, its occupation’s gone. 

In the interview linked to above, Len Gutkin asks whether the humanities can get along without a crisis narrative, and Reitter replies: “Can the humanities do without crisis talk? Probably not, unless there was some massive reorganization of society where there didn’t seem to be a fundamental tension between the pace of capitalism and the pace of humanistic thinking.” This is perhaps more bluntly pessimistic than the book itself, which tries in its Conclusion to suggest some resources that could lead to a Better Way, notably (a) Edward Said’s reading of Erich Auerbach and (b) Max Weber’s Wissenschaft als Beruf — which Reitter and Wellmon have recently edited, in a new translation by Damion Searls.

I’d like to make another suggestion, based on a theme Reitter and Wellmon contemplate early in Permanent Crisis but don’t pursue at the book’s end:  

Recent efforts among scholars to establish the history of humanities as a distinct field started with a question: “How did the humanities develop from the artes liberales, via the studia humanitatis, to modern disciplines?” Our question is slightly different: Have the continuities linking the humanist scholarship of the faraway past to that of today been stretched thin? Or have they, or some of them, remained robust? These are, of course, big questions, and we won’t treat them comprehensively, let alone try to resolve them. But we do begin with the premise that the continuities between the modern, university-based disciplines collectively known as the humanities and earlier forms of humanist knowledge such as the studia humanitatis have been exaggerated. The modern humanities are not the products of an unbroken tradition reaching back to the Renaissance and, ultimately, to Greek and Roman antiquity. 

I think this is correct. But I also think Reitter and Wellmon are correct when they write, elsewhere in the book, “The current institutional arrangement of university-based knowledge — with its particular norms, practices, ideals, and virtues — was not necessary; it could have been otherwise.” 

So here’s what I’m getting at: It may well be that the humanities are chained permanently to their crisis narrative as long as they function within “the current institutional arrangement of university-based knowledge.” That is: If we do not get the kind of “massive reorganization of society” that Reitter mentions, it’s likely that the humanities can only have a self-understanding not dependent on a crisis narrative if they learn to operate outside the structures of the modern research university. And if that were to happen, then the old ways of the studia humanitatis might turn out to be more relevant than they have been in the past several centuries.

China’s intellectual future

Epistemic confidence level: very low. I don’t really know what I’m talking about here, though I’m trying to know more.


In an essay published a few months ago, I looked for a way beyond what I call the Standard Critique of Technology, and suggested that one such way could be found in certain Daoist ideas about the human-built world. And I speculated that, as China grows in power and influence, and insofar as the Chinese technosphere draws upon elements of ancient intellectual traditions, there is at least the possibility that some of those ideas could make their way to the U.S. 

In light of all that, I read with great interest this essay in the LRB by Peter C. Perdue. The essay is essentially an overview of the work of the Chinese historian Ge Zhaoguang, with a particular emphasis on this new book. Ge is a passionate advocate for the Confucian intellectual tradition, and sees it not just as valuable in itself but also as a kind of binding agent for Chinese culture — a project of which Perdue is extremely skeptical. He thinks such a return to the past is necessarily Han-centric and therefore manifestly inadequate to the multicultural society that China has become. 

Yet Perdue sees the need for some kind of project of cultural and moral renewal — that is, he fully understands why Ge writes as he does. He notes that 

In 1895, China suffered a crushing military defeat at the hands of the Japanese and then came under assault from its own intellectuals, who were heavily influenced by the Western thought that had arrived with the conquerors. The result was an almost complete rejection of the Confucian tradition over the next ninety years. Ge ends his Intellectual History of China in 1895, the year in which Yan Fu, the famous translator of Darwinist thought, wrote of the extreme ‘nervous anxiety’ afflicting intellectuals of his time. Ge does not follow the story into the next two decades, when genuine faith in the classical tradition almost completely collapsed. As he writes, by 1895, the loss of territory, cultural confidence, unity and common historical identity had ‘undermined the integrity of the classical tradition’. The abolition of the examination system in 1905 and the collapse of the dynasty in 1911 sealed its fate. But China had faced foreign invasion and cultural challenges before without the collapse of the Confucian tradition. 

Other forces have rushed in to fill the gap created by the collapse of the Confucian system, Maoist Marxism above all — but not only that: “After Mao, Chinese endorsed cowboy capitalism of the most corrupt, environmentally destructive kind. Like all of us, they struggle to restrain capitalist greed with moral or legal norms; many of them, amazingly enough, have turned to Christianity for answers, but others search for guidance in Buddhism, Daoism, popular cults, and even Confucius.”

Perdue himself is appalled by “cowboy capitalism,” appalled by the rise of Christianity in China, appalled by Ge’s desire for a return to Classical Chinese thought. Indeed, he seems appalled by every option except China coming to embody the worldview of — well, of the LRB and its readers, I guess. “Is an alternative intellectual history of China possible? If so, it would not seek to define the unity of Chinese civilisation, but to celebrate its multiplicity. It would look to centrifugal forces, marginal figures and frontier contacts as sources of innovation, not threats to order. It would include women, non-Han peoples and non-elite traditions without trying to co-opt them into an orthodoxy.” (I’m not saying that these would be bad things, but it’s noteworthy that, while lamenting the influence in China of Western models of political economy, Perdue can only imagine a good future for China that’s based on a Western leftist cultural frame. As though the possibilities for China’s future involve choosing items from a buffet laid out by the West.) 

In my essay for The New Atlantis I invoked the fascinating work of Tongdong Bai — especially his Against Political Equality — and speculated that there might be some interest within the CCP in renewing Confucianism or other elements of the classical tradition. Perdue thinks this is happening, at least on the level of public declaration: “The Chinese state now invokes the once despised Confucius as the bedrock of native values.” But Tongdong Bai isn’t so sure, as he explained in an email to me that he has graciously given me permission to quote from: 

I don’t think CCP is that friendly to Confucianism…. My interest in Confucian political philosophy is not directly related to the mainland Chinese government and policies. Although my dear friend Daniel Bell sometimes links his proposal to the present Chinese regime, I am fundamentally different from him in this regard. What I am proposing is normative, and I don’t think the contemporary Chinese regime offers any real-world exemplification of my proposals. 

So what do I — again, a novice, if a fascinated one — make of all this?  A few things: 

  • There is a kind of crisis of confidence among China’s intellectual elite, a sense that the power of their state is growing faster than the moral order of their culture; 
  • For many, capitalism (whether of the cowboy variety or any other) is inadequate to provide the necessary moral order; 
  • For those many, certain religious traditions — whether the embodied-in-religious-practice forms of the Three Ways (Confucianism, Daoism, Buddhism) or Christianity — offer themselves as alternatives; 
  • None of these alternatives is yet strong enough to displace or even really to affect the juggernaut of CCP capitalism; 
  • The hopes I articulate in my essay for a renewal of Daoism seem unlikely, in part because, as both Ge and Perdue explain, Daoism has been largely incorporated into Confucianism, or redefined by Confucianism; 
  • Christianity might therefore be a better hope for providing a moral order for Chinese culture; 
  • But Christianity, though I believe it to be true, has never developed the kind of coherent critique of technocracy that Daoism has long embodied; 
  • So if that new moral center emerges, it’s unlikely to offer significant resistance to the increasingly technocratic character of the Chinese regime and therefore the Chinese nation. 

the Great Crumping revisited

A surprising number of readers of my previous post have written out of concern for my state of mind, which is kind of them, but I think they have read as a cri de coeur what was meant as a simple summary of the facts. Not pleasant facts, I freely admit, but surely uncontroversial ones. Stating them so bluntly is just one element of my current period of reflection.

The primary reason I am not in despair is simply this: I know some history. I think we will probably see, in the coming decades, the dramatic reduction or elimination of humanities requirements and the closure of whole humanities departments in many American universities, but that will not mean the death of the humanities. Humane learning, literature, art, music have all thrived in places where they were altogether without institutional support. Indeed, I have suggested that it is in times of the breaking of institutions that poetry becomes as necessary as bread.

Similarly, while attendance at Episcopalian and other Anglican churches has been dropping at a steep rate for decades, and I expect will in my lifetime dwindle to nearly nothing, there will still be people worshipping with the Book of Common Prayer as long as … well, as long as there are people, I think. And if evangelicalism completely collapses as a movement — for what it’s worth, I think it already has — that will simply mean a return to an earlier state of affairs. The various flagship institutions of American evangelicalism are (in their current form at least) about as old as I am. The collapse I speak of is, or will be, simply a return to a status quo ante bellum, the bellum in question being World War II, more or less. And goodness, it’s not as if even the Great Awakening had the kind of impact on its culture, all things demographically considered, as one might suspect from its name and from its place in historians’ imaginations.

This doesn’t mean I don’t regret the collapse of the institutions that have helped to sustain me throughout my adult life. I do, very much. And I will do my best to help them survive, if in somewhat constrained and diminished form. But Put not your trust in institutions, as no wise man has ever quite said, even as you work to sustain them. It’s what (or Who) stands behind those institutions and gives them their purpose that I believe in and ultimately trust.

The Great Crumping is going on all around me. But if there’s one thing that as a Christian and a student of history I know, it’s this: Crump happens.

as essential as bread

People often talk about the death of the humanities, or hard times for the humanities, but when they do what they really mean is the death of or hard times for humanities departments in universities. But humanities departments in universities are not the humanities. There’s a really interesting passage in Leszek Kołakowski’s book Metaphysical Horror in which he discusses the many times that philosophers have made a “farewell to philosophy,” have declared philosophy dead or finished. Kołakowski says that when all of the “issues that once formed the kernel of philosophical reflection” have been dismissed by professional philosophers, that doesn’t actually silence the ancient questions that once defined philosophy.

But such things, although we may shunt them aside, ban them from acceptable discourse and declare them shameful, do not simply go away, for they are an ineradicable part of culture. Either they survive, temporarily silent, in the underground of civilization, or they find an outlet in distorted forms of expression.… Excommunications do not necessarily kill. Our sensibility to the traditional worries of philosophy has not weathered away; it survives subcutaneously, as it were, ready to reveal its presence at the slightest accidental provocation.

I believe this to be true, and therefore, while I am concerned by the likely fate of our universities and by the ways that so many professors in the humanities have been stupidly complicit in their own demise, I don’t worry about the death of the humanities. Though I don’t agree with Philip Larkin’s view that people will eventually stop worshipping in churches, I think he’s right when he says, near the end of "Church Going," that "someone will forever be surprising / A hunger in himself to be more serious," and when that happens such a person will gravitate towards all the great works of the past — and of the present too, though that will feel less necessary — even though the whole of society heaps scorn on all that our ancestors have done.

Here’s how we’ll know that things have gotten really bad in our society: People will start turning to Homer and Dante and Bach and Mozart. Czeslaw Milosz — like Kołakowski, a Pole, perhaps not a trivial correspondence — wrote that “when an entire community is struck by misfortune, for instance, the Nazi occupation of Poland, the ‘schism between the poet and the great human family’ disappears and poetry becomes as essential as bread.”

I think often, these days, of Emily St. John Mandel’s haunting novel Station Eleven, in which the great majority of human beings have been killed by a deadly plague, and those who remain live at a near-subsistence level. Even so, some musicians and actors have banded together to form an itinerant troupe, the Travelling Symphony.

The Symphony performed music — classical, jazz, orchestral arrangements of pre-collapse pop songs — and Shakespeare. They’d performed more modern plays sometimes in the first few years, but what was startling, what no one would have anticipated, was that audiences seemed to prefer Shakespeare to their other theatrical offerings.

“People want what was best about the world,” Dieter said.

on misunderstanding critical theory

Recently there’s been a lot of talk among conservatives about “critical theory,” and it’s been puzzling me. So finally I looked into the matter and think there’s some confusion that needs to be sorted out.

The person who has been leading the charge in this identification and denunciation is James Lindsay, of grievance studies hoax fame, and he has helped to generate a whole discourse about critical theory, much of which you can find at Areo Magazine. If you look at the essays there, you’ll see some that identify critical theory quite closely with the works of Theodor Adorno and Max Horkheimer — the leading members of the so-called Frankfurt School — but then others, who clearly think that they’re talking about the same phenomenon, lump Adorno and Horkheimer together with thinkers who differ from them quite dramatically, like Heidegger and Maurice Merleau-Ponty. Lindsay himself uses the term “critical theory” in extraordinarily flexible ways, sometimes quite narrowly and sometimes expansively. It can be hard to tell in any given sentence of his what the intended range of reference is.

To someone like me who has been studying and teaching and writing about this stuff for thirty years, the whole discourse is pretty disorienting because, frankly, so much of it is just wrong. It’s like listening to people talking about a “Harvard school” of political theory that features John Rawls and Robert Nozick; or a “California school” of governance to which both Ronald Reagan and Gavin Newsom belong.

However, the folks who write for Areo didn’t arrive at this confusion all by themselves. It’s endemic to the humanistic disciplines, in which “theory” can be used in many ways, some of which involve the acts of social and cultural and literary “criticism” — which of course is also an ambiguous word, since it can denote close attentiveness or a negative view of something. I am an Auden critic, but that doesn’t mean I am critical of Auden. All these things get mixed up together, and have done so for a long time. Decades ago, when I started teaching a class on these themes at Wheaton College, the class was called “Critical Theory,” but what it was really about was “Literary Theory.” I asked for the name of the course to be changed because I thought that the phrase “critical theory” should be reserved for the Frankfurt School tradition, but several of my colleagues were puzzled by this request, thinking that “critical theory” and “literary theory” were functionally synonymous terms. I seem to recall one saying that the existing description was better because the class was really about literary criticism rather than literature as such. Theory of criticism = critical theory.

So no wonder Lindsay and his colleagues get confused. But let’s try to straighten things out a bit.

In the broadest sense, literary theory and cultural theory are academic disciplines based on the conviction that the ways we think about our humanistic subjects are not self-evidently correct and require investigation, reflection, and in some cases correction. This impulse arises in part from the experience of teaching, in which we discover that our students tend to do things with literary and historical texts — for instance, decide whether they like a book or a historical account on the basis of whether or not they “relate” to its most prominent characters — that we would prefer them not to do. But also, some 80 years ago there was a fight in America (it happened earlier, and rather differently, in the U.K.) to convince academic administrations that the study of literature was not simply impressionistic, like some higher book club, and that writing about literature was not merely belletristic. Rather, we’re doing serious, disciplined academic work over here! And to prove it, and then to teach our students, we’re developing a theory.

All this ferment is of course related to science envy: the need to reckon with the fear that science has a method and humanistic study does not. A theory at least approximates a method, and there arose considerable agita about whether there’s anything scientific (truly methodical) about what literary critics do. The most influential literary critic of the middle of the twentieth century, Northrop Frye, said in his landmark book Anatomy of Criticism that literature is not a science but literary criticism is, or should be. Not everyone agreed, but everyone did seem to think that literary criticism needed to give an account of itself, needed to specify and enumerate its procedures. In a famous essay, Steven Knapp and Walter Benn Michaels defined theory as “the attempt to govern interpretations of particular texts by appealing to an account of interpretation in general.” They didn’t think this could actually work, which is why they called their essay “Against Theory,” but even people who agreed that it didn’t work, like Stanley Fish, still acknowledged the necessity of “theory talk.”

So this theory talk — which started in Europe well before the likes of Northrop Frye came around — spawned a proliferation of schools, and not just in literary study but also in other humanistic disciplines, among which there was a great deal of overlap in terminology and approach. And, in relation to the arguments that James Lindsay makes, almost none of this was closely related to the Frankfurt School’s “critical theory.” Adorno and Horkheimer and friends had some influence, to be sure, but not nearly as much as, say, the structuralism that made its way into literary study via Claude Lévi-Strauss’s anthropology, or the various psychological theories that stemmed from the work of Freud and Jung. Then came post-structuralism, deconstruction, feminism, gender theory, body theory, postcolonial theory, ecocriticism — all of which were critical and theoretical but usually had only minimal overlap with the concerns of the Frankfurt School. (The figure I think most generative for Critical Race Theory, Frantz Fanon, was in no way connected to the Frankfurt School’s critical theory, as far as I can tell. All of his guiding lights were French.) Certainly each movement operated according to its own internal logic.

But there is among all of these a family resemblance, just not one in which the Frankfurt School has any kind of initiating role. All of these movements assume that (a) most of the time we don’t really know what we’re doing and (b) we’d rather not know, because if we did know why we do the things we do we might not like it. The French philosopher Paul Ricoeur famously wrote that all of these recent movements descend from the three great nineteenth- and early-twentieth-century “masters of suspicion”: Marx, Nietzsche, and Freud, the “destroyers” of the common illusions from which we derive so much of our placid self-satisfaction. So if you’re going to blame anyone for the corrosive skepticism of Critical Race Theory and the like, you’ll need to start well before the Frankfurt School.

And one more thing: as Ricoeur knew perfectly well, there were other great destroyers of illusions in the nineteenth century, perhaps chief among them Kierkegaard and Dostoevsky — but those two did it in the name of the Christian faith. Because no less than Marx, Nietzsche, and Freud, Kierkegaard and Dostoevsky knew that the human heart is deceitful above all things, and added that it is deceitful in ways that we cannot through our own efforts fix. Perhaps the chief problem with the masters of suspicion, and their heirs, is not that they are too suspicious but that they are not suspicious enough. Especially about themselves.

another case for the humanities

Samuel Johnson was impressed by the intelligent and ambitious structure of learning that Milton laid out in “Of Education,” but, as he explains in his Life of Milton, he thought the scheme gave too much attention to what Milton would have called “natural philosophy” and what we call “science.”

But the truth is that the knowledge of external nature, and the sciences which that knowledge requires or includes, are not the great or the frequent business of the human mind. Whether we provide for action or conversation, whether we wish to be useful or pleasing, the first requisite is the religious and moral knowledge of right and wrong; the next is an acquaintance with the history of mankind, and with those examples which may be said to embody truth and prove by events the reasonableness of opinions. Prudence and Justice are virtues and excellences of all times and of all places; we are perpetually moralists, but we are geometricians only by chance.

humans, humanity, humanism

At the outset of my book The Year of Our Lord 1943 I describe my narrative method using a cinematic metaphor: I compare it to the famous opening tracking shot of Orson Welles’s Touch of Evil. A couple of people have asked me whether a musical metaphor wouldn’t have been more appropriate, given that I explore different voices, and indeed at one point I considered employing one, but it seemed a little too obvious — and in any case, polyphony implies harmony, and I didn’t want to suggest that. Along these very lines, I’ve been wanting to say a little more about how I understand my own book, and I’m given an excuse by Chad Wellmon’s generous reflection on it, which you should read less as a review than as an illuminating and provocative essay about the uses and limitations of the language of humanism and the human. Please read it and then come back!


Okay. As I was saying, I have some thoughts about what my story adds up to — which may sound like an odd thing to say, but the book really is a story rather than an argument, and while I don’t think my views about what the story says are necessarily better than anyone else’s, I do have views, and I want to describe some of them here. I’ve also done that in two long interviews about the book, one with Wen Stephenson and one with Robert Kehoe. As I have explained in those interviews, the book began when I noticed that in January of 1943 three thinkers — Jacques Maritain, C. S. Lewis, and W. H. Auden — were all writing with great intensity about education, which struck me as a strange thing to be doing in the middle of a war. Later, I saw that T. S. Eliot and Simone Weil were working along similar lines, and realized that I had five figures to write about, not just three. And their ideas, I discovered, went in and out of sync with one another in fascinating ways — ways that would be lost if I wrote a conventional academic sort of treatise in which I devoted one chapter to each author and sandwiched those between an Introduction and a Conclusion. I would have to write a braided narrative in order to pick up properly on the moments of synchronization and the moments of asynchronization, or, to pick up that rejected musical metaphor, moments of harmony and moments of dissonance.

(By the way, getting the braiding right was the chief challenge I faced in writing this book, and I don’t know what it means that almost no one reviewing the book has commented on how it’s structured — Philip Jenkins being an extremely gratifying exception to that rule. I am going to assume that people haven’t talked about the narrative structure because I handle it so elegantly.)

Among my five protagonists, the one who is most consistently out of sync with the others is Simone Weil. She is the odd person out in several ways — the only woman, the only Jew, the youngest of the five — but in any group of any kind Weil would be the odd person out, because few minds have ever been as distinctive and original as hers. Again and again she offers philosophical and historical arguments that set her quite apart from the analytical frameworks of the other figures. (None of the others could have seen a causal chain leading from the Catholic campaign against the Cathars in 12th-century Languedoc to 20th-century National Socialism.) Looking at my book now, I find myself thinking that Weil is the central figure, the one indispensable figure, in it, and the one who, despite her intellectual eccentricities and even perversions, has the most to say to us now.

When in his essay Chad sets Karl Barth against the five figures in my book, I think he misses this; I think he attributes to them more unity of purpose and even vocabulary than they had, and especially fails to note just how radically strange Weil’s ideas are. Also, he attributes to all five of them a nostalgia that I think is actually characteristic only of the three older figures: Auden and Weil understand with absolute clarity that neither previous forms of humanism nor Christendom can be revived, and that the attempts to go back to either are misbegotten. Consider, to take just one example among many possible ones, Auden’s review of Charles Norris Cochrane’s Christianity and Classical Culture:

Our period is not so unlike the age of Augustine: the planned society, caesarism of thugs or bureaucracies, paideia, scientia, religious persecution, are all with us. Nor is there even lacking the possibility of a new Constantinism; letters have already begun to appear in the press, recommending religious instruction in schools as a cure for juvenile delinquency; Mr. Cochrane’s terrifying description of the “Christian” empire under Theodosius should discourage such hopes of using Christianity as a spiritual benzedrine for the earthly city.

“Spiritual benzedrine for the earthly city” is a great phrase. (By the way, I have written about the importance of Cochrane’s book here and about benzedrine here. Just FYI.)

Anyway — and this continues my response to Chad — my five protagonists also differ from one another in their use of the term “humanism.” Perhaps only Maritain uses it in the way that Chad critiques in his essay; Auden uses the term rarely and Lewis not at all (except to describe the humanist movement of the early modern period). I describe these variations at considerable length in my book, and yet if Chad sees all my protagonists as standing under this humanist umbrella, it’s largely my fault, because despite all my reservations I use the word myself — and even allowed it a place in my subtitle.

I say “allowed” because the subtitle I submitted was: “Christian Intellectuals and Total War.” But when my editor, Cynthia Read, changed it to “Christian Humanism in an Age of Crisis” I didn’t contest the change. Partly that was because I know from experience how insistent editors can be about titles and subtitles, and at least my title wasn’t being questioned; partly because I wasn’t sure how widely understood the term “total war” is; partly because my book isn’t about “Christian intellectuals” in general, and I didn’t want to give a false impression on that count; and partly because I think “humanism” encompasses better than any other term I can think of the general interests that bind my five protagonists, that make the story I tell a coherent one. So I remain ambivalent about that subtitle, and especially about the term “humanism,” but don’t really know what the alternative would be.

I think there are certain points that all five of my protagonists would have agreed upon: That they were living in a moment when it was important to stress that there are certain things that all people have in common, regardless of race or nationality; that the great authoritarian and totalitarian movements of their time tended to obscure those great commonalities; that there is such a thing as “human flourishing” (though none of them would have used that term); that such flourishing happens neither when people are reduced to faceless units in a collective nor when they conceive of themselves as atomistic “individuals”; that Christianity provides the best account of what a flourishing human person is and how such flourishing may be achieved. Do we call that “Christian humanism”? Can the term be retrieved and redeployed to good effect? (Maybe?)

Would Karl Barth have dissented from any of that? I don’t think so. He would have differed from my protagonists — as they differed among themselves — about what public language is adequate to the expression of these ideas. More substantively, I think, Barth might have questioned my protagonists’ belief in the role that the study of humane letters, and of the liberal arts, might play in educating people in a rich and deep account of, to borrow a phrase from Bruce Cockburn, “what humans can be.” In any case he, more than any of them, would have made a place in theological reflection for Mozart, who “heard, and causes those who have ears to hear, even today, what we shall not see until the end of time — the whole context of providence.”

Karl Barth

the imminent collapse of an empire

Writers generally don’t get to choose the titles of their pieces, but the confusion in the title and subtitle of this report by Alexandra Kralick — Are we talking about sex or gender? I mean, it’s not like bones could tell you anything about gender — is reflected in the report itself. Sometimes it’s about “the nature of biological sex”; at other times it’s about the false assumptions that arise from gender stereotypes. Kralick weaves back and forth between the two in unhelpful ways.

On the specific question of whether sex is binary, and the contexts in which that matters, if you want clarity you’d do well to read this essay. But for the moment I’m interested in something else.

There’s a passing comment in Kralick’s essay that caught my attention: “The perception of a hard-and-fast separation between the sexes started to disintegrate during the second wave of feminism in the 1970s and ’80s.” The phrase “second-wave feminism” has been used in various and inconsistent ways, but it is typically associated with “difference feminism,” an emphasis on “women’s ways of knowing” being different from those of men. And in that sense it’s better to say that “the perception of a hard-and-fast separation between the sexes started to disintegrate” as a result of the critique of second-wave feminism as being too “essentialist” in its modeling of sexuality and gender. The most influential figure in that critique was Judith Butler, whose book Gender Trouble set in motion the discourse about gender as choice, gender as performance, gender as fluid and malleable, that we see embodied in Kralick’s essay.

So while I don’t think Kralick has the details of the history quite right, she’s definitely correct to suggest that scientists are having this conversation right now — or not so much having a conversation as making declarations ex cathedra — as a direct result of intellectual movements that began in humanities scholarship twenty-five years ago.

So for those of you who think that the humanities are marginal and irrelevant, put that in your mental pipe and contemplatively smoke it for a while.

Many years ago the great American poet Richard Wilbur wrote a poem called “Shame,” in which he imagined “a cramped little state with no foreign policy, / Save to be thought inoffensive.”

Sheep are the national product. The faint inscription
Over the city gates may perhaps be rendered,
“I’m afraid you won’t find much of interest here.”

The people of this nation could not be more overt in their humility, their irrelevance, their powerlessness. But …

Their complete negligence is reserved, however,
For the hoped-for invasion, at which time the happy people
(Sniggering, ruddily naked, and shamelessly drunk)
Will stun the foe by their overwhelming submission,
Corrupt the generals, infiltrate the staff,
Usurp the throne, proclaim themselves to be sun-gods,
And bring about the collapse of the whole empire.

Hi there scientists. It’s us.

reasons for decline

Alex Reid:

From a national perspective, the number of people earning communications degrees (which was negligible in the heyday of English majors 50-60 years ago), surpassed the number getting English degrees around 20 years ago. Since then Communications has held a fairly steady share of graduates as the college population grew, while English has lost its share and in recent years even shrank in total number, as this NCES table records. In short, students voted with their feet and, for the most part, they aren’t interested in the curricular experience English has to offer (i.e. read books, talk about books, write essays about books). 

Scott Alexander:

Peterson is very conscious of his role as just another backwater stop on the railroad line of Western Culture. His favorite citations are Jung and Nietzsche, but he also likes name-dropping Dostoevsky, Plato, Solzhenitsyn, Milton, and Goethe. He interprets all of them as part of this grand project of determining how to live well, how to deal with the misery of existence and transmute it into something holy.

And on the one hand, of course they are. This is what every humanities scholar has been saying for centuries when asked to defend their intellectual turf. “The arts and humanities are there to teach you the meaning of life and how to live.” On the other hand, I’ve been in humanities classes. Dozens of them, really. They were never about that. They were about “explain how the depiction of whaling in Moby Dick sheds light on the economic transformations of the 19th century, giving three examples from the text. Ten pages, single spaced.” 

So maybe — just maybe — it’s not “read books, talk about books, write essays about books” that’s the problem. 

(Cross-posted at Text Patterns) 

speaking truth to the professoriate

I’m probably going to write more about Scott Alexander’s take on Jordan Peterson, but for now I just want to note that he’s absolutely right to say that Peterson is playing a role that (a) most of us who teach the humanities say we already fill but (b) few of us care about at all:

Peterson is very conscious of his role as just another backwater stop on the railroad line of Western Culture. His favorite citations are Jung and Nietzsche, but he also likes name-dropping Dostoevsky, Plato, Solzhenitsyn, Milton, and Goethe. He interprets all of them as part of this grand project of determining how to live well, how to deal with the misery of existence and transmute it into something holy.

And on the one hand, of course they are. This is what every humanities scholar has been saying for centuries when asked to defend their intellectual turf. “The arts and humanities are there to teach you the meaning of life and how to live.” On the other hand, I’ve been in humanities classes. Dozens of them, really. They were never about that. They were about “explain how the depiction of whaling in Moby Dick sheds light on the economic transformations of the 19th century, giving three examples from the text. Ten pages, single spaced.” And maybe this isn’t totally disconnected from the question of how to live. Maybe being able to understand this kind of thing is a necessary part of being able to get anything out of the books at all.

But just like all the other cliches, somehow Peterson does this better than anyone else. When he talks about the Great Works, you understand, on a deep level, that they really are about how to live. You feel grateful and even humbled to be the recipient of several thousand years of brilliant minds working on this problem and writing down their results. You understand why this is all such a Big Deal.

“Shakespeareomics”

Towards the end of last year I saw Benedict Cumberbatch playing Hamlet and then Justin Kurzel’s fantastic film version of Macbeth. In chatting about the performances with literary friends afterwards, I was struck by the similarities in research methodology employed in the humanities and science. Literary scholars dissect texts seeking clues which they reinforce with meta-data coming from biographical details of the author and historical facts about contemporary knowledge and beliefs.

“Shakespeareomics” – how scientists are unlocking the secrets of the Bard. Ah. So that’s what we do.

in which I sum up my posts on the recent controversies in academia

I have been trying for a while now, and in multiple locations, to articulate an argument about recent modes of student disaffection in American universities. I think there is a bright, strong thread linking the “trigger warning” debates of last year with the student protests of this year. In an ideal world I’d turn these thoughts into a short book, or at least a very long article, but for now I’m just going to have to link the posts together into a virtual unity.

I began by discussing the way the upbringing of today’s students may have encouraged them to think that the core function of adults, including their teachers and university administrators, is to protect them from discomfort.

I then argued that when these expectations are thwarted, or seem to be thwarted, students can become frustrated very quickly if they do not have good reason to trust their teachers; this is a primary cause of the demand for trigger warnings.

And that mistrust is exacerbated by the fact that, in general, American universities do not present themselves as places where one goes to seek wisdom, but as places where one goes to get credentials for future career success — a message students have received very clearly.

So when the universities seem not to be living up to their neoliberal promises, angry students don’t think of this as a situation that calls for political protests of the Sixties variety; rather, they are consumers upset about the product they have purchased, so they bypass the lower-level staff and complain to the managers.

And the managers (i.e. administrators) respond the way managers always respond when the customers complain.

But this is not an adequate response. Administrators and professors alike need to recall that one of their key tasks is to organize the university as a kind of mediating or transitional space between the Home and the Wide World that encourages students to develop a genuine public individuality.

This developmental process is not and cannot be perfectly safe: many of students’ core beliefs about self and world will come under challenge. But it can be done in a healthy way, as long as fears are properly acknowledged and dealt with; however, to return to an earlier theme, fear of harm can only be overcome when students have good reason to trust those who teach them.

As long as fear is greater than trust, it will be extremely difficult, if not impossible, to convince students that disagreement about foundational social and moral issues is not only acceptable but invaluable to the individual and to society alike. But to insist on this truth is the sine qua non of the current academic moment.

For this task, this insistence that there is something more and better than policing disagreement and building walls of separation between us and those who don’t see things our way, the humanities are invaluable: but they must recover some of their old moral robustness and commitment to the sovereign virtue of compassion.

If we want to get past this impasse of hostility and suspicion, we must remind ourselves, and then teach our students, that together we can travel better paths than that of neoliberal contractualism, which leads inevitably to code fetishism. We need not be such Baconian rationalists, such Weberian bureaucrats; and if we insist on living like that, if we forget that “there’s got to be a better way for people to live,” then all we have to look forward to is the academic equivalent of the shootout at the end of High Noon. But here in the real world there’s no way to tell who might win — if anyone does.

the last word on the humanities

The late Robert Nisbet used to tell a story about the academic scientist who was angrily accosted by his humanist colleague for “speaking against the humanities” at the previous day’s faculty meeting. Au contraire, said the other; he was doing nothing of the kind. “I love the humanities! I would die for the humanities! All I asked was — what the hell are the humanities?”

– “Defining the Humanities Up” by Wilfred M. McClay | Articles | First Things

But persistent or not, the myth of the unemployed humanities major is just that: a myth, and an easily disproven one at that. Georgetown University’s Center on Education and the Workforce has been tracking differences in the employment of graduates from various disciplines for years, demonstrating that all graduates see spikes and troughs in their employment prospects with the changing economy. And AAC&U’s employer surveys confirm, year after year, that the skills employers value most in the new graduates they hire are not technical, job-specific skills, but written and oral communication, problem solving, and critical thinking—exactly the sort of “soft skills” humanities majors tend to excel in.

Pax Scientia: Thanks, But I’ll Pass

Armand Marie Leroi is an evolutionary biologist — and also a scientific imperialist. No, that’s not an insult: it’s his own account of the matter.

Now, to be sure, Leroi says that in the conflict between science and the humanities “Hard words such as ‘imperialism,’ ‘scientism,’ and ‘vaulting ambition’ will be flung about,” because such words belong to “the vocabulary of anti-science.” But in the very same paragraph he claims that the only choices for the humanities are to pursue “a new kulturkampf” that they cannot win — because they are “weakened” by internal conflicts — or to “gratefully accept the peace imposed by science.” The really interesting word there is “imposed”: science is not offering peace, it is imposing it. Looks like for us humanists it’s Hobson’s choice.

And lest we think that that talk of “imposing” was an infelicitous turn of phrase, Leroi immediately extends it: “Under the Pax Scientia criticism will continue, but be tamed.” The imperium of science, or perhaps I should write Science, is today’s successor to that of Rome.

In Virgil’s Aeneid, the hero Aeneas descends into the underworld and meets the ghost of his father, who prophesies to him about the future of Rome. The “arts” of the Romans will be pacique imponere morem, parcere subiectis, et debellare superbos — as Allen Mandelbaum renders it, “to teach the ways of peace to those you conquer, / to spare defeated peoples, tame the proud.” The language of taming in Leroi’s essay seems scarcely accidental.

So imperialism it is, then. I suppose I am supposed to be thankful that Leroi, in his great magnanimity, allows a barbarian, or perhaps a slave, like me to continue to do my work under the minatory tutelage of Science — especially since the alternative, I guess, is to end up like Spartacus and his fellow rebels. (That anti-Roman kulturkampf wasn’t such a great idea, guys.) After all, to offer any resistance whatsoever to the new imperium is to be “anti-science.”

[image: spartacuscross]

the last humanist

Now, to Leroi’s credit, he understands, at least in a rudimentary way, that the kind of criticism often practiced by humanists differs pretty strongly from what can be revealed by running the numbers: “When Edmund Wilson tells us that Sophocles’s ‘Philoctetes’ is a parable on the association between deformity and genius; or when Arthur Danto says that Mark Rothko’s ‘Untitled (1960)’ is simply about beauty, then we are, it seems, in a realm of understanding where numbers, and the algorithms that produce them, have no dominion.” (Though even here he seems to forget that algorithms don’t emerge ex nihilo but are written by people.)

But Leroi doesn’t seem to grasp that much criticism — and much of the criticism that has mattered the most — isn’t concerned with assigning a one-phrase summary of the “meaning” of an entire work of art, but is rather intensely focused on the details that are too small and too distinctive for algorithmic attention. When Keats writes, “Now more than ever seems it rich to die,” what does “rich” connote? Might it be ironic? (After all, the ironic use of “rich” — “Oh, that’s rich” — goes back to the seventeenth century.) No algorithm can ever tell, because algorithms aggregate, and the question here is about a single unrepeatable instance of a word. Nor can any aggregated information tell us anything about the torn cloth at the elbow of the disciple in Caravaggio’s Supper at Emmaus, or the bizarre alternations of the madly driven rhythms and ethereal voices in the Confutatis of Mozart’s Requiem.

All this is not to say that “distant reading” isn’t valuable — it is, and I have defended the work of algorithmically minded digital humanists against know-nothing critiques — but rather that it’s not the only kind of humanistic work that’s valuable, and that critics who attend to the specific and unrepeatable are doing, and will continue to do, intellectually serious work.

Maybe they’ll be paid for that work in the future; maybe not. People who care about such things will continue to attend to it, whether their overlords like it or not. An essay like Leroi’s is written by people who have access to money that humanists can’t dream of, who expect to have access to that money forever, and who think it gives them imperial powers.

In a famous essay George Orwell wrote about the headmaster of his old prep school who would say to charity students like Orwell, “You are living on my bounty!” — that seems to be Leroi’s attitude toward humanists. But sorry, I’m not accepting the terms of peace Leroi would dictate — and I don’t think he can impose them after all. The war between Apollo and Hermes will continue.

And one more thing: that Roman imperium, that Pax Romana? They thought that would last forever too.

Computational Humanities and Critical Tact

Natalie Kane writes,

When we talk about algorithms, we never talk about the person, we talk about the systems they operate on, the systems they facilitate, and then the eventual consequences that fall when they supposedly ‘malfunction’. Throwing this out into twitter, a friend and talented creative technologist Dan Williams succinctly reminded me that we should always be replacing the word ‘algorithms’ with ‘a set of instructions written by someone.’ They are made, and written, by human beings, just like all technology is (a general statement, I know, but it all began with us). By removing that element when talking about them we cease to have any understanding of the culture and social systems behind it, which arguably are the reasons why algorithms are having such a huge impact on our lives. If we’re going to have constructive conversations about the cultural and societal impact of algorithmically mediated culture, we have to always remember that there are humans in it, and not just at the other end of the black box feeling the vibrations. As Matthew Plummer-Fernandez pointed out in a panel on Data Doubles this year, the problems with confronting complex computation systems demands a socio-technical, not a purely technical solution.

This reminds me of a recent post I wrote about Andrew Piper’s current project and other projects in the digital (or computational) humanities. I’ve been thinking a lot about how to evaluate projects such as Andrew’s, and Kane’s post made me realize the extent to which I default to thinking about outcomes, results, answers — and not so much about the originating questions, the “set of instructions written by someone.”

Twenty years ago, Mark Edmundson commented that “Paul de Man is a more difficult critic to read and understand than a responsive, rather impressionistic essayist like Virginia Woolf. But one can teach intelligent students to do what de Man does; it is probably impossible to teach anyone to respond like Woolf if he has little aptitude for it.” Edmundson goes on to ask, “And how could we sustain the academic study of literature if we were compelled to say that a central aspect of criticism is probably not teachable?” — but while that’s an important question it’s not the one I want to pursue right now.

Rather, I’d like to focus on the indefinable skill he’s referring to here, which is, I think, what some people used to call critical tact — the knack for perceiving what themes or questions will, upon pursuit, yield fruitful thoughts. It seems to me that this tact is no less valuable in computational humanities work than in any other kind of critical work, and that we would do well to think about what the best questions look like, what shape or form they tend to take.

Edmundson is probably right to suggest that critical tact can’t be taught, but it can nevertheless be learned (no one is born with it). And maybe one way people in DH could learn it is by spending a lot of time thinking about and debating this: Who asks the best questions? Who writes the most provocative and fruitful instructions?

On Not Defending the Humanities

I don’t think that the humanities or the liberal arts can be defended, at least not in the sense that most people give to “defended.” Here’s why, starting with three texts on which I will build my explanation.

The English theologian Austin Farrer used to say that some Christian doctrines — he was thinking especially of the hypostatic union — cannot be defended, but can be homiletically expounded.

Similarly, in Whose Justice? Which Rationality? Alasdair MacIntyre writes,

In systematizing and ordering the truths they take themselves to have discovered, the adherents of a tradition may well assign a primary place in the structures of their theorizing to certain truths and treat them as first metaphysical or practical principles. But such principles will have had to vindicate themselves in the historical process of dialectical justification… Such first principles themselves, and indeed the whole body of theory of which they are a part, themselves will be understood to require justification. The kind of rational justification which they receive is at once dialectical and historical. They are justified insofar as in the history of this tradition they have, by surviving the process of dialectical questioning, vindicated themselves as superior to their historical predecessors.

And in The Abolition of Man C. S. Lewis writes,

Those who understand the spirit of the Tao and who have been led by that spirit can modify it in directions which that spirit itself demands. Only they can know what those directions are. The outsider knows nothing about the matter. His attempts at alteration, as we have seen, contradict themselves. So far from being able to harmonize discrepancies in its letter by penetration to its spirit, he merely snatches at some one precept, on which the accidents of time and place happen to have riveted his attention, and then rides it to death — for no reason that he can give. From within the Tao itself comes the only authority to modify the Tao. This is what Confucius meant when he said ‘With those who follow a different Way it is useless to take counsel’. This is why Aristotle said that only those who have been well brought up can usefully study ethics: to the corrupted man, the man who stands outside the Tao, the very starting point of this science is invisible. He may be hostile, but he cannot be critical: he does not know what is being discussed…. Outside the Tao there is no ground for criticizing either the Tao or anything else.

I think that these passages, read rightly, suggest a few things. First, that it’s probably impossible to defend the artes liberales, or the studia humanitatis, to people who are firmly outside their Tao — who simply do not acknowledge the value of the foundational commitments that have shaped the tradition. The person who relentlessly demands to know what kind of job a liberal-arts education will get him “may be hostile, but he cannot be critical.” And this is not a temporary or trivial impediment that can be maneuvered around; it’s an immoveable object.

But apologetics is not the only mode of suasion. In some cases the more appropriate rhetoric relies on narration and exposition: perhaps something as simple as telling the story of what we do. There are many places around the country where older models of the humanities are flourishing, even as those who have rejected this Tao are floundering — those older models having, as it were, vindicated themselves as superior to their historical successors, or would-be successors. But what happens in these institutions, in these classrooms, is simply invisible to people who question the value of the humanities. Rendering the invisible visible might be one of the best services those of us following those models could perform in “defense” of our practices.

I’m teaching a class called Philosophy Versus Literature and right now we’re working through Lucretius. Yesterday we talked about philosophy as therapy — drawing on Martha Nussbaum, among others — and the odd but, in the end, strong logic that leads Lucretius (following Epicurus) to believe that physics is first philosophy, that understanding the constitution of the world is the necessary first step towards being liberated from fear and unnecessary pain. Next time I’m going to tell the students about Stephen Greenblatt’s (very bad) book The Swerve, and ask them why a book about the early modern recovery of Lucretius became a bestseller and award-winner in 21st-century America. These conversations concern matters ancient and permanent and are also about as relevant as relevant can be, if that happens to be one of your criteria for educational value. Moreover, De Rerum Natura is a book that runs pretty strongly against the grain of all that my students believe and hope — but that deters neither them nor me from treating it with the utmost attentiveness.

I can’t defend the value of this kind of exercise in some abstract, characterless intellectual space; nor am I inclined to. It makes sense within the Tao; and if you want to see what kind of sense it makes you have to think and act within the Tao. You have to take it on as a living option, not a desiccated proposition. To those who doubt the value of what I do, I probably have little more to say than: Taste and see. (But I’ll use more words to say it.)

Two kinds of critics in the world….

I’m very interested in Andrew Piper’s new work on the “conversional novel”:

My approach consisted of creating measures that tried to identify the degree of lexical binariness within a given text across two different, yet related dimensions. Drawing on the archetype of narrative conversion in Augustine’s Confessions, my belief was that “conversion” was something performed through lexical change — profound personal transformation required new ways of speaking. So my first measure looks at the degree of difference between the language of the first and second halves of a text and the second looks at the relative difference within those halves to each other (how much more heterogenous one half is than another). As I show, this does very well at accounting for Augustine’s source text and it interestingly also highlights the ways in which novels appear to be far more binarily structured than autobiographies over the same time period. Counter to my initial assumptions, the ostensibly true narrative of a life exhibits a greater degree of narrative continuity than its fictional counterpart (even when we take into account factors such as point of view, length, time period, and gender).

Now — and here’s the really noteworthy part — Piper defines “conversional” largely in terms of lexical binaries. So a novel in which a religious conversion takes place — e.g., Heidi — is no more conversional than a novel whose pivot is not religious but rather involves a move from nature to culture or vice versa — e.g., White Fang.
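
To get a concrete feel for what such a measure might involve, here is a minimal sketch in Python — my own illustrative guess, not Piper’s actual code; the whitespace tokenizer and the cosine-distance metric are assumptions of mine — that compares the vocabulary of a text’s first and second halves:

    # A minimal sketch (not Piper's code) of one way to quantify how
    # lexically different a text's first half is from its second half.
    # Assumptions: plain-text input, whitespace tokenization, and cosine
    # distance between raw word-frequency vectors as the measure.
    from collections import Counter
    from math import sqrt

    def cosine_distance(a: Counter, b: Counter) -> float:
        """1 minus cosine similarity between two word-frequency vectors."""
        dot = sum(a[w] * b[w] for w in set(a) & set(b))
        norm_a = sqrt(sum(v * v for v in a.values()))
        norm_b = sqrt(sum(v * v for v in b.values()))
        if norm_a == 0 or norm_b == 0:
            return 1.0
        return 1.0 - dot / (norm_a * norm_b)

    def half_to_half_distance(text: str) -> float:
        """Split a text in the middle and compare the halves' vocabularies."""
        words = text.lower().split()
        mid = len(words) // 2
        return cosine_distance(Counter(words[:mid]), Counter(words[mid:]))

    # On a measure like this, a strongly "conversional" text would score
    # noticeably higher than one whose diction is continuous throughout.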

As I say, I’m interested, but I wonder whether Piper isn’t operating here at a level of generality too high to be genuinely useful. Computational humanities in this vein strikes me as a continuation — a very close continuation — of the Russian formalist tradition: Vladimir Propp’s Morphology of the Folktale — especially the section on the 31 “functions” of the folk tale — seems to be a clear predecessor here, though perhaps the more radical simplifications of Greimas’s semiotic square are an even more direct model.

Two thoughts come to my mind at this point:

  1. I would love to see what would happen if some intrepid computational humanists got to work on filtering some large existing corpus of texts through the categories posited by Propp, Greimas, et al. (a crude sketch of what that might look like follows this list). Were the ideas that emerged from that tradition arbitrary and factitious, or did they draw on genuine, though empirically unsupported, insights into how stories tend to work?
  2. A good many people these days emphasize the differences between traditional and computational humanists, but the linkage I have just traced between 20th-century formalists and Piper’s current work makes me wonder if the more important distinction isn’t Darwin’s opposition of lumpers and splitters — those who, like Piper in this project, are primarily interested in what links many works to one another versus those who prefer to emphasize the haecceity of an individual thing … which of course takes us back to the ancient universals/particulars debate, but still….
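
As a gesture toward the first item above, here is a deliberately crude sketch of what “filtering a corpus through Propp” might look like at its most naive. The marker words are hypothetical placeholders of my own; a serious attempt would need annotated data and trained classifiers rather than keyword lists:

    # A purely illustrative, deliberately naive sketch: score texts against
    # hand-picked (hypothetical) marker words for a few of Propp's 31
    # functions. Real work on this question would need annotated data and
    # trained classifiers, not keyword counting.
    from collections import Counter

    PROPP_MARKERS = {
        "interdiction": {"forbid", "forbidden", "warn", "warned"},
        "villainy": {"steal", "stolen", "kidnap", "kidnapped", "betray"},
        "return": {"return", "returned", "home", "homecoming"},
    }

    def propp_profile(text: str) -> Counter:
        """Count occurrences of each function's marker words in a text."""
        words = text.lower().split()
        return Counter({fn: sum(w in markers for w in words)
                        for fn, markers in PROPP_MARKERS.items()})

    # Applied across a large corpus, even a profile this crude would let one
    # ask whether Propp-like structures cluster in folk tales more than in,
    # say, realist novels.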

UPDATE: via Ted Underwood on Twitter, I see some people are already moving towards Propp-inspired ways of doing DH: see here and here.

The University of Chicago Press is pleased to announce the launch of History of Humanities, a new journal devoted to the historical and comparative study of the humanities. The first issue will be published in the spring of 2016.

History of Humanities, along with the newly formed Society for the History of the Humanities, takes as its subject the evolution of a wide variety of disciplines including archaeology, art history, historiography, linguistics, literary studies, musicology, philology, and media studies, tracing these fields from their earliest developments, through their formalization into university disciplines, and to the modern day. By exploring these subjects across time and civilizations and along with their socio-political and epistemic implications, the journal takes a critical look at the concept of humanities itself.

Chicago to Publish New Journal: History of Humanities. I’m quite interested in this journal and look forward to reading it, but NB: of the 49 (!) Editors and Associate Editors, there is only one scholar of religion — a professor of Islamic Intellectual History — and no one in biblical studies or theology. And yet those disciplines have had some role to play in the history of the humanities, I dare say.

No matter the text that he’s ostensibly engaged with, Mr. Keating, like Hamlet in Stéphane Mallarmé’s wonderful description, is forever “reading in the book of himself.” This is what Keating’s namesake John Keats (referencing Wordsworth) called the “egotistical sublime.” Recently, some pioneering work in neuroscience has begun to suggest what English teachers have long known: that the power of literature is the power of alterity, creating the possibility of encountering the other in a form not easily recuperable, not easily assimilable to the self. “Imaginative sympathy,” we used to call it. To read literature well is to be challenged, and to emerge changed.

But for Keating, it’s the text (like Frost’s poem) that is changed, not the reader. He does the same thing to the Whitman poem “O Me! O Life!” that he recites to his students. Used as the voiceover for a recent iPad ad, Mr. Keating’s pep talk quotes the opening and closing lines of the poem, silently eliding the middle: “Oh me! Oh life! / of the questions of these recurring, / Of the endless trains of the faithless, of cities fill’d with the foolish, /…/ What good amid these, O me, O life? // Answer. // That you are here—that life exists and identity, / That the powerful play goes on, and you may contribute a verse.” He’s quoting from Whitman, he says, but the first line he omits is telling: “Of myself forever reproaching myself, (for who more foolish than I, and who more faithless?).” Go back and add that line to the quotation and see how it alters the whole. For Keating—and one fears, examining the scant evidence the film provides, for his students—every poem is a Song of Myself. This, then, is what’s at stake in Keating’s misreadings—I’m not interested simply in catching a fictional teacher out in an error. But he misreads both Frost and Whitman in such a way that he avoids precisely that encounter with the other, finding in poetry only an echo of what he already knows—what he’s oft thought, but ne’er so well expressed.

Dead Poets Society Is a Terrible Defense of the Humanities. The funny thing is, I think the movie actually demonstrates some of what’s wrong with Keating’s approach, but I don’t think I’ve ever met a fan of the film who notices that.

my course on the “two cultures”

FOTB (Friends Of This Blog), I have a request for you. This fall I’m teaching a first-year seminar for incoming Honors College students, and our topic is the Two Cultures of the sciences and the humanities. We’ll begin by exploring the lecture by C. P. Snow that kicked off the whole debate — or rather, highlighted and intensified a debate that had already been going on for some time — and the key responses Snow generated (F. R. Leavis, Lionel Trilling, Loren Eiseley). We’ll also read the too-neglected book that raised many of the same issues in more forceful ways a few years before Snow: Jacob Bronowski’s Science and Human Values.

Then we’ll go back to try to understand the history of the controversy before moving forward to consider the forms it is taking today. Most of the essays I’ll assign may be found by checking out the “twocultures” tag of my Pinboard bookmarks, but we’ll also be taking a detour into science/religion issues by considering Stephen Jay Gould’s idea of non-overlapping magisteria and some of the responses to it.

What other readings should I consider? I am a bit concerned that I am presenting this whole debate as one conducted by white Western men — are there ways of approaching these questions by women or people from other parts of the world that might put the issues in a different light? Please make your recommendations in the comments below or on Twitter.

Thanks!

Reposted from Text Patterns

Four years after the original Nature paper was published, Nature News had sad tidings to convey: the latest flu outbreak had claimed an unexpected victim: Google Flu Trends. After reliably providing a swift and accurate account of flu outbreaks for several winters, the theory-free, data-rich model had lost its nose for where flu was going. Google’s model pointed to a severe outbreak but when the slow-and-steady data from the CDC arrived, they showed that Google’s estimates of the spread of flu-like illnesses were overstated by almost a factor of two.

The problem was that Google did not know – could not begin to know – what linked the search terms with the spread of flu. Google’s engineers weren’t trying to figure out what caused what. They were merely finding statistical patterns in the data. They cared about ­correlation rather than causation. This is common in big data analysis. Figuring out what causes what is hard (impossible, some say). Figuring out what is correlated with what is much cheaper and easier. That is why, according to Viktor Mayer-Schönberger and Kenneth Cukier’s book, Big Data, “causality won’t be discarded, but it is being knocked off its pedestal as the primary fountain of meaning”.

But a theory-free analysis of mere correlations is inevitably fragile. If you have no idea what is behind a correlation, you have no idea what might cause that correlation to break down. One explanation of the Flu Trends failure is that the news was full of scary stories about flu in December 2012 and that these stories provoked internet searches by people who were healthy. Another possible explanation is that Google’s own search algorithm moved the goalposts when it began automatically suggesting diagnoses when people entered medical symptoms.

Google Flu Trends will bounce back, recalibrated with fresh data – and rightly so. There are many reasons to be excited about the broader opportunities offered to us by the ease with which we can gather and analyse vast data sets. But unless we learn the lessons of this episode, we will find ourselves repeating it.

Big data: are we making a big mistake? – FT.com. A fantastic essay by Tim Harford. I have a feeling that again and again the big data firms will insist that the data can “speak for itself” — simply because data gathering is what they can do.

There should be a lesson here also for those who believe that Franco Moretti-style “distant reading” can transform the humanities. The success of that endeavor too will be determined not by text-mining power but by the incisiveness of the questions asked of the data and the shrewdness of the conclusions drawn from it. What T. S. Eliot said long ago — “The only method is to be very intelligent” — remains just as true in an age of Big Data as it was when Eliot uttered it.
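
To see in miniature why a theory-free correlation is so fragile, here is a toy simulation in Python — entirely my own construction, with made-up numbers, and no relation to Google’s actual model or Harford’s data — in which a predictor learns that searches track flu only because news coverage happened to track flu, and then overshoots badly when coverage spikes on its own:

    # A toy illustration (invented numbers, nothing to do with Google's
    # real model) of why a correlation-only predictor is fragile: it
    # learns that search volume tracks flu cases, but only because media
    # coverage happened to track flu; when coverage spikes without flu,
    # the prediction overshoots.
    import random

    random.seed(0)

    def searches_for_week(flu_cases: int, media_hype: float) -> int:
        """Searches driven partly by real illness, partly by news coverage."""
        return int(flu_cases * 2 + media_hype * 500 + random.gauss(0, 50))

    # "Training" seasons: media coverage roughly tracks actual flu levels.
    history = [(cases, searches_for_week(cases, media_hype=cases / 1000))
               for cases in range(500, 3000, 250)]

    # Fit a simple least-squares line: predicted_cases = a * searches + b.
    n = len(history)
    mean_c = sum(c for c, _ in history) / n
    mean_s = sum(s for _, s in history) / n
    a = (sum((s - mean_s) * (c - mean_c) for c, s in history)
         / sum((s - mean_s) ** 2 for _, s in history))
    b = mean_c - a * mean_s

    # A scary news cycle: huge coverage, but an ordinary flu season.
    searches = searches_for_week(flu_cases=800, media_hype=5.0)
    print(f"actual cases: 800, predicted: {a * searches + b:.0f}")  # roughly double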

Defenders of the humanities claim a special role in training citizens for a democratic society and often have deeply felt convictions about democratizing knowledge and including new voices. The mainstream of humanities research has, however, focused upon virtuoso scholarship, published in subscription publications to which only academics have access, and composed for small networks of specialists who write letters for tenure and promotion and who meet each other for lectures and professional gatherings. Students in the humanities remain, to a very large degree, subjects of a bureaucracy of information, where they have no independent voice and where they never move beyond achieving goals narrowly defined by others. The newly re-engineered sciences have reorganized themselves to give students a substantive voice in the development of knowledge and to become citizens in a republic of learning. The STEM disciplines certainly have not fully realized these lofty ideals but they are far ahead of most of their colleagues in the humanities and in Greek and Latin studies.

For Thackeray and his contemporaries, literature is a public matter, a matter to be lectured upon before large audiences, a matter to be given importance because of its impact upon morals and emotions. For the present-day academic critic, literature no longer is a public matter but rather is a professional matter, even more narrowly, a departmental matter. The study of literature has become a special and separate discipline — housed in colleges of arts and sciences along with other special and separate disciplines. The public has narrowed to a group of frequently recalcitrant students whose need for instruction in English composition — not in English literature — justifies the existence of the English department… . People who want to become English professors do so because, at one point in their lives, they found reading a story, poem, or play to be an emotionally rewarding experience. They somehow, someway were touched by what they read. Yet it is precisely this emotional response that the would-be professor must give up. Of course, the professor can and should have those feelings in private, but publicly, as a teacher or publisher, the professor must talk about the text in nonemotional, largely technical terms. No one ever won a National Endowment for the Humanities grant by weeping copiously for Little Nell, and no one will get tenure in a major department by sharing his powerful feelings about Housman’s Shropshire Lad with the full professors.

— Brian McRae, from Addison and Steele Are Dead (1991)

The other thing for which I am grateful to philosophy is that, at least in the world in which I first sought to make a name for myself, one was required to write clearly, concisely, and logically. Wittgenstein said that whatever can be said can be said clearly, and that became something of a mantra for my generation. At one time, the British journal Analysis sponsored regular competitions: some senior philosopher propounded a problem, which one was required to solve in 600 words or less, the winner receiving as a prize a year’s subscription to the magazine. Here is an example of the kind of problem, propounded by J. L. Austin, that engaged Analysis’s subscribers: “What kind of ‘if ’ is the ‘if ’ in ‘I can if I choose?’ ” (Hint: it cannot be the truth-conditional “if ” of material implication, as in, “If p, then q.”)

I tried answering all the problems, and never won a prize. But the exercise taught me how to write. The great virtues of clarity, concision, and coherence, insisted upon throughout the Anglo-American philosophical community, have immunized the profession against the stylistic barbarity of Continental philosophy, which, taken up as it has been since the early 1970s by the humanistic disciplines—by literary theory, anthropology, art history, and many others—has had a disastrous effect, especially on academic culture, severely limiting the ability of those with advanced education to contribute to the intellectual needs of our society. It is true that analytical philosophers, reinforced by the demands of their profession to work within their constricting horizons, have not directly served society by applying their tools to the densely knotted problems of men, to use Dewey’s term for where the energies of philosophy should be directed. At one point it became recognized that “clarity is not enough.” It is not enough. But the fact that it remains a stylistic imperative in most Anglo-American philosophy departments means that these virtues are being kept alive against the time when the humanities need to recover them.

— Arthur C. Danto (available to subscribers only, I think)

So Philology: to preserve, monitor, investigate, and augment our cultural inheritance, including the various material means by which it has been realized and transmitted. The scholar Aleida Assmann recently said of the philologian’s archive (as opposed to the critic’s and scholar’s canon) that it is populated by materials that “have lost their immediate addresses; they are decontextualized and disconnected from the former frames which had authorized them or determined their meaning. As part of the archive, they are open to new contexts and lend themselves to new interpretations.” So we may arrive at “counter-memories” and “history against the grain.” But in order to lie so open, they must be held at a level of attention that is consciously marked as useless and nonsignificant. There need be no perceived use or exchange value in the philological move to preserve beyond the act of preservation itself. Value is not only beyond present conception, it is understood that it may never again acquire perceived value. “Never again” is crucial. For the philologian, materials are preserved because their simple existence testifies that they once had value, though what that was we may not — may never — know. As the little girl in Wordsworth’s “We are Seven” understood, the dead are precious as such. If we are living human lives, they are not even dead.

Space colonies. That’s the latest thing you hear, from the heralds of the future. President Gingrich is going to set up a state on the moon. The Dutch company Mars One intends to establish a settlement on the Red Planet by 2023. We’re heading towards a “multi-planetary civilization,” says Elon Musk, the CEO of SpaceX. Our future lies in the stars, we’re even told.

As a species of megalomania, this is hard to top. As an image of technological salvation, it is more plausible than the one where we upload our brains onto our computers, surviving forever in a paradise of circuitry. But not a lot more plausible. The resources required to maintain a colony in space would be, well, astronomical. People would have to be kept alive, indefinitely, in incredibly inhospitable conditions. There may be planets with earthlike conditions, but the nearest ones we know about, as of now, are 20 light years away. That means that a round trip at 10 percent the speed of light, an inconceivable rate (it is several hundred times faster than anything we’ve yet achieved), would take 400 years.

But never mind the logistics. If we live long enough as a species, we might overcome them, or at least some of them: energy from fusion (which always seems to be about 50 years away) and so forth. Think about what life in a space colony would be like: a hermetically sealed, climate-controlled little nothing of a place. Refrigerated air, synthetic materials, and no exit. It would be like living in an airport. An airport in Antarctica. Forever. When I hear someone talking about space colonies, I think, that’s a person who has never studied the humanities. That’s a person who has never stopped to think about what it feels like to go through an average day—what life is about, what makes it worth living, what makes it endurable. A person blessed with a technological imagination and the absence of any other kind.

I had been interviewing her for the book I wrote about the Holocaust, and by the time I had the brief conversation I am about to tell you about, I knew of some of the things she had suffered, and I can tell you that they were the kind of things that strip all sentimentality away, that do not leave you with any mushy illusions about the value of Western culture, of “civilization” and its traditions. I just want you to know that before I tell you what she told me, which was this:

We had spent a long day together, talking about sad things — talking, in fact, about the erasure of a culture, the unmaking of a civilization, as total and final as what Vergil described in Book 2 of the Aeneid; there are, after all, as many Jews left today in the city this lady had lived in as there are Trojans in Troy. We were tired, night was falling. Because I didn’t want to leave her with sad thoughts that night, any more than I want to leave you with sad thoughts today, I tried to lead the conversation to a happier time.

“So what happened when the war was over?” I asked softly. “What was the first thing that happened, once things started to be normal again?”

The old lady, whose real name had disappeared in the war along with her parents, her house, and nearly everything else she had known, was now called Mrs. Begley. When I asked her this question Mrs. Begley looked at me; her weary expression had kindled, ever so slightly. “You know, it’s a funny thing,” she told me. “When the Germans first came, in ‘41, the first thing they did was close the theaters.”

“The theaters?,” I echoed, a bit confused, not dreaming where this could be leading. I didn’t know which theaters she meant; I thought, briefly, of the great Beaux Arts Pantheon of an opera house in Lwów, with its Muse-drawn chariots and gilded victories, a mirage of civilization, the 19th century’s dream of itself. She had told me, once, that she had seen Carmen there, when she was a newlywed. But she wasn’t talking, now, about operas in Lwów, or about the 1930s; she was talking about theaters in Kraków, where she and her husband and child ended up once the war was over.

“Yes,” she said, sharply, as if it ought to have been obvious to me that the first thing you’d do, if you were about to end a civilization, would be to put an end to playgoing, “the theaters. The first thing they did was close the theaters. And I’ll tell you something, because I remember it quite clearly: the first thing that happened, after the war was over and things got a little normal — the first thing was that the actors and theater people who were still alive got together and put on, in Polish, a production of Sophocles’ Antigone.”

— UC Berkeley Classics Department: 2009 Commencement Address by Daniel Mendelsohn. Please, please read the whole thing, if you have any interest in what makes the humanities human.

Well, I’m basically a literary and philosophical humanist myself, not a journalist or scholar or expert of any kind, so I do personally regret that people like me don’t have and never again will have the cultural authority that the New York intellectuals had. But history has moved on, and there’s still a place, after all, for us humanists to practice the honorable activity of applying the really matchless moral resources of literature and the philosophical tradition to criticizing society and culture. Still, the work on the front lines now needs to be done by others, the investigative journalists and maverick scholars, people who can do deep reporting or work in the archives. Those are activities that classical public intellectuals—Bourne, Russell, Camus, Sartre, Silone, Nicola Chiaromonte—didn’t have the time or the temperament for. So even though I personally will never be a Glenn Greenwald or a Noam Chomsky, I’m supremely grateful to them. They’re doing what I think needs above all to be done. Cultivating and expounding the humanities will always be essential to the moral health of society. But politically speaking, the literary intellectual is now a kind of auxiliary.

George Scialabba. Someday I’ll find time to explain why I don’t agree.

In English we speak about science in the singular, but both French and German wisely retain the plural. The enterprises that we lump together are remarkably various in their methods, and also in the extent of their successes. The achievements of molecular engineering or of measurements derived from quantum theory do not hold across all of biology, or chemistry, or even physics. Geophysicists struggle to arrive at precise predictions of the risks of earthquakes in particular localities and regions. The difficulties of intervention and prediction are even more vivid in the case of contemporary climate science: although it should be uncontroversial that the Earth’s mean temperature is increasing, and that the warming trend is caused by human activities, and that a lower bound for the rise in temperature by 2200 (even if immediate action is taken) is two degrees Celsius, and that the frequency of extreme weather events will continue to rise, climatology can still issue no accurate predictions about the full range of effects on the various regions of the world. Numerous factors influence the interaction of the modifications of climate with patterns of wind and weather, and this complicates enormously the prediction of which regions will suffer drought, which agricultural sites will be disrupted, what new patterns of disease transmission will emerge, and a lot of other potential consequences about which we might want advance knowledge. (The most successful sciences are those lucky enough to study systems that are relatively simple and orderly. James Clerk Maxwell rightly commented that Galileo would not have redirected the physics of motion if he had begun with turbulence rather than with free fall in a vacuum.)

The emphasis on generality inspires scientific imperialism, conjuring a vision of a completely unified future science, encapsulated in a “theory of everything.” Organisms are aggregates of cells, cells are dynamic molecular systems, the molecules are composed of atoms, which in their turn decompose into fermions and bosons (or maybe into quarks or even strings). From these facts it is tempting to infer that all phenomena—including human actions and interaction—can “in principle” be understood ultimately in the language of physics, although for the moment we might settle for biology or neuroscience. This is a great temptation. We should resist it. Even if a process is constituted by the movements of a large number of constituent parts, this does not mean that it can be adequately explained by tracing those motions.

The Trouble with Scientism. An incredibly important and provocative essay by Philip Kitcher.

I’ve increasingly felt that digital journalism and digital humanities are kindred spirits, and that more commerce between the two could be mutually beneficial. That sentiment was confirmed by the extremely positive reaction on Twitter to a brief comment I made on the launch of Knight-Mozilla OpenNews, including from Jon Christensen (of the Bill Lane Center for the American West at Stanford, and formerly a journalist), Shana Kimball (MPublishing, University of Michigan), Tim Carmody (Wired), and Jenna Wortham (New York Times).

Here’s an outline of some of the main areas where digital journalism and digital humanities could profitably collaborate. It’s remarkable, upon reflection, how much overlap there now is, and I suspect these areas will only grow in common importance.

Dan Cohen’s Digital Humanities Blog » Blog Archive » Digital Journalism and Digital Humanities. Dan’s list of points of convergence is extremely smart and extremely provocative.

Humanists can be private educators and public spies. But the latter role is far too rare, because humanist intellectuals do not see themselves as practitioners of daily life. Their disparagement comes largely from their own isolation within the institutions that reproduce them, a fate many humanists despise out of one side of their mouths while endorsing it with the other. The humanist corner of the university becomes, in Palmquist’s words, “just a safe haven for half-witted thinkers to make a comfortable living.”

The humanities needs more courage and more contact with the world. It needs to extend the practice of humanism into that world, rather than to invite the world in for tea and talk of novels, only to pat itself on the collective back for having injected some small measure of abstract critical thinking into the otherwise empty puppets of industry. As far as indispensability goes, we are not meant to be superheroes nor wizards, but secret agents among the citizens, among the scrap metal, among the coriander, among the parking meters. We earn respect by calling in worldly secrets, by making them public. The worldly spy is the opposite of the elbow-patched humanist, the one never out of place no matter the place. The traveler at home everywhere, with the luxury to look.

Only when the humanities can earn their own keep will they be respected in modern America. And that will only happen when you convince the majority of people to be interested, of their own volition, rather than begging or guilting them into giving you that money to translate your obscure French poem on vague grounds of “caring about culture.” So either figure something out, or shut up and accept that the humanities are an inherently elite activity that will rely on feudal patronage. Just like they always have. (If you think of Maslow’s hierarchy, it’s obvious why the leisure class, which generally has money, sex, food, and security taken care of, has been in charge of learning.)

You have no idea how much it pains me to say this, but speaking from experience I now believe that private industry is doing a better job of communicating, persuading, innovating, of everything the university has stopped doing. I do not take this as indicator of how well capitalism works, I take it as an indicator of how badly universities have failed, while still somehow aping the worst aspects of corporate capitalism.

Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.

— Views: Sorry – Inside Higher Ed. I like much of this, but it is very, very wrong to suggest that there is a Western tradition — there is not even a Christian tradition. Western culture is largely a series of arguments conducted by proponents of different traditions that had their origin in or near the Western orbit. Christianity likewise has, intellectually speaking, been an extended debate about what it means to follow Jesus Christ and to belong to Him.

The great thing about art is that it’s there whether the academic humanities are or not. And if you make the study of literature and the arts your own, which is incredibly easy to do with the internet being what it is, it means so much more than if you passively imbibe received wisdom in classrooms for grades. The academic humanities may have committed slow suicide – but the art is alive and well, much of it many centuries old. It’s freely available to us. It’s ours.

The American corporate model looks a little battered at the moment, while American universities have become paragons of learning to which all the world aspires. Does it really make sense to refashion Harvard in the image of GM or BP? For all the problems tenure causes, it has proved its value over time—and not only, or mainly, as a way of protecting free speech. Sometimes, basic research in humanities, social science and natural science pays off quickly in real-world results. More often, though, it takes a generation or so for practical implications to become clear. That’s how long it took, for example, for new research (most of it done in universities) which showed how central slavery was to both the life of the South and the outbreak of the Civil War, to transform the way public historians present the American past at historical sites. That’s how long it will probably take for the genomic research that is currently exploding to have a practical impact on medical treatment. Basic research doesn’t immediately fatten the bottom line, even in the fiscal quarter when results are announced. Many corporations have cut or withdrawn their support for it, on strictly economic grounds. In earlier decades, AT&T (later Lucent Technologies), RCA, Xerox and other industrial companies did a vast amount of basic research. AT&T’s Bell Labs, for one, created the transistor and the photovoltaic cell, and mounted the first TV and fax transmissions. But funding fell and corporate priorities changed—and they have shrunk in every sense ever since. Just one thousand employees walk the darkened corridors of Bell Labs, down from a staff of thirty thousand in 2001. We need universities, and tenured professors, to carry on the basic research that most corporations have abandoned. What we don’t need is for universities to adopt the style of management that wrecked the corporate research centers.

As for the argument that the humanities don’t pay their own way, well, I guess that’s true, but it seems to me that there’s a fallacy in assuming that a university should be run like a business. I’m not saying it shouldn’t be managed prudently, but the notion that every part of it needs to be self-supporting is simply at variance with what a university is all about. You seem to value entrepreneurial programs and practical subjects that might generate intellectual property more than you do ‘old-fashioned’ courses of study. But universities aren’t just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment. There is good reason for it: what seems to be archaic today can become vital in the future. I’ll give you two examples of that. The first is the science of virology, which in the 1970s was dying out because people felt that infectious diseases were no longer a serious health problem in the developed world and other subjects, such as molecular biology, were much sexier. Then, in the early 1990s, a little problem called AIDS became the world’s number 1 health concern. The virus that causes AIDS was first isolated and characterized at the National Institutes of Health in the USA and the Institute Pasteur in France, because these were among the few institutions that still had thriving virology programs. My second example you will probably be more familiar with. Middle Eastern Studies, including the study of foreign languages such as Arabic and Persian, was hardly a hot subject on most campuses in the 1990s. Then came September 11, 2001. Suddenly we realized that we needed a lot more people who understood something about that part of the world, especially its Muslim culture. Those universities that had preserved their Middle Eastern Studies departments, even in the face of declining enrollment, suddenly became very important places. Those that hadn’t – well, I’m sure you get the picture.

The rigid scripting of childhood and adolescence has made young Americans risk- and failure-averse. Shying away from endeavors at which they might not do well, they consider pointless anything without a clear application or defined goal. Consequently, growing numbers of college students focus on higher education’s vocational value at the expense of meaningful personal, experiential, and intellectual exploration. Too many students arrive at college committed to a pre-professional program or a major that they believe will lead directly to employment after graduation; often they are reluctant to investigate the unfamiliar or the “impractical”, a pejorative typically used to refer to the liberal arts. National education statistics reflect this trend. Only 137 of the 212 liberal arts colleges identified by economist David Breneman in his 1990 article “Are we losing our liberal arts colleges?” remain, and an American Academy of Arts and Sciences study reported that between 1966 and 2004, the number of college graduates majoring in the humanities had dwindled from 18 percent to 8 percent.

Ironically, in the rush to study fields with clear career applications, students may be shortchanging themselves. Change now occurs more rapidly than ever before and the boundaries separating professional and academic disciplines constantly shift, making the flexibility and creativity of thought that a liberal arts education fosters a tremendous asset. More importantly, liberal arts classes encourage students to investigate life’s most important questions before responsibilities intervene and make such exploration unfeasible. More time spent in college learning about the self and personal values means less floundering after graduation. Despite the financial or, in some cases, immigration-status reasons for acquiring undergraduate vocational training, college still should be a time for students to broaden themselves, to examine unfamiliar ideas and interests, and to take intellectual risks. Otherwise, students graduate with (now dubious) career qualifications but little idea of who they are or who they want to be.
