
Stagger onward rejoicing


Palaiphobia

I’ve been thinking a lot lately about the idea of the present moment as a Power, a power in the Pauline sense of a massively distributed, massively influential, universal agency directing the course of this world. It seems to me that the Present is a jealous God: it wants us to think only of itself, and never of the past or the future except insofar as images of them serve this instant.

But if we must think of either the past or the future, the Present prefers for us to think of the future, for two reasons: one, any future we imagine is just that, imaginary, and is really a kind of projection of our hopes and fears for our own moment. And two, thinking about the future produces anxiety, which has the effect of driving us back towards the Present, where we can be distracted from that anxiety. That is to say, the future is potentially useful to the Present in a way that the past is not.

Now, to be sure, we can interpret the past in such a way that we reinforce our current habits and attitudes and prejudices. (I have written about this in Breaking Bread with the Dead.) But this is of limited usefulness to the Present and is not really worth the risks. From the perspective of the Present, any genuine immersion in the past is likely to complicate our understanding of this moment and make it harder for us to know precisely what to do, because of all the complexity, good and bad, of the behavior of those who lived and fought and prayed and loved before us. If we learn to have compassion for those people, we just might translate that into compassion for the people who share this world with us but do not think just as we think. And that the Present cannot have.

Why does the Present not want this? Because present-mindedness is instantaneousness, it is automatic response, it is the gratification of whatever emotion happens to arise. As the poet Craig Raine has said, “all emotion is pleasurable” — this fact is the constant pole star of the Present.

These thoughts, though they’ve occupied me for a long time, were recently brought to the forefront of my mind by an essay on Harper Lee by Casey Cep, which contains this passage:

There is an important and interesting conversation happening now about the relevance of To Kill a Mockingbird to our country’s pursuit of racial justice and how we teach civic virtues like tolerance. For a long time, Lee’s novel has been one of the most banned books in the country, first criticized by conservatives who disapproved of its integrationist politics, then by liberals who disapproved of its use of racial slurs, and all along by censors of all persuasions who object to its depiction of rape and incest. Lately, though, the novel’s detractors are not calling for a ban or censorship, just retirement: taking it off of syllabi in order to make room for books by a more diverse group of authors, offering students work written with an eye to the current fight for racial justice, not one from the last century.

I don’t really care whether people keep reading To Kill a Mockingbird. What interests me about this paragraph is the idea — and it’s not necessarily Cep’s idea, just one that she rightly discerns as common — that there is a “current fight for racial justice” that’s different from “one from the last century.” But, you know, Dorothy Counts is still alive.

[photograph of Dorothy Counts]

And Ruby Bridges is still alive — indeed, just now reaching retirement age.

[photograph of Ruby Bridges]

John Lewis, the last of the Big Six, just died a few months ago. I myself remember quite vividly the integration of Birmingham’s schools. This isn’t ancient history we’re talking about, and we shouldn’t allow the artificial convention of “centuries” to deceive us into believing that it is. The story of Dorothy Counts and Ruby Bridges and John Lewis and all the rest of those amazing people who now get lumped into that comforting abstraction we call the Civil Rights Movement is our story, though the Present wants us to forget that, wants to separate us from our brothers and sisters, wants to break all chains that link us to one another — so that we can be wholly absorbed into Now and indulge our instantaneous emotions rather than reflect thoughtfully on the ways that the past is not dead, it is not even past.

The Present wants to infect us with what I have decided to call palaiphobia, from παλαιός, palaiós, old, worn out. That’s how it alienates us from one another, makes us wholly dependent on what it can offer: sentimentality and rage.


UPDATE: My friend Adam Roberts has, quite justifiably, wondered whether my coinage uses the right word. Here’s what I wrote in reply to him:

I have to say something about my decision to write of palaiphobia rather than archephobia. It was an agonizing one, I assure you. In these matters I take my bearings primarily from New Testament Greek, as you know, and of course there’s considerable overlap between the two words. When Paul writes in 2 Corinthians 5 that the old has gone and the new is here, he uses ἀρχαῖα; but when he talks about the “old leaven” in 1 Corinthians 5 he uses παλαιὰν ζύμην. As far as I can tell ἀρχαῖα and παλαιὰν would be interchangeable in those contexts. Both words can be neutral in their valence. But if you look at the overall patterns of usage, it seems that there’s something more generally disparaging about παλαιὸς, whereas there’s at least a potential dignity in ἀρχαῖα. When Paul talks, as he often does, about the “old man” that we must put off, he says παλαιὸς ἄνθρωπος. And παλαιὸς often has the connotation of something worn out, as when Jesus talks (in Matthew 9:16) about patching old clothes, ἱματίῳ παλαιῷ. I wanted to capture that disparagement in my coinage, the sense that the past is worn out, useless, of no value. I wonder if you think that makes sense? 


UPDATE 2: after a conversation with Robin Sloan:

an intractable problem

For anyone who thinks about the relations between the past and the present, intellectual maturity consists in understanding that (a) the world is always getting better in some ways and worse in other ways, and that (b) we have no reliable calculus by which we might assign a single composite value to those changes.

If Then

There are many books that I admire and love that I never for a moment dream I could have written. Right now I’m reading an old favorite, Susanna Clarke’s Jonathan Strange and Mr. Norrell, to my wife, and as I do so I am every minute aware that I could not write this book if you gave me a million years in which to do so.

But every now and then I encounter an admirable book that I wish I had written, that (if I squint just right) I see that I could have written. I had that experience a few years ago with Alexandra Harris’s Weatherland, and I’m having it again right now as I read Jill Lepore’s If Then. What a magnificent narrative. What a brilliant evocation of a moment in American history that is in one sense long gone and in another sense a complete anticipation of our own moment. Oh, the envy! (Especially since if I had written the book it wouldn’t be nearly as good as it is.)

We are afflicted by our ignorance of history in multiple ways, and one of my great themes for the past few years has been the damage that our presentism does to our ability to make political and moral judgments. It damages us in multiple ways. One of them, and this is the theme of my book Breaking Bread with the Dead, is that it makes us agitated and angry. When we, day by day and hour by hour, turn a firehose of distortion and misinformation directly into our own faces, we lose the ability to make measured judgments. We lash out against those we perceive to be our enemies and celebrate with an equally unreasonable passion those we deem to be our allies. We lack the tranquility and the “personal density” needed to make wise and balanced judgments about our fellow citizens and about the challenges we face.

But there is another and still simpler problem with our presentism: we have no idea whether we have been through anything like what we are currently going through. Some years ago I wrote about how comprehensively the great moral panic of the 1980s – the belief held by tens of millions of Americans that the childcare centers of America were run by Satan worshipers who sexually abused their charges – has been flushed down the memory hole. In this case, I think the amnesia has happened because a true reckoning with the situation would tell us so much about ourselves that we don’t want to know. It would teach us how credulous we are, and how when faced with lurid stories we lose our ability to make the most elementary factual and evidentiary discriminations. But of course our studied refusal to remember that particular event simply makes us more vulnerable to such panics today, especially given our unprecedentedly widespread self-induced exposure to misinformation.

Even more serious, perhaps, is our ignorance – in this case not so obviously motivated but the product rather of casual neglect – of the violent upheavals that rocked this nation in the 1960s and 1970s. Politicians and pastors and podcasters and bloggers can confidently assert that we are experiencing unprecedented levels of social mistrust and unrest, having conveniently allowed themselves to remain ignorant of what this country was like fifty years ago. (And let’s leave aside the Civil War altogether, since that happened in a prehistoric era.) Rick Perlstein is very good on this point, as I noted in this post.

All of this brings us back to Jill Lepore’s tale of the rise and fall of a company called Simulmatics, and the rise and rise and rise, in the subsequent half-century, of what Simulmatics was created to bring into being. Everything that our current boosters of digital technology claim for their machines was claimed by their predecessors sixty years ago. The worries that we currently have about the power of technocratic overlords began to be uttered in congressional hearings and in the pages of magazines and newspapers fifty years ago. Postwar technophilia, Cold War terror, technological solutionism, racial unrest, counterculture liberationism, and free-market libertarianism — these are the key ingredients of the mixture in which our current moment has been brewed.

Let me wrap up this post with three quotations from If Then that deserve a great deal of reflection. The first comes from early in the book:

The Cold War altered the history of knowledge by distorting the aims and ends of American universities. This began in 1947, with the passage of the National Security Act, which established the Joint Chiefs of Staff, created the Central Intelligence Agency and the National Security Council, and turned the War Department into what would soon be called the Department of Defense, on the back of the belief that defending the nation’s security required massive, unprecedented military spending in peacetime. The Defense Department’s research and development budget skyrocketed. Most of that money went to research universities — the modern research university was built by the federal government — and the rest went to think tanks, including RAND, the institute of the future. There would be new planes, new bombs, and new missiles. And there would be new tools of psychological warfare: the behavioral science of mass communications.

The second quotation describes the influence of Ithiel de Sola Pool — perhaps the central figure in If Then, one of the inventors of behavioral data science and a man dedicated to using that data to fight the Cold War, win elections, predict and forestall race riots by black people, and end communism in Vietnam — on that hero of the counterculture Stewart Brand:

Few people read Pool’s words more avidly than Stewart Brand. “With each passing year the value of this 1983 book becomes more evident,” he wrote. Pool died at the age of sixty-six in the Orwellian year of 1984, the year Apple launched its first Macintosh, the year MIT was establishing a new lab, the Media Lab. Two years later, Brand moved to Cambridge to take a job at the Media Lab, a six-story, $45 million building designed by I. M. Pei and named after Jerome Wiesner, a building that represented nothing so much as a newer version of Buckminster Fuller’s geodesic dome. Brand didn’t so much conduct research at the Media Lab as promote its agenda, as in his best-selling 1987 book, The Media Lab: Inventing the Future at M.I.T. The entire book, Brand said, bore the influence of Ithiel de Sola Pool, especially his Technologies of Freedom. “His book was the single most helpful text in preparing the book you’re reading and the one I would most recommend for following up issues raised here,” Brand wrote. “His interpretations of what’s really going on with the new communications technologies are the best in print.” Brand cited Pool’s work on page after page after page, treating him as the Media Lab’s founding father, which he was.

And finally: One of Lepore’s recurrent themes is the fervent commitment of the Simulmatics crew to a worldview in which nothing in the past matters, and that all we need is to study the Now in order to predict and control the Future:

Behavioral data science presented itself as if it had sprung out of nowhere or as if, like Athena, it had sprung from the head of Zeus. The method Ed Greenfield dubbed “simulmatics” in 1959 was rebranded a half century later as “predictive analytics,” a field with a market size of $4.6 billion in 2017, expected to grow to $12.4 billion by 2022. It was as if Simulmatics’ scientists, first called the “What-If Men” in 1961, had never existed, as if they represented not the past but the future. “Data without what-if modeling may be the database community’s past,” according to a 2011 journal article, “but data with what-if modeling must be its future.” A 2018 encyclopedia defined “what-if analysis” as “a data-intensive simulation,” describing it as “a relatively recent discipline.” What if, what if, what if: What if the future forgets its past?

Which of course it has. Which of course it must, else it loses its raison d’être. Thus the people who most desperately need to read Lepore’s book almost certainly never will. It’s hard to imagine a better case for the distinctive intellectual disciplines of the humanities than the one Lepore made just by writing If Then. But how to get people to confront that case who are debarred by their core convictions from taking it seriously, from even considering it?

the Great Crumping revisited

A surprising number of readers of my previous post have written out of concern for my state of mind, which is kind of them, but I think they have read as a cri de coeur what was meant as a simple summary of the facts. Not pleasant facts, I freely admit, but surely uncontroversial ones. Stating them so bluntly is just one element of my current period of reflection.

The primary reason I am not in despair is simply this: I know some history. I think we will probably see, in the coming decades, the dramatic reduction or elimination of humanities requirements and the closure of whole humanities departments in many American universities, but that will not mean the death of the humanities. Humane learning, literature, art, music have all thrived in places where they were altogether without institutional support. Indeed, I have suggested that it is in times of the breaking of institutions that poetry becomes as necessary as bread.

Similarly, while attendance at Episcopalian and other Anglican churches has been dropping at a steep rate for decades, and I expect will in my lifetime dwindle to nearly nothing, there will still be people worshipping with the Book of Common Prayer as long as … well, as long as there are people, I think. And if evangelicalism completely collapses as a movement — for what it’s worth, I think it already has — that will simply mean a return to an earlier state of affairs. The various flagship institutions of American evangelicalism are (in their current form at least) about as old as I am. The collapse I speak of is, or will be, simply a return to a status quo ante bellum, the bellum in question being World War II, more or less. And goodness, it’s not as if even the Great Awakening had the kind of impact on its culture, all things demographically considered, as one might suspect from its name and from its place in historians’ imaginations.

This doesn’t mean I don’t regret the collapse of the institutions that have helped to sustain me throughout my adult life. I do, very much. And I will do my best to help them survive, if in somewhat constrained and diminished form. But Put not your trust in institutions, as no wise man has ever quite said, even as you work to sustain them. It’s what (or Who) stands behind those institutions and gives them their purpose that I believe in and ultimately trust.

The Great Crumping is going on all around me. But if there’s one thing that as a Christian and a student of history I know, it’s this: Crump happens.

Proudhon to Marx

Lyon, 17 May 1846:

Let us seek together, if you wish, the laws of society, the manner in which these laws are realized, the process by which we shall succeed in discovering them; but, for God’s sake, after having demolished all the a priori dogmatisms, do not let us in our turn dream of indoctrinating the people; do not let us fall into the contradiction of your compatriot Martin Luther, who, having overthrown Catholic theology, at once set about, with excommunication and anathema, the foundation of a Protestant theology. For the last three centuries Germany has been mainly occupied in undoing Luther’s shoddy work; do not let us leave humanity with a similar mess to clear up as a result of our efforts. I applaud with all my heart your thought of bringing all opinions to light; let us carry on a good and loyal polemic; let us give the world an example of learned and far-sighted tolerance, but let us not, merely because we are at the head of a movement, make ourselves the leaders of a new intolerance, let us not pose as the apostles of a new religion, even if it be the religion of logic, the religion of reason. Let us gather together and encourage all protests, let us brand all exclusiveness, all mysticism; let us never regard a question as exhausted, and when we have used our last argument, let us begin again, if need be, with eloquence and irony. On that condition, I will gladly enter your association. Otherwise — no!

Emphases mine. Edmund Wilson: “The result of this incident was that Marx was to set upon Proudhon’s new book with a ferocity entirely inconsonant with the opinion of the value of Proudhon’s earlier work which he had expressed and which he was to reiterate later” (To the Finland Station, p. 154).

readings

Gary Dorrien:

Here is where Temple still matters as a theorist of guild socialism. In the early 1940s, both before and after he became Archbishop of Canterbury, Temple got very specific about how to democratize economic power. He was incredulous that modern democracies tolerated big private banks, lamented that Christian socialists turned away in the 1890s from the land issue, and proposed a new form of guild socialism. The banks, he argued, should be turned into utilities or socialized; otherwise the rich controlled the process of investment. God made the land for everyone, and society creates the unearned increment in the value of land; therefore the increment should go to society. Above all, though Temple took for granted that certain natural monopolies must be nationalized, the centerpiece of his proposal was an excess-profits tax payable in the form of shares to worker funds. These funds, over time, would gain democratic control over enterprises. Economic democracy, he argued, can be achieved gradually, peaceably, and on decentralized terms, without abolishing economic markets or making heroic demands on the political system.


Randall Kennedy:

The ultimatum complains that, in its view, past initiatives aimed at enlarging the number of faculty of color at Princeton have “failed” because in 2019–20 “among 814 faculty, there were 30 Black, 31 Latinx, and 0 Indigenous persons. That’s 7%.” According to the ultimatum, this “is not progress by any standard; it falls woefully short of U.S. demographics as estimated by the U.S. Census Bureau, which reports Black and Hispanic persons at 32% of the total population.”

The suggestion that these statistics show racial unfairness in hiring at Princeton is misleading. According to the Journal of Blacks in Higher Education, African Americans in recent years earned only around 7 percent of all doctoral degrees. In engineering it was around 4 percent. In physics around 2 percent. Care must be taken to look for talent in places other than the familiar haunts of Ivy League searches. But even when such care is taken, the resultant catch is almost invariably quite small.

The reasons behind the small numbers are familiar and heart-breaking. They include a legacy of deprivation in education, housing, employment, and health care, not to mention increased vulnerability to crime and incarceration. The perpetuation of injuries from past discrimination as well as the imposition of new wrongs cut like scythes into the ranks of racial minorities, cruelly winnowing the number who are even in the running to teach at Princeton.

The racial demographics of its faculty does not reflect a situation in which the university is putting a thumb on the scale against racial-minority candidates. To the contrary, the university is rightly putting a thumb on the scale in favor of racial-minority candidates. That the numbers remain small reflects the terrible social problems that hinder so many racial minorities before they even have a fighting chance to enter into the elite competitions from which Princeton selects its instructors. The ultimatum denies or minimizes this pipeline problem.


Peter Brown:

Many of Ambrose’s contemporaries were quietly convinced that the ills of Roman society had a supernatural origin. Many of the sharpest critics of their age were not Christians; they were pagans. For them, bad times had begun with the “national apostasy” of Constantine. The rampant avarice denounced by pagan authors was thought to go hand in hand with the spoliation of the temples and the abandonment of the old religion.

Ambrose had to answer such views. He did so by subtly secularizing the contemporary discourse on decline. He turned what many thinking persons considered a religious crisis into a crisis of social relations. We moderns tend to applaud Ambrose for the perspicacity of his diagnosis of the weaknesses of Roman society. But pagans such as Symmachus would have regarded Ambrose’s criticisms of society as mere whistling in the dark. Symmachus knew why things had gone wrong. The moment that the first fruits of the fields of Italy that had fed the Vestal Virgins for 1,200 years were withdrawn (in 382), the link between the land and the gods was broken.

advice for journalists

Andrew Sullivan writes,

Online is increasingly where people live. My average screen time this past week was close to ten hours a day. Yes, a lot of that is work-related. But the idea that I have any real conscious life outside this virtual portal is delusional. And if you live in such a madhouse all the time, you will become mad. You don’t go down a rabbit-hole; your mind increasingly is the rabbit hole — rewired that way by algorithmic practice. And you cannot get out, unless you fight the algorithms to a draw, or manage to exert superhuman discipline and end social media use altogether. […]

In the past, we might have turned to more reliable media for context and perspective. But the journalists and reporters and editors who are supposed to perform this function are human as well. And they are perhaps the ones most trapped in the social media hellscape. You can read them on Twitter, where they live and posture and rank themselves, or on their Slack channels, where they gang up on and smear any waverers. They’ve created an insulated world where any small dissent from groupthink is professional death. Watch Fox, CNN or MSNBC, and it’s the same story.

Point out missing facts or context, exercise some independence of judgment, push back against the narrative — and you’ll be first subject to ostracism and denunciation by your newsroom peers, and then, if you persist, you’ll be fired. The press could have been the antidote to the social media trap. Instead they chose to become the profitable pusher of the poison.

This is precisely and tragically correct.

I immediately wrote to Andrew to tell him that he needs my new book, stat. But even Andrew, who writes on a weekly basis, who has stepped back from the moment-by-moment insanity of journalistic Twitter (and from the hour-by-hour insanity of the old Dish), probably doesn’t have time to step back a bit further still over the next few weeks and read some old books.

Or doesn’t believe he has time. Maybe, and maybe for journalists more than for anyone else, this is in fact the perfect, the ideal, the necessary moment to recover “real conscious life outside this virtual portal.” One might begin with the epistles of Horace, a man who in retreat from Rome learned to love the countryside. Just a thought.

lessons for activists from Edmund Burke

We do not draw the moral lessons we might from history. On the contrary, without care it may be used to vitiate our minds and to destroy our happiness. In history a great volume is unrolled for our instruction, drawing the materials of future wisdom from the past errors and infirmities of mankind…. History consists, for the greater part, of the miseries brought upon the world by pride, ambition, avarice, revenge, lust, sedition, hypocrisy, ungoverned zeal, and all the train of disorderly appetites, which shake the public with the same “troublous storms that toss / The private state, and render life unsweet.” These vices are the causes of those storms. Religion, morals, laws, prerogatives, privileges, liberties, rights of men, are the pretexts. The pretexts are always found in some specious appearance of a real good…. As these are the pretexts, so the ordinary actors and instruments in great public evils are kings, priests, magistrates, senates, parliaments, national assemblies, judges, and captains. You would not cure the evil by resolving, that there should be no more monarchs, nor ministers of state, nor of the gospel; no interpreters of law; no general officers; no public councils. You might change the names. The things in some shape must remain. A certain quantum of power must always exist in the community, in some hands, and under some appellation. Wise men will apply their remedies to vices, not to names; to the causes of evil which are permanent, not to the occasional organs by which they act, and the transitory modes in which they appear. Otherwise you will be wise historically, a fool in practice.

— Burke, Reflections on the Revolution in France (1790)

Mammon

Eugene McCarraher’s The Enchantments of Mammon is a remarkable book, beautifully written and narrated with great verve. However, I do not find its argument persuasive. You can only find the argument persuasive if you accept the book’s two key axioms, and I don’t. I say “axioms” because McCarraher doesn’t argue either of them; he takes them as … well, axiomatic.

The first axiom is that the Middle Ages in Europe, especially western Europe, constituted a great coherent sacramental order — McCarraher loves the word “sacramental” and uses it to designate pretty much everything he approves of — a “moral economy,” a “theological economy,” in which “preparation for [God’s] kingdom was the point of economic life.” Even the vast waves of rebellion and protest that occurred throughout the Middle Ages weren’t protests against that order as such, only complaints that the political and ecclesial authorities weren’t properly fulfilling their duties. McCarraher cites rebellions (e.g. Wat Tyler’s) of which this could plausibly be said as characteristic, and ignores all the more radical protests. (See Chapter 3 of Norman Cohn’s The Pursuit of the Millennium for examples.) Similarly, he presents the political-eschatological vision of Piers Plowman as though it’s typical of its age, when it isn’t typical at all, any more than William Blake’s poems are typical of England circa 1800. A fair look at what we know suggests that the Middle Ages was as governed by greed and lust and the libido dominandi as any other era, and if God had spoken to the rulers of that age he would have asked them, “What mean ye that ye beat my people to pieces, and grind the faces of the poor?” I think the great preponderance of the evidence suggests that McCarraher’s portrait of the Middle Ages is a pious fiction.

The second governing axiom of The Enchantments of Mammon is that modern capitalism, which was, he says, created by Protestantism, sacralizes wealth in a way that no previous system did. But I don’t think that’s true. In all times and all places that I know of, wealth is generally perceived as a sign of divine favor. The peculiarity of the Axial Age is that in that era there arise, in various places around the world, philosophical and theological traditions suggesting that human flourishing cannot be measured in money and possessions. (After all, one of the key features of the land that Yahweh promises to the wandering Israelites is that it flows with milk and honey, is a place in which the people can prosper economically. It’s only when we get to the book of Proverbs that we discern a serious ambivalence about all this. Proverbs 22:4: “The reward for humility and fear of the Lord is riches and honor and life.” But also Proverbs 11:4: “Riches do not profit in the day of wrath, but righteousness delivers from death.”)

At one point early on McCarraher writes, “Talking about consumerism is a way of not talking about capitalism.” This is true. However, it could equally well be said that talking about capitalism is a way of not talking about money. “The love of money is the root of all evil” was written long, long before the rise of the capitalist order that McCarraher denounces. Moreover, talking about money can be a way of not talking about human acquisitiveness, about the sin of greed. The tension between acquisitiveness and a dawning awareness that acquisitiveness isn’t everything goes as far back as the Odyssey, the great predecessor (or maybe great initiator) of the Axial Age sensibility, the first major work of art against greed, which suggests that only tragic suffering — like that of Odysseus, exiled and imprisoned far from his homeland, or Menelaus, whose family is destroyed as he gains godlike riches — is sufficient to break the hold of greed upon us. It seems to me that McCarraher is blaming capitalism for something that is a permanent feature of human life.

Anyway: If you believe that the Middle Ages were a veritable paradise of sacramental order and that the sacralizing of greed is an invention of modern capitalism — which is to say, if you believe that pretty much everything bad in the world is the result of Protestantism — then you may well find McCarraher’s case compelling. But even if you don’t accept those axioms, it’s a terrific read full of fascinating anecdotes. I’ll be thinking about it for a long time.

lessons from the past

Once I saw how things were going I got away from Twitter and stopped checking my RSS reader. I didn’t want to hear any news, mainly because I knew that there wouldn’t be much news as such — though there would be vast tracts of reason-bereft, emotionally incontinent opinionating. And I don’t need that. Ever.

Instead, I’ve just read Ron Chernow’s massive and quite excellent biography of George Washington. It’s been a useful as well as an enjoyable experience. For one thing, it reminds me what actual leadership looks like; and for another, it reminds me of how deeply flawed even the best leaders can be, and how profoundly wrong. As is always the case when I spend some serious time with the past, I get perspective. I see the good and the bad of my own time with more clarity and accuracy. And if anything vital has happened over the past few days while I’ve been reading, I’ve got plenty of time to catch up. As I’ve said before, I prefer to take my news a week at a time anyway, not minute by minute.

One theme in Chernow’s biography particularly sticks with me. It concerns the end of Washington’s second administration and his brief period as an ex-President (he lived only two-and-a-half years after departing the office). That was a time when political parties in something like the modern sense of the term — though many of the Founders referred to the power and danger of “Factions” — dramatically strengthened. It was also a time when journalists who supported one faction would say pretty much anything to discredit the other one, would make up any sort of tale. John Adams, violently angered by all the lies told about him and his colleagues — and there really were lies, outrageous lies, about him, just as there were about Washington near the end of his second term — strongly supported and oversaw the passage of the Alien and Sedition Acts to punish journalistic bias and fake news, as well as to deter immigration — though Adams and his fellow in-power Federalists were only concerned to punish the lies that worked against their policies, not the lies that worked in their favor. (Plenty of those circulated also.) Washington, rather surprisingly and disappointingly, thought these laws good ones.

I draw certain lessons from this whole sordid history. Habitual dishonesty, on the part of politicians and journalists alike, inflames partisanship. You exaggerate the nastiness of your political opponents, and that leads people in your camp to think of the other side not as fellow citizens with whom you disagree on policy, but rather something close to moral monsters. Gratified by your increasingly tight bond with the like-minded, you stretch your exaggerations into outright lies, which gain even more rapturous agreement within your Ingroup and more incandescent anger against the Outgroup. Dishonesty begets partisanship, and partisanship begets further dishonesty. It’s a classic vicious circle.

And one more phenomenon is begotten by all this malice. Wheeled around in this accelerating circle of your own devising, you become incapable of comprehending, or even seeing, something that it is absolutely necessary for every wise social actor to know: In some situations all parties act badly. Sometimes there are no good guys, just fools and knaves, blinded to the true character of their own behavior by their preening partisan self-righteousness.

Lucy Ellmann and old books

We’re at the copy-editing stage of my next book, so it’s too late to add anything, but goodness, I wish I could squeeze in this from Lucy Ellmann:

Some time ago I pretty much decided to read only books written before the atom bomb was dropped, when everything changed for all life on Earth. The industrial revolution’s bad enough, but nuclear weapons really are party-poopers.

I don’t stick strictly to this policy, but I often find it more rewarding to read what people thought about, and what they did with literature, before we were reduced by war and capitalism to mere monetary units, bomb fodder and password generators. And before the natural world became a depository for plastics and nuclear waste.

Anger and alienation have resulted, and they’re fine subjects, but there are times when you’d like to remember some of the higher points in the history of civilisation as well, and the natural world before we learned to view it all as tainted. The intense humour, innocence, sexiness and play of The Life and Opinions of Tristram Shandy, Gentleman, for instance – could this have been written after Hiroshima? Could Gargantua and Pantagruel? Don Quixote? Emma? I don’t see how. Thanks to the offences of patriarchy, a lot of the fun has gone out of being human, and I like books that look at life in less constricted ways.

I love this statement because of its provocations — provocations that should be assessed with some care. For one thing, I don’t see that the variety of books has especially diminished since Hiroshima — I mean, Chicken Soup for the Soul and The Shack and a whole bunch of adult coloring books have appeared since the end of that war — and if we’re missing the distinctive (and very funny) kind of “sexiness” in Tristram Shandy I think that has a lot more to do with the sexual revolution and its unanticipated consequences than with the atom bomb. 

But the idea that before the industrial revolution, with its accompanying “war and capitalism” and reduction of persons to “mere monetary units,” the natural world was perceived in a radically different way — that’s promising. Though I think the main thing that should be said about pre-modern nature is not that it was untainted but that it was scary as shit. Which is just as much worthy of our interest and reflection. 

We could debate such matters all day — and should! Because there’s no question that the past really is another country, though they don’t do everything differently there. Trying to understand the continuities as well as the discontinuities is what reading books from the past is all about.

Anyway, my book is going to be great on all that stuff. Make sure it’s the only book published in 2020 that you buy in 2020, okay? Otherwise, stick with the old stuff. You’ll be glad you did. 

Automata, Animal-Machines, and Us

What follows is a review of The Restless Clock: A History of the Centuries-Long Argument over What Makes Living Things Tick, by Jessica Riskin (University of Chicago Press, 2016). The review appeared in the sadly short-lived online journal Education and Culture, and has disappeared from the web without having been saved by the Wayback Machine. So I’m reposting it here. My thanks to that paragon of editors John Wilson for having commissioned it. 

1.

The last few decades have seen, in a wide range of disciplines, a strenuous rethinking of what the material world is all about, and especially what within it has agency. For proponents of the “Gaia hypothesis,” the whole world is a single self-regulating system, a sort of mega-organism with its own distinctive character. By contrast, Richard Dawkins dramatically shifted the landscape of evolutionary biology in his 1976 book The Selfish Gene (and later in The Extended Phenotype of 1982) by arguing that agency happens not at the level of the organism but at the level of the gene: an organism is a device by means of which genes replicate themselves.

Meanwhile, in other intellectual arenas, proponents of the interdisciplinary movement known as “object-oriented ontology” (OOO) and the movement typically linked with it or seen as its predecessor, “actor-network theory” (ANT), want to reconsider a contrast that underlies most of our thinking about the world we live in. That contrast is between humans, who act, and the rest of the known cosmos, which either behaves or is merely passive, acted upon. Proponents of OOO and ANT tend to doubt whether humans really are unique “actors,” but they don’t spend a lot of time trying to refute the assumption. Instead, they try to see what the world looks like if we just don’t make it.

In very general terms we may say that ANT wants to see everything as an actor and OOO wants to see everything as having agency. (The terms are related but, I think, not identical.) So when Bruno Latour, the leading figure in ANT, describes a seventeenth-century scene in which Robert Hooke demonstrates the working of a vacuum pump before the gathered worthies of the Royal Society, he sees Hooke as an actor within a network of power and knowledge. But so is the King, who granted to the Society a royal charter. And so is the laboratory, a particularly complex creation composed of persons and things, one that generates certain types of behavior and discourages or wholly prevents others. So even is the vacuum itself — indeed it is the status of the vacuum as actor that the whole scene revolves around.

For the object-oriented ontologist, similarly, the old line that “to a man with a hammer, everything looks like a nail” is true, but not primarily because of certain human traits, but rather because the hammer wants to pound nails. For the proponent of OOO, there are no good reasons for believing that the statement “This man wants to use his hammer to pound nails” makes any more sense than “This hammer wants to pound nails.”

Some thinkers are completely convinced by this account of the agency of things; others believe it’s nonsense. But very few on either side know that the very debates they’re conducting have been at the heart of Western thought for five hundred years. Indeed, much of the intellectual energy of the modern era has been devoted to figuring out whether the non-human world is alive or dead, active or passive, full of agency or wholly without it. Jessica Riskin’s extraordinary book The Restless Clock tells that vital and almost wholly neglected story.

It is a wildly ambitious book, and even its 500 pages are probably half as many as a thorough telling of its story would require. (The second half of the book, covering events since the Romantic era, seems particularly rushed.) But a much longer book would have struggled to find a readership, and this is a book that needs to be read by people with a wide range of interests and disciplinary allegiances. Riskin and her editors made the right choice to condense the exceptionally complex story, which I will not even try to summarize here; the task would be impossible. I can do little more than point to some of the book’s highlights and suggest some of its argument’s many implications.

2.

Riskin’s story focuses wholly on philosophers — including thinkers that today we would call “scientists” but earlier were known as “natural philosophers” — but the issues she explores have been perceived, and their implications considered, throughout society. For this reason a wonderful companion to The Restless Clock is the 1983 book by Keith Thomas, Man and the Natural World: A History of the Modern Sensibility 1500–1800, one of the most illuminating works of social history I have ever read. In that book, Thomas cites a passage from Fabulous Histories Designed for the Instruction of Children (1788), an enormously popular treatise by the English educational reformer Sarah Trimmer:

‘I have,’ said a lady who was present, ‘been for a long time accustomed to consider animals as mere machines, actuated by the unerring hand of Providence, to do those things which are necessary for the preservation of themselves and their offspring; but the sight of the Learned Pig, which has lately been shewn in London, has deranged these ideas and I know not what to think.’

This lady was not the only one so accustomed, or so perplexed; and much of the story Riskin has to tell is summarized, in an odd way, in this statement. First of all, there is the prevalence in the early modern and Enlightenment eras of automata: complex machines designed to counterfeit biological organisms that then provided many people the vocabulary they felt they needed to explain those organisms. Thus the widespread belief that a dog makes a noise when you kick it in precisely the same way and for precisely the same reasons that a horn makes a noise when you blow into it. These are “mere machines.” (And yes, it occurred to more than a few people that one could extend the logic to humans, who also make noises when kicked. The belief in animals as automata was widespread but by no means universal.)

The second point to be extracted from Trimmer’s anecdote is the lady’s belief that these natural automata are “actuated by the unerring hand of Providence” — that their efficient working is a testimony to the Intelligent Design of the world.

And the third point is that phenomena may be brought to public attention — animals trained to do what we had thought possible only by humans, or automata whose workings are especially inscrutable, like the famous chess-playing Mechanical Turk — that call into question the basic assumptions that separate the world into useful categories: the human and the non-human, the animal and all that lies “below” the animal, the animate and the inanimate. Maybe none of these categories are stable after all.

These three points generate puzzlement not only for society ladies, but also (and perhaps even more) for philosophers. This is what The Restless Clock is about.

3.

One way to think of this book — Riskin herself does not make so strong a claim, but I think it warranted — is as a re-narration of the philosophical history of modernity as a series of attempts to reckon with the increasing sophistication of machines. Philosophy on this account becomes an accidental by-product of engineering. Consider, for instance, an argument made in Diderot’s philosophical dialogue D’Alembert’s Dream, as summarized by Riskin:

During the conversation, “Diderot” introduces “d’Alembert” to such principles as the idea that there is no essential difference between a canary and a bird-automaton, other than complexity of organization and degree of sensitivity. Indeed, the nerves of a human being, even those of a philosopher, are but “sensitive vibrating strings,” so that the difference between “the philosopher-instrument and the clavichord-instrument” is just the greater sensitivity of the philosopher-instrument and its ability to play itself. A philosopher is essentially a keenly sensitive, self-playing clavichord.

But why does Diderot even begin to think in such terms? Largely because of the enormous notoriety of Jacques de Vaucanson, the brilliant designer and builder of various automata. Vaucanson is best known today for his wondrous mechanical duck, which quacked, snuffled its snout in water, flapped its wings, ate food placed before it, and — most astonishing of all to the general populace — defecated what it had consumed. (Though in fact Vaucanson had, ahead of any given exhibition of the duck’s prowess, placed some appropriately textured material in the cabinet on which the duck stood so the automaton could be made to defecate on command.) Voltaire famously wrote that without “Vaucanson’s duck, you would have nothing to remind you of the glory of France,” but he genuinely admired the engineer, as did almost everyone who encountered his projects, the most impressive of which, aside perhaps from the duck, were human-shaped automata — androids, as they were called, thanks to a coinage by the seventeenth-century librarian Gabriel Naudé — one of which played a flute, the other a pipe and drum. These devices, and the awe they produced in observers, led quite directly to Diderot’s speculations about the philosopher as a less harmonious variety of clavichord.

And if machines could prompt philosophy, do they not have theological implications as well? Indeed they do, though, if certain stories are to be believed, the machine/theology relationship can be rather tense. In the process of coining the term “android,” Naudé relates (with commendable skepticism) the claim of a 15th-century writer that Albertus Magnus, the great medieval bishop and theologian, had built a metal man. This automaton answered any question put to it and even, some said, dictated to Albertus hundreds of pages of theology he later claimed as his own. But the mechanical theologian met a sad end when one of Albertus’s students grew exasperated by “its great babbling and chattering” and smashed it to pieces. This student’s name was Thomas Aquinas.

The story is far too good to be true, though its potential uses are so many and varied that I am going to try to believe it. The image of Thomas, the apostle of human thought and of the limits of human thought, who wrote the greatest body of theology ever composed and then at the end of his life dismissed it all as “straw,” smashing this simulacrum of philosophy, this Meccano idol — this is too perfect an exemplum not to reward our contemplation. By ending the android’s “babbling and chattering” and replacing it with patient, careful, and rigorous dialectical disputation, Thomas restored human beings to their rightful place atop the visible part of the Great Chain of Being, and refuted, before they even arose, Diderot’s claims that humans are just immensely sophisticated machines.

Yet one of the more fascinating elements of Riskin’s narrative is the revelation that the fully-worked-out idea of the human being as a kind of machine was introduced, and became commonplace, late in the 17th century by thinkers who employed it as an aid to Christian apologetics — as a way of proving that we and all of Creation are, as Sarah Trimmer’s puzzled lady put it, “actuated by the unerring hand of Providence.” Thus Riskin:

“Man is a meer piece of Mechanism,” wrote the English doctor and polemicist William Coward in 1702, “a curious Frame of Clock-Work, and only a Reasoning Engine.” To any potential critic of such a view, Coward added, “I must remind my adversary, that Man is such a curious piece of Mechanism, as shews only an Almighty Power could be the first and sole Artificer, viz., to make a Reasoning Engine out of dead matter, a Lump of Insensible Earth to live, to be able to Discourse, to pry and search into the very nature of Heaven and Earth.”

Since Max Weber in the early 20th century it has been a commonplace that Protestant theology “disenchants” the world, purging the animistic cosmos of medieval Catholicism of its panoply of energetic spirits, good, evil, and ambiguous. But Riskin demonstrates convincingly that this purging was done in the name of a sovereign God in whom all spiritual power was believed to dwell. This is not simply a story of the rise of a materialist science that triumphed over and marginalized religion; rather, the world seen as a “passive mechanical artifact” relied upon a supernatural, divine intelligence: “It was inseparably and in equal parts a scientific and a theological model.” So when Richard Dawkins wrote, in 2006,

People who think that robots are by definition more ‘deterministic’ than human beings are muddled (unless they are religious, in which case they might consistently hold that humans have some divine gift of free will denied to mere machines). If, like most of the critics of my ‘lumbering robot’ passage, you are not religious, then face up to the following question. What on earth do you think you are, if not a robot, albeit a very complicated one?

— he had no idea that he was echoing the argument of an early-18th-century apologist for Protestant Christianity.

Again, the mechanist position was by no means universally held, and Riskin gives attention throughout to figures who either rejected this model or modified it in interesting ways: here the German philosopher Leibniz (1646–1716) is a particularly fascinating figure, in that he admired the desire to preserve and celebrate the sovereignty of God even as he doubted that a mechanistic model of the cosmos was the way to do it. But the mechanistic model which drained agency from the world, except (perhaps) for human beings, eventually carried the day.

One of the most important sections of The Restless Clock comes near the end, where Riskin demonstrates (and laments) the consequences of modern scientists’ ignorance of the history she relates. For instance, almost all evolutionary theorists today deride Jean-Baptiste Lamarck (1744–1829), even though Lamarck was one of the first major figures to argue that evolutionary change happens throughout the world of living things, because he believed in a causal mechanism that Darwin would later reject: the ability of creatures to transmit acquired traits to their offspring. Richard Dawkins calls Lamarck a “miracle-monger,” but Lamarck didn’t believe he was doing any such thing: rather, by attributing to living things a vital power to alter themselves from within, he was actually making it possible to explain evolutionary change without (as the astronomer Laplace put it in a different context) having recourse to the hypothesis of God. The real challenge for Lamarck and other thinkers of the Romantic era, Riskin argues, was this:

How to revoke the monopoly on agency that mechanist science had assigned to God? How to bring the inanimate, clockwork cosmos of classical mechanist science back to life while remaining as faithful as possible to the core principles of the scientific tradition? A whole movement of poets, physiologists, novelists, chemists, philosophers and experimental physicists — roles often combined in the same person — struggled with this question. Their struggles brought the natural machinery of contemporary science from inanimate to dead to alive once more. The dead matter of the Romantics became animate, not at the hands of an external Designer, but through the action of a vital agency, an organic power, an all-embracing energy intrinsic to nature’s machinery.

This “vitalist” tradition was one with which Charles Darwin struggled, in ways that are often puzzling to his strongest proponents today because they do not know this tradition. They think that Darwin was exhibiting some kind of post-religious hangover when he adopted, or considered adopting, Lamarckian ideas, but, Riskin convincingly demonstrates, “insofar as Darwin adopted Lamarck’s forces of change, he did so not out of a failure of nerve or an inability to carry his own revolution all the way, but on the contrary because he too sought a rigorously naturalist theory and was determined to avoid the mechanist solution of externalizing purpose and agency to a supernatural god.”

Similarly, certain 20th-century intellectual trends, most notably the cybernetics movement (associated primarily with Norbert Wiener) and the behaviorism of B. F. Skinner, developed ideas about behavior and agency that their proponents believed no one had dared think before, when in fact those very ideas had been widely debated for the previous four hundred years. Such thinkers were either, like Skinner, proudly ignorant of what had gone before them or, like Wiener and in a different way Richard Dawkins, reliant on a history that is effectively, as Riskin puts it, “upside-down.” (Riskin has a fascinating section on the concept of “robot” that sheds much light on the Dawkins claim about human robots quoted above.)

4.

At both the beginning and end of her book, Riskin mentions a conversation with a friend of hers, a biologist, who agreed “that biologists continually attribute agency — intentions, desires, will — to the objects they study (for example cells, molecules),” but denied that this kind of language signifies anything in particular: it “was only a manner of speaking, a kind of placeholder that biologists use to stand in for explanations they can’t yet give.” When I read this anecdote, I was immediately reminded that Richard Dawkins says the same in a footnote to the 30th anniversary edition of The Selfish Gene:

This strategic way of talking about an animal or plant, or a gene, as if it were consciously working out how best to increase its success … has become commonplace among working biologists. It is a language of convenience which is harmless unless it happens to fall into the hands of those ill-equipped to understand it.

Riskin, who misses very little that is relevant to her vast argument, cites this very passage. The question which Dawkins waves aside so contemptuously, but which Riskin rightly believes vital, is this: Why do biologists find that particular language so convenient? Why is it so easy for them to fall into the “merely linguistic” attribution of agency even when they so strenuously refuse agency to the biological world? We might also return to the movements I mention at the outset of this review and ask why the ideas of OOO and ANT seem so evidently right to many people, and so evidently wrong to others.

The body of powerful, influential, but now utterly neglected ideas explored in The Restless Clock might illuminate for many scientists and many philosophers a few of their pervasive blind spots. “By recognizing the historical roots of the almost unanimous conviction (in principle if not in practice) that science must not attribute any kind of agency to natural phenomena, and recuperating the historical presence of a competing tradition, we can measure the limits of current scientific discussion, and possibly, think beyond them.”

terror and history

This excellent post by my colleague Philip Jenkins reminds us of an earlier era — just 25 years ago! — when America was worried about right-wing terrorists. As I have often pointed out — see here and here — it’s not just the distant past we’ve forgotten, it’s the very recent past. And that forgetfulness makes it very difficult for us to come up with appropriate and proportionate responses to our current problems.

chronological snobbery

The novelist Hannah Beckerman was asked, “I’m an English lit postgraduate who’s slipped into a reading rut since my final exams – what are some good books to get me back into loving literature?” Here are the first publication dates of the books she recommended:

  • 2006
  • 2016
  • 2013
  • 2010
  • 2014
  • 2015
  • 2015
  • 1995
  • 1997
  • 2000
  • 2002
  • 2017
  • 1959–1994 (Paley’s stories)
  • 1937
  • 2017
  • 2019
  • 1988
  • 1926
  • 1989
  • 1999
  • 2015

Also, all of them are written in English and by people from England, Scotland, Ireland, and the USA. The two major temporal outliers (1937 and 1926) are both children’s books, and there are no adults-only (or -primarily) novels from before 2002.

Is it really likely that all the books that might be recommended to someone who wants to “get … back into loving literature” are from our culture, our language, our time? And that none of them are poems or plays?

a road not taken

Lately I have been reading some of the wartime letters of Dorothy Sayers — who, I have just learned, pronounced her name to rhyme with “stairs” — and have been constantly reminded of something that I wrote about a bit in my Year of Our Lord 1943: the complex network, centered of course in London, of Christians working outside of standard ecclesiastical channels to bring a vibrant Christian faith before the minds of the people of England in the midst of war. People like J. H. Oldham and Philip Mairet and, perhaps above all, James Welch of the BBC — who convinced Dorothy Sayers to write the radio plays that came to be called The Man Born to be King, recruited C. S. Lewis to give the broadcast talks that became Mere Christianity, and commissioned music from Ralph Vaughan Williams — ended up having an impact on the public face of English Christianity that was enormous but is now almost completely unknown.

At one point in researching my book I thought seriously about throwing out my plans and writing this story instead — but I couldn’t bear to let go of the fascinating interplay between ideas being articulated in England and their close siblings arising in the U.S., especially in New York City.

I can’t remember whether I’ve mentioned it here before — a quick search suggests not — but I have long dreamed of writing a book called Christian London: a history of the distinctive and often profoundly influential role that London has played in the history of Christianity. However, no one I have spoken to about this project — my agent, various editors, friends — has shared my enthusiasm. I might write it one day anyhow, and if I do, people like Oldham and Mairet and Welch will be major characters in one chapter.

the most literary decade

Popular radio shows featured literary critics talking about recent poetry and fiction. The New Yorker’s book-review editor hosted one of the most popular radio shows in America, and his anthology, Reading I’ve Liked, ranked seventh on the bestseller list for nonfiction in 1941. At the Democratic Primary Convention in 1948, F. O. Matthiessen, a professor of American literature, delivered a nominating speech for Henry Wallace, the late FDR’s vice president. Writers were celebrities. Literature was popular. The 1940s was the most intensely literary decade in American history, perhaps in world history. Books symbolized freedom.

Posters of 1942 quoted the president: “Books cannot be killed by fire. People die, but books never die. No man and no force can put thought in a concentration camp forever. No man and no force can take from the world the books that embody man’s eternal fight against tyranny. In this war, we know, books are weapons.” During the Blitz, Muriel Rukeyser recalled, “newspapers in America carried full-page advertisements for The Oxford Book of English Verse, announced as ‘all that is imperishable of England.’ ” For the first and only time in history, protecting books in war zones became an official aim of armed forces.

— George Hutchinson, Facing the Abyss: American Literature and Culture in the 1940s. Compare the argument I made in my essay “The Watchmen.”

the greatest of the Wedgwoods

C. V. Wedgwood (National Portrait Gallery)

Cicely Veronica Wedgwood (1910–1997) was the most distinguished of the Wedgwoods — with the possible exception of her great-great-great grandfather Josiah. In the estimation of Anthony Grafton — himself one of the most distinguished historians of our time — she is “the greatest narrative historian of the twentieth century.” And that counts for a lot, in my book.

By the adjective “narrative” Grafton means to distinguish Wedgwood’s way of writing history from what has become the standard academic one, which is less concerned with telling a story than providing an analytical and often data-driven framework for understanding historical events and patterns. That model has become sufficiently dominant that most of us these days think of the writing of history as something very different than the writing of “literature,” but ’twas not always so. When Gibbon wrote his great Decline and Fall of the Roman Empire he was understood to be just as literary as Alexander Pope or Henry Fielding (probably more so than Fielding). In the next century Lord Macaulay’s History of England was every bit as literary as his Lays of Ancient Rome. George Steiner, writing decades ago, understood that Wedgwood was working in this tradition, and celebrates it even as he denounces what has replaced it:

The ambitions of scientific rigour and prophecy have seduced much historical writing from its veritable nature, which is art. Much of what passes for history at present is scarcely literate…. The illusion of science and the fashions of the academic tend to transform the young historian into a ferret gnawing at the minute fact or figure. He dwells in footnotes and writes monographs in as illiterate a style as possible to demonstrate the scientific bias of his craft. One of the few contemporary historians prepared to defend openly the poetic nature of all historical imagining is C. V. Wedgwood. She fully concedes that all style brings with it the possibility of distortion: ‘There is no literary style which may not at some point take away something from the ascertainable outline of truth, which it is the task of scholarship to excavate and re-establish.’ But where such excavation abandons style altogether, or harbours the illusion of impartial exactitude, it will light only on dust.

“The poetic nature of all historical imagining” — preach it, sir! But few historians today, even those rare birds who make an effort to tell a good story, can hold a candle to Wedgwood. Peter Ackroyd, for instance, who has shifted from being primarily a novelist to being primarily a historian, presumably because of his skills at narration, is a mechanical plodder in comparison to Wedgwood. (And he’s getting worse. His ongoing history of England is coma-inducing.)

Wedgwood didn’t describe what she did as more literary or artful than the work of academic historians, even though, of course, it is. Here’s how she understood her work:

The application of modern methods of research, together with modern knowledge and prejudice, can make the past merely the subject of our own analytical ingenuity or our own illusions. With scholarly precision we can build up theories as to why and how things happened which are convincing to us, which may even be true, but which those who lived through the epoch would neither recognize nor accept. It is legitimate for the historian to pierce the surface and bring to light motives and influences not known at the time; but it is equally legitimate to accept the motives and explanations which satisfied contemporaries…. This book is not a defence of one side [in the English Civil War] or the other, not an economic analysis, not a social study; it is an attempt to understand how these men felt and why, in their own estimation, they acted as they did.

That from the Introduction to The King’s Peace, the first volume of her history of the fall of King Charles I. This interest in representing people and events by employing categories that they themselves would have recognized may be seen in this brilliant brief portrait of Charles:

He had never had the painful experience from which his father, as a young man, had learned so much; he had never confronted insolent opponents face-to-face and had the worst of the argument. No national danger had compelled him to go out among his people and share their perils. He was, at this time, not only the most formal but the most remote and sheltered of all European kings.

Less virtuous monarchs escaped from formality in the arms of lowborn mistresses, but for the chaste Charles, no Nell Gwynne, prattling cockney anecdotes, opened a window into the lives of his humbler subjects. Like many shy, meticulous men, he was fond of aphorisms, and would write in the margins of books, in a delicate, beautiful, deliberate script, such maxims as “Few great talkers are great doers” or “None but cowards are cruel.” He trusted more to such distilled and bottled essence of other men’s wisdom than to his own experience, which was, in truth, limited; his daily contact with the world was confined within the artificial circle of his Court and the hunting field. He was to say, much later, in tragic circumstances, that he knew as much law as any gentleman in England. It was true; but he had little conception of what the laws meant to those who lived under them.

That last sentence is simply devastating. (Whenever Wedgwood pauses in her narrative for a character sketch of one of her actors, you are always in for something superb.)

Wedgwood pauses from time to time in The King’s Peace, especially in its early pages, to sketch not personalities but social orders and patterns and beliefs, including the experiences of those very people whom Charles never knew. Here’s a sample:

In the wilds of Lochaber from time to time a green man could be seen, one that had been killed between daylight and starlight and belonged neither to earth nor heaven. Some thought these apparitions were only the unhappy dead but others thought them one of the many forms taken by the devil. The strong forces of nature, with the advent of Christianity, had become confused with the devil, and after the lapse of centuries witchcraft had become indivisibly compact of pagan and Christian beliefs. The devil, in many forms, bestrode the islands from end to end. Sometimes he was “a proper gentleman with a laced band,” as when he came to Elizabeth Clarke at Chelmsford; at other times you might know him, as Rebecca Jones of St. Osyth did, by his great glaring eyes. He was cold and sensual and rather mean: he offered Priscilla Collit of Dunwich only ten shillings for her immortal soul; she gave it to him and off he went without paying. Respectably dressed in “brown clothes and a little black hat,” he spoke in friendly terms to Margaret Duchill of Alloa in Scotland; “Maggie, will you be my servant?” he asked, and when she agreed he told her to call him John and gave her five shillings and powers of life and death over her neighbours.

I could quote paragraph after paragraph, page after page, of this kind of thing. And her ability to immerse you in a battle, or a court scene, or a back-room political argument, is unexcelled. Moreover, all this is based (as Grafton notes) on exceptionally thorough archival research, sometimes in multiple languages. I am writing a book that attempts to convince people that the past can be, and should be, a living world to us. That case would be much easier to make if there were more historians like C. V. Wedgwood.

She should be remembered as one of the best storytellers writing in English in the twentieth century. And yet her major works — with the sole exception of the early but masterful history of the Thirty Years War — are out of print. That is a travesty.

Editors! Publishers! Let’s redress this injustice. If you need someone to edit and/or write an introduction to a new edition of Wedgwood’s work, just call on me. I am here for you.

And readers! You can of course start with The Thirty Years War, which is as good as advertised. But I prefer her work on English history. For those, AbeBooks is your friend. If you’re not sure you want to commit to one of the big books, her third volume on the English Civil War, A Coffin for King Charles, works perfectly well as an independent story and comes in at around 250 pages. It is absolutely riveting.

this is your mind on presentism

As a person writing a book about the need to cultivate temporal bandwidth, I am so pleased when various prominent cultural outlets do advance publicity on my behalf. Consider for instance this piece in the New Yorker on the decline in the study of history:

“Yes, we have a responsibility to train for the world of employment, but are we educating for life, and without historical knowledge you are not ready for life,” Blight told me. As our political discourse is increasingly dominated by sources who care nothing for truth or credibility, we come closer and closer to the situation that Walter Lippmann warned about a century ago, in his seminal “Liberty and the News.” “Men who have lost their grip upon the relevant facts of their environment are the inevitable victims of agitation and propaganda. The quack, the charlatan, the jingo … can flourish only where the audience is deprived of independent access to information,” he wrote. A nation whose citizens have no knowledge of history is asking to be led by quacks, charlatans, and jingos. As he has proved ever since he rode to political prominence on the lie of Barack Obama’s birthplace, Trump is all three. And, without more history majors, we are doomed to repeat him.

I would give a big Amen to this but with one caveat: it’s not more history majors we need, it’s a more general, more widespread, acquaintance with history. Without that we are fully at the mercy of our now-habitual and increasingly tyrannical presentism.

Consider, as an exemplum, this Farhad Manjoo column, in which he deplores the “prison” of being referred to by gendered pronouns. Damon Linker’s response zeroes in on a key point:

But what is this freedom that Manjoo and so many others suddenly crave for themselves and their children? That’s more than a little mysterious. Slaves everywhere presumably know that they are unfree, even if they accept the legitimacy of the system and the master that keeps them enslaved. But what is this bondage we couldn’t even begin to perceive in 2009 that in under a decade has become a burden so onerous that it produces a demand for the overturning of well-settled rules and assumptions, some of which (“the gender binary”) go all the way back to the earliest origins of human civilization?

I think Linker could have, with equal appositeness, referred to 2014: If you got in a time machine and showed the Farhad Manjoo of 2014 a copy of his 2019 column, he almost certainly would not believe that he had written it. A stance that in 2014 was so uncontroversial that it didn’t rise to the level of consciousness — that it’s okay for us to refer to ourselves by gendered pronouns — is now the unmistakable sign of “a ubiquitous prison for the mind.” And yet so thoroughly is Manjoo immersed in the imperatives of the moment that he’s not even aware of the discontinuity. That is the real prison for the mind.

James Madison

Last week when I was in Virginia I got to visit James Madison’s home Montpelier. Madison has long been my favorite of the Founders, but during my visit I realized that I had never read a complete biography of him. I have now remedied that by reading Richard Brookhiser’s concise and vigorous narrative, and I am moved to contemplate the extraordinary success that Madison had at guiding groups of politicians towards his preferred ends. Though he spent eight years as President and, before that, eight years as Jefferson’s Secretary of State, he belonged by temperament and character to the legislative rather than the executive branch. He was an unprepossessing figure, at just over five feet tall and a hundred pounds, and had a weak voice, but no greater committee man has ever lived. In a later era he would surely have been the greatest of American Senators.

It seems to me that there are three traits that, in combination, set Madison apart from his contemporaries and from almost every leading political figure before or since.

First, he simply worked harder than anyone else. When he was chosen as a delegate to the Constitutional Convention he arrived several days early to scope out the area and make relevant connections. Unlike many other delegates who came and went — some took lengthy vacations from the proceedings when the weather got hot — he was there every damned day: he arrived early, got a choice seat, and took incredibly extensive shorthand notes documenting every single thing that happened in each of those meetings. (He would even check with other delegates to make sure that he had taken down their words accurately. This gave him a well-earned reputation for scrupulousness, which he later made good use of: he would always quote with absolute faithfulness from his notes — but was also shrewdly selective in what he chose to share.)

Second, Madison made himself the best-informed person at every meeting. Even people who hated Madison acknowledged that he always had more information at his disposal than anyone else. Long before the Convention began he wrote to Jefferson, who was in Paris, to ask him for books on government and political history. Jefferson sent two hundred volumes, which Madison devoted months to reading, annotating, and sifting. This was simply characteristic.

Third, he didn’t care who got credit. Madison was happy to let other people stand up to make noble speeches on behalf of some cause that he advocated, and to receive great applause — as long as he determined the content of those speeches.

These are all lessons worth learning, it seems to me.

Finally, I was taken by this passage from the end of Brookhiser’s biography:

Madison lies in the family cemetery, a five-minute walk from the front door of Montpelier; the graveyard was more convenient to the original house on the property, which the Madisons vacated when he was a boy. His grave is in a corner of the plot, marked by an obelisk; the shaft surmounts a blocky base, simply inscribed MADISON, along with his dates.

When it was first shown to me, I learned that the stone was not contemporary with his burial, but had been put up in 1857, twenty-one years later. What was his original marker, I asked. There was none, I was told; your marker was your family plot. Your dead relatives indicated who you were, and your living ones would remember where you were.

futurists and historians

Martin E. P. Seligman and John Tierney:

What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future. Our singular foresight created civilization and sustains society. It usually lifts our spirits, but it’s also the source of most depression and anxiety, whether we’re evaluating our own lives or worrying about the nation. Other animals have springtime rituals for educating the young, but only we subject them to “commencement” speeches grandly informing them that today is the first day of the rest of their lives.

A more apt name for our species would be Homo prospectus, because we thrive by considering our prospects. The power of prospection is what makes us wise. Looking into the future, consciously and unconsciously, is a central function of our large brain, as psychologists and neuroscientists have discovered — rather belatedly, because for the past century most researchers have assumed that we’re prisoners of the past and the present.

I wonder what evidence exists for the claim that “What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future.” What if we are more clever and resourceful readers of the past than other species? What if it was our singular power of retrospection that “created civilization and sustains society”? After all, while it’s true that we homo sapiens alone give commencement speeches, it’s also true that we homo sapiens alone build things like the Lincoln Memorial and inter our distinguished dead in places like Westminster Abbey. Why should the former count for more than the latter? 

In the preface to his translation of Thucydides (1629), Thomas Hobbes wrote that “the principal and proper work of history [is] to instruct and enable men, by the knowledge of actions past, to bear themselves prudently in the present and providently towards the future.” That is to say, validity of prospection depends upon accuracy of retrospection. Those who do not understand the past will not prepare themselves well for the future. Even if they read fizzy opinion pieces in the New York Times.

Roberts’s Churchill

All my adult life I have had a strong appetite for books about Winston Churchill. It began, I suppose, when I read the first volume of William Manchester’s biography, which was frankly hagiographical but vividly told. I have read much since then, including much work highly critical of the man. I can readily understand people disliking or even hating Churchill; I could never understand someone who doesn’t find him fascinating. So, unsurprisingly, I have very much looked forward to the new biography by Andrew Roberts.

I haven’t finished it yet — so much else to do right now — but I think it’s fair to say that like Manchester’s book it is a very strong narrative, and like Manchester’s, it is largely hagiographical. I have been particularly struck by Roberts’s approach to Churchill’s eventful experience in the Great War: his chief principle seems to be that Churchill’s judgment may be questioned or even condemned, but that his character must be vindicated at all costs. Roberts accepts ungrudgingly Churchill’s many mistakes in advocating for and overseeing the Gallipoli campaign, but he will not countenance the suggestion that Churchill fell into those mistakes because of his character flaws. Personality flaws, perhaps — impetuousness, for instance — but not flaws of moral character.

Consider this passage:

Lloyd George needed no persuasion to throw over his old friend and ally. The price of the Conservatives joining a national government was that Churchill should be sent to a sinecure post with no executive portfolio attached. ‘It is the Nemesis of the man who has fought for this war for years,’ Lloyd George told Frances Stevenson that day. ‘When the war came he saw in it the chance of glory for himself, and has accordingly entered on a risky campaign without caring a straw for the misery and hardship it would bring to thousands, in the hope that he would be the outstanding man in this war.’ There was bitterness and jealousy in that remark, but little factual accuracy.

But Roberts quotes Churchill on many occasions not only expressing excitement for the war, but demonstrating constant awareness of how his performance in it could pave the way for him to become Prime Minister. He told Violet Asquith at a dinner party in early 1915, “I know a curse should rest on me – because I love this war. I know it’s smashing and shattering the lives of thousands every moment – and yet – I can’t help it – I enjoy every second of it.” A man who could not only think this but say it out loud is a man who might legitimately be suspected of pursuing “a risky campaign without caring a straw for the misery and hardship it would bring to thousands.” He said himself that that suffering didn’t disrupt his delight in the war even for a second.

Similarly, Roberts quotes the Prince of Wales — the future King Edward VIII and, after his abdication, Duke of Windsor — saying, “It is a great relief to know that Winston is leaving the Admiralty … one does feel that he launches the country on mad ventures which are fearfully expensive both as regards men and munitions and which don’t attain their object.” To which Roberts comments, “It was for this feckless young man that Churchill would later nearly sacrifice his career.” No doubt the Prince was feckless, and would become still more so, but to describe Gallipoli as a “fearfully expensive” venture “both as regards men and munitions” that did not attain its object is to state the simple, inescapable truth.

One more example of Roberts’s habitual attitude: when the government of the Prime Minister Herbert Asquith was on the verge of collapse, and some kind of coalition had to be formed, the other parties to the coalition made it clear that they would not participate unless Churchill were sacked as First Lord of the Admiralty. Churchill wrote to Asquith to plead that he be kept on anyway, to which Asquith replied, “You must take it as settled that you are not to remain at the Admiralty … I hope to retain your services as a member of the new Cabinet, being, as I am, sincerely grateful for the splendid work you have done both before and since the war.” Roberts’s comment on that letter: “Asquith could treat Churchill harshly partly because the First Lord had so few supporters.” Treat him harshly? In no circumstances could Asquith have kept Churchill on — that had been made perfectly clear to him — so he is refusing to hold out any false hope, but nevertheless offering to keep Churchill in his Cabinet, which the other members of the coalition did not want. Asquith could do that much for Churchill only by spending some valuable political capital, and it would not have been “harsh” had Asquith banished Churchill from the government altogether.

Churchill’s attitude towards the loss of his position? “My conviction that the greatest of my work is still to be done is strong within me: and I ride reposefully along the gale. The hour of Asquith’s punishment and K[itchener]’s exposure draws nearer. The wretched men have nearly wrecked our chances. It may fall to me to strike the blow. I shall do it without compunction.”

Again, Roberts admits that Churchill makes mistakes; but he prefers to identify those mistakes himself, and when he quotes any criticism of Churchill from contemporaries, he tends to (a) dismiss it and (b) indicate his low opinion of the character of the critic. And in all this he faithfully reflects Churchill’s own interpretation of his role in the Great War.

As for me, I am inclined to agree with the journalist A. G. Gardiner, who wrote that Churchill “is always unconsciously playing a part – an heroic part. And he is himself his most astonished spectator.”

Plutarch and the end of the oracles

In my History of Disenchantment class, we’ve been discussing Plutarch’s essay on the cessation or the silence or the failure of the oracles. (The key word there, ekleloipoton, seems to be an odd one — I’m trying to learn more about it.) I read it long ago, but this is the first time I’ve taught it, and goodness, what a fascinating piece of work.

It was widely recognized in Plutarch’s time (late first and early second century A.D.) that the great oracles of the ancient world — the most famous of them being the one at Delphi, of course — had largely ceased to provide useful guidance or had fallen silent altogether. Some of the once famous shrines had been abandoned and had fallen into ruin. But no one understood why this had happened. Plutarch’s “essay” is a fictional dialogue — narrated by one Lamprias, who also takes the leading role in the conversation and may well be Plutarch’s mouthpiece — in which a group of philosophically-inclined men debate the possible reasons for the oracles’ failure.

In the opening pages of the dialogue, some of the participants deny that there is any real problem. They point to the inaccessibility of some of the shrines, and the lack of population in the surrounding areas: for them the issue is merely one of low demand leading to low supply. But this view is not widely accepted; most of the philosophers are uneasy about the oracles and feel that something is up. And after all, if low demand leads to low supply, why is there low demand? Even if the oracles are located in remote places, surely people would take the trouble to make a pilgrimage there if they believed that by doing so they could receive wise guidance for their lives.

To one of the participants, the answer to the whole problem is obvious: The gods are angry at us for our wickedness and have punitively withdrawn their guidance. But, perhaps surprisingly, no one finds this a compelling explanation. For one thing, it’s not clear to them there was any less wickedness in the earlier eras when the oracles flourished; and more to the point, are oracles given by the gods in the first place?

If they are, then shouldn’t they continue forever, since the gods themselves are immortal, unless they are specifically withdrawn? Not necessarily, says one: “the gods do not die, but their gifts do” — a line he says is from Sophocles, though I don’t know its source. Maybe the oracles lived their natural course and have now fallen silent, as one day we all will.

But what if oracles do not come from the gods, but rather from daimons? In that case the oracles might die because the daimons do. This leads to a long discussion about whether daimons are mortal, and if mortal or not whether they are necessarily good. (The one truly famous passage from this essay — in which someone recounts a story about a sailor instructed to pass by an island and cry out “The great god Pan is dead” — assumes that Pan was not a god but rather a daimon, the son of Hermes by Odysseus’s famously loyal wife Penelope.)

And this in turn leads to a very long conversation about the beings that populate the world and whether there might be other worlds populated differently and, now that we think about it, how many worlds are there anyway? (The most popular answer among the discussants: 185.) As I told my students, this is by modern standards a bizarre digression, especially since it takes up about half the dialogue, but our standards were not those of Plutarch’s time; and in any case the discussants might plausibly say that we can’t come up with a reliable solution to the puzzle of the silenced oracles unless we have a good general understanding of the kind of cosmos we live in.

In any event, the discussion eventually circles back around to the initial question, and in the final pages Lamprias gets the chance to develop the argument that he has been hinting at all along. In brief, he contends that oracles are always situated in or near caves because from those caves issue “exhalations of the earth”; and that certain people with natural gifts and excellent training of those gifts may be sensitized to the character of those exhalations, and in that way come to some intuitive and not-easily-verbalized awareness of what the world has in store for people. It’s almost a Gaia hypothesis, this idea that the world as a whole acts in certain fixed ways, and those “exhalations” attest to the more general movements of the planet. But these processes are, like all processes in Nature, subject to change over time. As a spring might dry up, or a river after flooding alter its course, so too the conditions for such exhalations might change so that there is nothing for even the most exquisitely sensitive and perfectly trained priestess to respond to.

The first and overwhelming response to Lamprias’s explanation is: Impiety! One of the interlocutors comments that first we rejected the gods in favor of daimons, and now we’re rejecting daimons in favor of a purely natural process. That is, Lamprias’s position is fundamentally disenchanting. To this Lamprias replies that his position is not impious at all, because they had all agreed earlier that in addition to humans and daimons and gods, none of whom create anything, we also have, above and beyond all, The God, “the Lord and Father of All,” and He is the first cause of all things, including exhalations of the earth and priestesses.

But whether it’s impious or not, Lamprias’s account is disenchanting, because it removes power from spirits and gods and concentrates it in a single transcendent Monad. His monotheism is a big step towards the religion of Israel, which tells us in the very first words of its Scriptures that the sun and moon and stars are not deities at all, but rather things made by YHWH, who alone merits our worship. Lamprias’s position, like that of the Jews, looks like a kind of atheism to those accustomed to polytheism. And by their standards that’s just what it is.

how Steve Reich discovered his own Judaism

I was brought up a secular, Reform Jew, which means I didn’t know Aleph from Bet. I knew nothing, and therefore I cared nothing. My father cared culturally, but that’s all. So when I came home from Africa, I thought to myself, there’s this incredible oral tradition in Ghana, passed on from father to son, mother to daughter, for thousands of years. Don’t I have something like that? I’m a member of the oldest group of human beings still known as a group that managed to cohere enough to survive – and I know nothing about it. So I started studying at Lincoln Square Synagogue in midtown Manhattan, an Orthodox temple, that had an incredible adult-education program for the likes of me – and I asked whether they would teach a course in biblical Hebrew, and they said sure, and they brought a professor down from Yeshiva University to teach that, and I studied the weekly portion – I didn’t even know there was such a thing as a weekly portion and commentaries thereon.

So this whole world opened up for me – it was 1975, at about the same time as I met my wife, Beryl, and so all of this sort of came together and it did occur to me – isn’t it curious that I had to go to Ghana to go back to my own traditions because I think if you understand any historical group, or any other religion for that matter, in any detail, then you’ll be able to approach another one with more understanding. So the answer to your question is yes. The longest yes you’ve ever heard.

Steve Reich

enough with the “Cultural Marxism” already

Alexander Zubatov tries to rescue the term “Cultural Marxism” in this post, and I don’t think he succeeds. Now, he might succeed in defending some users of the term from some of the charges Samuel Moyn makes here, but that’s a different matter (and one I won’t take up here). I simply want to argue that the term ought to be abandoned.

Here’s Zubatov’s definition:

So what is cultural Marxism? In brief, it is a belief that cultural productions (books, institutions, etc.) and ideas are emanations of underlying power structures, so we must scrutinize and judge all culture and ideas based on their relation to power.

The problem here, put as succinctly as I can put it, is that you can take this view of culture without being a Marxist, and you can be a Marxist without taking this view of culture.

Taking the latter point first: in The German Ideology Marx and Engels state quite clearly that there is a relationship between the art that is produced at a given time and place and the overall character of that time and place:

Raphael’s work of art depended on the flourishing of Rome at that time, which occurred under Florentine influence, while the works of Leonardo depended on the state of things in Florence, and the works of Titian, at a later period, depended on the totally different development of Venice. Raphael as much as any other artist was determined by the technical advances in art made before him, by the organization of society and the division of labour in his locality, and, finally, by the division of labour in all countries with which his locality had intercourse. Whether an individual like Raphael succeeds in developing his talent depends wholly on demand, which in turn depends on the division of labour and the conditions of human culture resulting from it.

But notice that there is a lot more going on here than “underlying power structures.” Marx and Engels are making a rather commonsensical point, which is that the history and social organization of Florence meant that the work produced in it would (of course!) be different than the work produced at the same time in a society such as that of Venice. They are implying that those who really want to understand a work of art will make themselves familiar with the wide range of circumstances that form a given culture, only some of which are political or economic. Certain “technical advances in art” — the use of varying pigments, for instance — might arise in a given place less because of the economic conditions than because of a particular artist’s ingenuity.

To be sure, many later Marxists would get highly agitated by Marx and Engels’s use of the word “determined” in the passage just quoted. But a decade after The German Ideology, in an appendix to A Contribution to the Critique of Political Economy, Marx would clarify this point:

As regards art, it is well known that some of its peaks by no means correspond to the general development of society, nor do they therefore to the material substructure, the skeleton as it were of its organisation. For example the Greeks compared with modern [nations] …. It is even acknowledged that certain branches of art, e.g., the epic, can no longer be produced in their epoch-making classic form after artistic production as such has begun; in other words that certain important artistic formations are only possible at an early stage in the development of art itself.

Marx believed that capitalism was an advance over feudalism, which was in turn an advance over more primitive forms of political organization; that did not mean that he thought Benjamin Disraeli a superior writer to Homer — or that you could explain Homer’s greatness by invoking the politics of his world. But if you want to understand a given work of art you need to pay attention to what Marx and Engels habitually called its “material conditions.”

So, if we grant that Marx and Engels are Marxists, we must then conclude that Marxists do not necessarily believe that “cultural productions (books, institutions, etc.) and ideas are emanations of underlying power structures.” And many later Marxists, including some of the ones Zubatov quotes, go even further in separating the superstructure of cultural production from the economic base. (Georg Lukács, for instance, was taken to task by Bertolt Brecht for writing criticism insufficiently attentive to the base and therefore to the revolutionary imperative: “It is the element of capitulation, of withdrawal, of utopian idealism which still lurks in Lukács’s essays and which he will undoubtedly overcome, that makes his work, which otherwise contains so much of value, unsatisfactory; for it gives the impression that what concerns him is enjoyment alone, not struggle, a way of escape, rather than a march forward.”)

Moreover, the one figure who did the most to consolidate the idea that all culture is deeply implicated in the “underlying power structures,” the power-knowledge regime, is Michel Foucault, and there has always been the suspicion on the academic Left that Foucault is actually a conservative.

It is equally clear that one can believe that an advocate “for the persecuted and oppressed must attack forms of culture that reinscribe the values of the ruling class, and disseminate culture and ideas that support ‘oppressed’ groups and ‘progressive’ causes,” without endorsing any of the core principles of Marx’s system. (There are forms of conservatism and Christianity that are as fiercely critical of the ruling class as any Marxist, while having no time for dialectical materialism or communism.)

I am not convinced by Moyn’s claim that there is something strongly antisemitic about the contemporary use of the term “Cultural Marxism” — though I’d be interested to hear him develop that argument at greater length. I tend to see the term deployed in the classic Red Scare mode of the McCarthy era: in some circles, now as then, there’s no quicker and easier way to discredit an idea than to call its proponents Commies. And that’s the work that the “Marxism” half of “cultural Marxism” does.

totalitarian presentism

Senator Ben Sasse doesn’t read modern fiction, only old books, and people on social media are getting seriously freaked out.

Let’s stop and think about this. Sasse’s day job requires him to spend dozens of hours a week immersed in the affairs of the moment. When he turns on his TV: affairs of the moment. When he ferries people around as an Uber driver, he hears about people’s take on the affairs of the moment. When he listens to the radio: affairs of the moment. When he’s on social media: affairs of the moment. But if, in his leisure hours, he wants to read old books, he’s THE WORST. He has committed the unpardonable sin.

It is not enough for many people that they be so utterly presentist in their sensibility that their temporal bandwidth is a nanometer wide. Everyone must share their obsession with the instant. No one may look to other times. It’s not just presentism, it’s totalitarian presentism.

Yeah, I really do need to write this book.

Tim Larsen on John Stuart Mill

My friend Tim Larsen has written an absolutely fascinating brief biography of John Stuart Mill. (It appears in the Oxford Spiritual Lives series, of which Tim is also the editor.) Everyone knows that Mill had little time for or interest in religion, and that his father James Mill, aide-de-camp to that great enemy of faith Jeremy Bentham, was even more hostile than JSM himself. Given JSM’s secular and rationalist upbringing, it cannot be surprising that, as he put it, “I looked upon the modern exactly as I did upon the ancient religion, as something which in no way concerned me. It did not seem to me more strange that English people should believe what I did not, than that the men I read of in Herodotus should have done so.”

However, as Tim shows convincingly, this statement is seriously misleading to the point of being simply untrue. The same must be said for the familiar family story. James Mill was a licensed preacher and had turned to the life of a writer and public intellectual only after failing to get the kind of pastoral position he thought he was qualified for — a fact he carefully hid from his children — and probably didn’t become a complete unbeliever until he was in his mid-forties, by which point JSM was already a ten-year-old, or older, prodigy. James’s wife Harriet was a Christian, each of their nine children was baptized in the Church of England, and probably only two of them (JSM and his youngest sibling George Grote Mill) departed in any significant way from standard-issue Victorian religion. JSM grew up learning not only the Bible but the worship of the Church of England. Nothing about that religion was strange to him.

Moreover, Tim also demonstrates that throughout the course of his life JSM held to a minimal but stable theology, which he summarizes thus:

Even from a scientific point of view and without any intuitive sense or direct experience of the divine, there is still sufficient evidence to warrant the conclusion that God probably exists; God is good, but not omnipotent; the life and sayings of Christ are admirable and deserve our reverence; the immortality of the soul or an afterlife are possible but not certain; humanity’s task is to co-labour with God to subdue evil and make the world a better place; the affirmations in the previous points such as that God exists and is good cannot be proven, but it is still a reasonable act to appropriate these religious convictions imaginatively on the basis of hope.

(One of the most interesting parts of Tim’s book is his exploration of the potent role “hope” played in Mill’s moral and intellectual lexicon.) There is much more that I might comment on, especially Mill’s alliances with evangelical Christians in his campaign for the rights of women — his rationalist friends were generally cold to this idea — and the interestingly varied religious beliefs of his family, but I will just strongly suggest that you read the book. Its subtitle is “A Secular Life,” and JSM’s life was indeed secular, but not in a modern sense. As Tim puts it, in the Victorian era, even for atheists “the sea of faith was full and all around.”

how change happens

There’s a general principle that underlies yesterday’s posts about the Catholic Church’s current problems, and it’s this: The more time people spend on social media, the more prone they become to recency bias, and especially the form of recency bias that inclines us to believe that what has just happened is far more important than it really is.

Everyone everywhere is prone to recency bias, but I think we are more prone to it than any society in history because our media are so attentive to the events of Now, and we are so immersed in those media that anything that happened more than a week or so ago is consigned to the dustbin of history. The big social-media companies function as what I have called the Ministry of Amnesia, and the result is that we lack temporal bandwidth. Unless we work hard to cultivate that temporal bandwidth, we won’t have the “personal density” to resist the amnesia-producing forces that make us think that whatever happens today is more important than anything that has ever happened.

Increasingly, I think, the people who rule our society understand how all this works, and no one understands it better than Donald Trump. Trump knows perfectly well that his audience’s attachment to the immediate is so great that he can make virtually any scandal disappear from the public mind with three or four tweets. And the very journalists who most want to hold Trump accountable are also the most vulnerable to his changing of the subject. He’s got them on a string. They cannot resist the tweets du jour.

This tyranny of the immediate has two major effects on our political judgment. First, it disables us from making accurate assessments of threats and dangers. We may, for instance, think that we live in a time of uniquely poisonous social mistrust and open hostility, but that’s only because we have forgotten what the Sixties and early Seventies were like.

Second, it inclines us to forget that the greatest of social changes tend to happen, as Edward Gibbon put it, insensibly. Even when they seem sudden, it is almost always the case that the suddenness is merely a very long gradual transformation finally bearing fruit. There’s a famous moment in Hemingway’s The Sun Also Rises when one character asks another how he went bankrupt. “Two ways,” the man replies. “Gradually and then suddenly.” But the “suddenly” happened because he was previously insensible to the “gradually.” Likewise, events are always appearing to us with extreme suddenness — but only because we are so amnesiac that we have failed to discern the long slow gradual forces that made this moment inevitable.

And so we float on, boats with the current, borne forward ceaselessly into an ever-surprising future.


I write these words on the day following the death of Senator John McCain, and I find myself thinking about one particular moment in McCain’s long and varied career. When he was a prisoner of the North Vietnamese, having been so deprived of food that his weight dropped to 105 pounds, having been beaten regularly, having been thrown into solitary confinement, he one day found himself offered the possibility of freedom. His father had recently been named Commander of the U.S. forces in the Vietnamese theater, and his captors understood that releasing the young man would likely be a public-relations boon for them, an open declaration of their magnanimity. McCain refused — unless the men who had been captured with him were also released with him. This demand went unmet, and McCain not only remained in prison, he was also subjected to intensified beatings and torture so severe that for the rest of his life he would be unable to lift his arms over his head (as was often noted, he always had to have someone around to comb his hair, because he couldn’t). He waited another five years for his release.

All of which leads me to a thought, and a question. Several decades after McCain’s torture a man who had taken great care to avoid serving in the military, and who indeed had never done an hour’s work of public service, denied that McCain was a hero, mocked him for having been captured, cited McCain’s experience as proof that torture “works” — and in spite of such monstrous perversion of spirit was elected President of the United States with the overwhelming support of the very people most likely to say that they “support our troops.” How did this become possible?

The answer is: Gradually and then suddenly. We all saw the “suddenly.” But we have not thought nearly as carefully, as rigorously, as we should about the “gradually.” It’s too late to avert Trump, of course. But we damn well better be asking ourselves what else is happening gradually that will spring upon us with shocking suddenness if we don’t develop more temporal bandwidth and personal density. And do it now.

excerpt from my Sent folder: on exhausted languages

What I really am, by vocation and avocation, is a historian of ideas, and when you’ve been a historian of ideas for several decades you’re bound to notice how a certain vocabulary can take over an era — and not always in a good way. Consider for instance the period of over half the 20th century in which Freudian language completely dominated humanistic discourse, despite the fact that it had no empirical support whatever and was about as wrong-headed as it is possible for a body of ideas to be. Some tiny number of people flatly rejected it, a rather larger group enthused over it, and the great majority accepted it as part of their mandatory mental furniture, like having a coffee table or refrigerator in your house. (“It’s what people do, dear.”) Eventually it passed not because it had been discredited — it had never been “credited” in the first place — but because people got tired of it.

This exhaustion of a vocabulary happens more and more quickly now thanks to the takeover of intellectual life by a university committed to novelty in scholarship. But that’s a topic for another day.

Anyway, when you do this kind of work you develop — or you damn well ought to develop — an awareness that many of our vocabularies are evanescent because of their highly limited explanatory power. You see, in a given discipline or topic area, one vocabulary coming on as another fades away, and you don’t expect the new one to last any longer than the previous one did. I think this makes it easier for you to consider the possibility that a whole explanatory language is basically useless. But while those languages last people get profoundly attached to them and are simply unwilling to question them — they become axioms for their users — which means that conversations cease to be conversations but rather turn into endlessly iterated restatements of quasi-religious conviction. “Intersecting monologues,” as Rebecca West said.

Often when I’m grading essays, or talking to my students about their essays, I notice that a certain set of terms are functioning axiomatically for them in ways that impede actual thought. When that happens I will sometimes ask, “How would you describe your position if you couldn’t use that word?” And I try to force the same discipline on myself on those occasions (too rare of course) when I realize that I am allowing a certain set of terms to become an intellectual crutch.

Moreover, I have come to believe that when a conversation gets to the “intersecting monologue” stage, when people are just trotting out the same limited set of terms in every context, that says something about the inadequacy of the vocabulary itself. Not just its users but the vocabulary itself is proving resistant to an encounter with difference and otherness. And that’s a sign that it has lost whatever explanatory power it ever had.

I think that’s where we are in our discourse of gender. And that’s why I am strongly inclined to think that there’s nothing substantial behind that discourse, it’s just a bundle of words with no actual explanatory power. And even if that’s not the case, the only way we can free ourselves from bondage to our terministic axioms is to set them aside and try to describe the phenomena we’re interested in in wholly other terms.

This, by the way, is the origin of all great metaphors, the “metaphors we live by”: the ones that make a permanent mark on culture are the ones that arise from an awareness of how our conventional terms fail us. Those coinages are (often desperate) attempts to throw off the constricting power of those terms. It was when Darwin realized that the explanatory language of natural history had reached a dead end that he coined “natural selection,” a term whose power is so great that it is hard for most people to realize that it is after all a metaphor. Our whole discourse of gender needs Darwins who can’t bear those constrictions any more and decide to live without them. And the first term that should go, as I suggested to you earlier, is “gender” itself.

insensibly

Careful writers of narrative, whether that narrative is fictional or historical or journalistic, will, like composers, work with themes and variations on those themes. For example, consider Edward Gibbon: reading his account of Rome’s decline and fall a few years ago, I noticed his fondness for a particular adverb: insensibly. “It was by such institutions that the nations of the empire insensibly melted away into the Roman name and people.” “We have already seen that the active and successful zeal of the Christians had insensibly diffused them through every province and almost every city of the empire.” “The heart of Theodosius was softened by the tears of beauty; his affections were insensibly engaged by the graces of youth and innocence: the art of Justina managed and directed the impulse of passion; and the celebration of the royal nuptials was the assurance and signal of the civil war.”

Why does Gibbon like this word so much? Is it just a verbal quirk? I think not: rather, it embodies a key theme of the whole history, which is that major transformations in the life of the Roman empire happened slowly, gradually, and without anyone noticing them: people were insensible to the changes, and by the time anyone figured out what had happened, it was too late for a reversal of course. And this insensibility affects political structures, social and religious developments, military cultures, and the hearts of emperors alike; this particular theme has many and wide-ranging variations.

The reader who notices this word, then, notices a vital, not a trivial, point about the story Gibbon tells. 

— Me, in The Pleasures of Reading in an Age of Distraction 

ages of revolution

If a man in the fullness of his days, at the end of his life, can pass on the wisdom of his experience to those who grow up after him; if what he has learned in his youth, added to but not discarded in his maturity, still serves him in his old age and is still worth teaching the then young — then his was not an age of revolution, not counting, of course, abortive revolutions. The world into which his children enter is still his world, not because it is entirely unchanged, but because the changes that did occur were gradual and limited enough for him to absorb them into his initial stock and keep abreast of them. If, however, a man in his advancing years has to turn to his children, or grandchildren, to have them tell him what the present is about; if his own acquired knowledge and understanding no longer avail him; if at the end of his days he finds himself to be obsolete rather than wise — then we may term the rate and scope of change that thus overtook him, “revolutionary.”

— Hans Jonas, from Philosophical Essays (1974), p. 46

memory and invention

In her book The Craft of Thought, Mary Carruthers identifies the purpose of memorization in medieval intellectual culture:

The orator’s “art of memory” was not an art of recitation and reiteration but an art of invention, an art that made it possible for a person to act competently within the “arena” of debate (a favorite commonplace), to respond to interruptions and questions, or to dilate upon the ideas that momentarily occurred to him, without becoming hopelessly distracted, or losing his place in the scheme of his basic speech. That was the elementary good of having an “artificial memory.” …

I repeat: the goal of rhetorical mnemotechnical craft was not to give students a prodigious memory for all the information they might be asked to repeat in an examination, but to give an orator the means and wherewithal to invent his material, both beforehand and — crucially — on the spot. Memoria is most usefully thought of as a compositional art. The arts of memory are among the arts of thinking, especially involved with fostering the qualities we now revere as “imagination” and “creativity.”

Today people will tell you that memorization is the enemy of creativity; these medieval thinkers understood that precisely the opposite is true: the more you have memorized the greater will be your powers of invention.

time machine

I longed for the loan of the Time Machine — a contraption with its saddle and quartz bars that was plainly a glorification of the bicycle. What a waste of this magical vehicle to take it prying into the future, as had the hero of the book! The future, dreariest of prospects! Were I in the saddle I should set the engine Slow Astern. To hover gently back through centuries (not more than thirty of them) would be the most exquisite pleasure of which I can conceive.

— Evelyn Waugh, from the first page of A Little Learning

Crowley, Pynchon, and the hippies

In The Solitudes, the first volume of John Crowley’s Aegypt series, the series’ protagonist Pierce Moffett reflects:

He had the idea that not many children had been conceived in the year of his own conception, most potential fathers being then off to war, only those with special disabilities (like Pierce’s own) being left to breed. He was too young to be a beatnik; later, he would find himself too old, and too strictly reared, to be a success as a hippie.

Pierce was born in 1942 — the same year as John Crowley himself, whom he does not in other obvious respects resemble — which means that he would have been a child when the beatniks emerged, but well into adulthood when the Era of the Hippie began. Pierce was therefore born between two possibilities of rebellion against bourgeois conventionality, possibilities at which he could look longingly but into which he could never fully enter.

Thomas Pynchon is five years older than the fictional Pierce Moffett and the nonfictional John Crowley, but in his introduction to Slow Learner, a collection of his early short stories, he describes precisely the same experience. When he returned to university (Cornell) after spending a couple of intermission years in the Navy, he found that he and his classmates “were at a transition point, a strange post-Beat passage of cultural time…. Unfortunately there were no more primary choices for us to make. We were onlookers: the parade had gone by and we were already getting everything secondhand, consumers of what the media of the time were supplying us.” The Beats had faded, but their commodified leftovers could be picked up at the local five-and-dime — or, by proxy, on TV.

So “when the hippie resurgence came along ten years later,” Pynchon’s primary feeling was one of “nostalgia”: nostalgia for something he never quite had, and at his age at the time (early 30s) could no longer grasp. He was again an onlooker.

For Tom Wolfe (born 1931) in The Electric Kool-Aid Acid Test, observing the hippies’ world produced a mocking disgust; for Joan Didion (born 1934) in Slouching Towards Bethlehem something more like bemused contempt, leavened by occasional moments of gentle envy. But for Thomas Pynchon and Pierce Moffett and, I think, John Crowley the feeling is more wonderment at a vast world of possibility — possibility for the hippies, but not, alas, for those condemned by their age to watch the marvelous parade from the sidelines.

Many readers of both Crowley and Pynchon find their obvious affection for the Sixties hard to swallow, but that’s because we know how it all turned out. The utopian or millenarian hopes of the era — the belief that the Age of Aquarius was being ushered in (Pierce, a historian, keeps telling people that they’re a few hundred years off, though he may be wrong about that) — seem comical in retrospect. Bell-bottom jeans, tie-dyed t-shirts, Wavy Gravy, bathetically pseudo-visionary art by Peter Max…. And whatever elements of it tapped into something genuinely powerful — well, there were people who knew how to commodify that and to do so more thoroughly than those hidden persuaders of the Fifties could have dreamed of. (A point to which I shall return.)

But those bedazzled onlookers like Pynchon and Crowley didn’t know that at the time. It could have worked out differently … could it not?

The great theme of Crowley’s Aegypt is: “There is more than one history of the world.” The past, as well as the future, is a garden of forking paths. Here’s how Pierce explains that theme, about which he hopes to write:

“It’s as though,” he said, “as though there had once upon a time been a wholly different world, which worked in a way we can’t imagine; a complete world, with all its own histories, physical laws, sciences to describe it, etymologies, correspondences. And then came a big change in all of them, a big change, bound up with printing, and the discoveries of Copernicus and Kepler, and the Cartesian and Baconian ideals of mechanistic and experimental science. The new sciences were hugely successful; bit by bit they scrubbed away all the persisting structures of the old science; they even scrubbed away the actually very strange and magical way the world appeared to men like Kepler and Newton and Bruno. The whole old world we once inhabited is like a dream, a dream we forgot on waking, even though, as dreams do, it lingered on into all-awake thinking; and even now it lingers on, all around our world, in our thought, so that every day in little ways, little odd ways, we think like prescientific men, magicians, Pythagoreans, Rosicrucians, without knowing we do so.”

And the emergence of those new sciences changed not only the future — the future they would bequeath to us — but what preceded them. The world of the magicians and astrologers and mystics, of John Dee and Giordano Bruno and Paracelsus and all the disciples of thrice-great Hermes and the wisdom he learned from ancient Aegypt, not only disappeared but became retroactively false. Until the Cartesian moment magic worked and always had; after the Cartesian moment magic didn’t work and never had. The paths fork both ahead and behind.

But if the world can be changed in such a way that the validity of magic is erased from its past as well as its future, might it not possibly change again? Might not the 1960s and 1970s have been another decisive moment, another fork in the path, the initiation of a true novus ordo seclorum?

Ultima Cumaei venit iam carminis aetas;
magnus ab integro saeclorum nascitur ordo:
iam redit et Virgo, redeunt Saturnia regna;
iam nova progenies caelo demittitur alto.

These are the possibilities of that moment as discerned by Crowley and Pynchon, and because they discerned those possibilities then they have remained ever since deeply attracted to the ideals of that era. Had the hippies won, our histories of thought would feature John Dee where they now feature Francis Bacon, and Giordano Bruno where they now feature Descartes; and who knows what the shape of our social order might be?

But the hippies didn’t win. What won instead is the Californian ideology. And if you want to picture the moment when the victory of something genuinely “spiritual” and non-commodified and non-panoptic became impossible, when the fusion of fake spirituality with commerce and governmental control ascended its throne, here you go:

Anglicans and the echo of history 

There are impulses at work in the Church, on both the right and the left, a desire to sweep away the tired old past and to start over again. This desire is founded on an illusory hope. The demands of “justice,” “love,” or “truth” will not sustain the weight pressed upon them as the single interpretive tool to order the Church’s life. These demands cannot trump orthodoxy, or the rich experience of the Church in the past, which is the context of orthodoxy. History is where God works and reveals his will. Those who want to sweep away the mistakes of the past by escape from it are more likely to perpetuate those same mistakes, in the very process of wielding their own theological and pastoral “broom.” The history of the Church is littered with examples; of course, you have to have a commitment to history to notice.

Bishop John Bauerschmidt

Pelphase, Interphase, Gusphase

“Now,” he said, swiveling his head to look at his pupils, “here is how the cycle works.” He marked off three arcs. “We have a Pelagian phase. Then we have an intermediate phase.” His chart thickened one arc, then another. “This leads into an Augustinian phase.” More thickening, and the chalk was back where it had started. “Pelphase, Interphase, Gusphase, Pelphase, Interphase, Gusphase, and so on, forever and ever. A sort of perpetual waltz. We must now consider what motive power makes the wheel turn.… In the first place, let us remind ourselves what Pelagianism stands for. A government functioning in its Pelagian phase commits itself to the belief that man is perfectible, that perfection can be achieved by his own efforts, and that the journey towards perfection is a long straight road. Man wants to be perfect. He wants to be good. The citizens of a community want to co-operate with their rulers, and so there is no real need to have devices of coercion, sanctions, which will force them to co-operate. Laws are necessary, of course, for no single individual, however good and co-operative, can have precise knowledge of the total needs of the community. Laws point the way to an emergent pattern of social perfection – they are guides. But, because of the fundamental thesis that the citizen’s desire is to behave like a good social animal, not like a selfish beast of the waste wood, it is assumed that the laws will be obeyed. Thus, the Pelagian state does not think it necessary to erect an elaborate punitive apparatus. Disobey the law and you will be told not to do it again or fined a couple of crowns. Your failure to obey does not spring from Original Sin, it’s not an essential part of the human fabric. It’s a mere flaw, something that will be shed somewhere along the road to final human perfection.… Well, then, in the Pelagian phase or Pelphase, the great liberal dream seems capable of fulfillment. 
The sinful acquisitive urge is lacking, brute desires are kept under rational control…. No happier form of existence can be envisaged. Remember, however,” said Tristram, in a thrilling near-whisper, “Remember that the aspiration is always some way ahead of the reality. What destroys the dream? What destroys it, eh?” He suddenly big-drummed the desk, shouting in crescendo, “Disappointment. Disappointment. DISAPPOINTMENT.” He beamed. “The governors,” he said, in a reasonable tone, “become disappointed when they find that men are not as good as they thought they were. Lapped in their dream of perfection, they are horrified when the seal is broken and they see people as they really are. It becomes necessary to try and force the citizens into goodness. The laws are reasserted, a system of enforcement of those laws is crudely and hastily knocked together. Disappointment opens up a vista of chaos. There is irrationality, there is panic. When the reason goes, the brute steps in. Brutality!” cried Tristram. The class was at last interested. “Beatings-up, secret police. Torture in brightly lighted cellars. Condemnation without trial. Finger-nails pulled out with pincers. The rack. The cold water treatment. The gouging-out of eyes. The firing squad in the cold dawn. And all this because of disappointment. The Interphase.” He smiled very kindly at his class. This class was agog for more mention of brutality. Their eyes glinted, they goggled with open mouths.

“What, sir,” asked Bellingham, “is the cold-water treatment?”

•••

“But,” went on Tristram, “the Interphase cannot, of course, last for ever.” He contorted his face to a mask of shock. “Shock,” he said. “The governors become shocked at their own excesses. They find that they have been thinking in heretical terms — the sinfulness of man rather than his inherent goodness. They relax their sanctions and the result is complete chaos. But, by this time, disappointment cannot sink any deeper. Disappointment can no longer shock the state into repressive action, and a kind of philosophical pessimism supervenes. In other words, we drift into the Augustinian phase, the Gusphase. The orthodox view presents man as a sinful creature from whom no good at all may be expected. A different dream, gentlemen, a dream which, again, outstrips the reality. It eventually appears that human social behaviour is rather better than any Augustinian pessimist has a right to expect, and so a sort of optimism begins to emerge. And so Pelagianism is reinstated. We are back in the Pelphase again. The wheel has come full circle. Any questions?”

“What do they gouge eyes out with, sir?” asked Billy Chan.


Anthony Burgess, The Wanting Seed

the wrong side

So today I want to speak about why we chose to remove these four monuments to the Lost Cause of the Confederacy, but also how and why this process can move us towards healing and understanding of each other.

So, let’s start with the facts. The historic record is clear. The Robert E. Lee, Jefferson Davis, and P.G.T. Beauregard statues were not erected just to honor these men, but as part of the movement which became known as The Cult of the Lost Cause. This ‘cult’ had one goal—through monuments and through other means—to rewrite history to hide the truth, which is that the Confederacy was on the wrong side of humanity.

Mayor Mitch Landrieu of New Orleans. Not “the wrong side of history,” that fatuous and overused phrase: the wrong side of humanity.

comprehension

The conviction that everything that happens on earth must be comprehensible to man can lead to interpreting history by commonplaces. Comprehension does not mean denying the outrageous, deducing the unprecedented from precedents, or explaining phenomena by such analogies and generalities that the impact of reality and the shock of experience are no longer felt. It means, rather, examining and bearing consciously the burden which our century has placed on us — neither denying its existence nor submitting meekly to its weight. Comprehension, in short, means the unpremeditated, attentive facing up to, and resisting of, reality — whatever it may be.

— Hannah Arendt, The Origins of Totalitarianism


The city of Liège, Belgium in 1914, under siege, and in 2009. When, in August 1914, the German armies decided to cut through Belgium on their way to France, thereby mocking the treaty Germany had signed to protect the integrity of Belgium, they expected no resistance. They got resistance.

Their immediate response was to burn a nearby village to the ground, shoot a good many random civilians, and round up Catholic priests and execute them on the totally spurious, invented-on-the-spot grounds that they were spies working against Germany. All that happened on the first day of the German invasion. When those actions did not lead to a Belgian surrender, the Germans sent zeppelins over the city to terror-bomb it into submission.

The notion that the Great War was fought in a relatively civilized way, and that it was only World War II that introduced terror against civilians as an essential element of military strategy, is belied by these first events of the earlier conflict. The German army didn’t continue in this vein simply because it couldn’t: it had limited access to major population centers for the rest of the war. But from the moment the war began the Germans were prepared to destroy anyone who stood in their way, without regard to any other consideration, no matter how time-honored or essential to humanity.

Colossal head of Serapis

britishmuseum:

This head depicting the Greco-Egyptian god Serapis is from a colossal statue that stood over 4 metres tall. The statue is thought to be from the Temple of Serapis, a huge sanctuary measuring 101 metres by 78 metres, which once stood in the ancient Egyptian city of Canopus. The impressive remains of this sanctuary were recently discovered by underwater archaeologists led by Franck Goddio.

In this statue, Serapis wears his characteristic headdress, a corn measure known as a kalathos, symbolising abundance and fertility. Alongside his funerary and royal roles, Serapis was worshipped for his healing powers, which according to ancient historians were particularly potent in Canopus. People came from afar to sleep within the temple complex in order to be healed by ‘incubation’, when miraculous cures were delivered in a dream.

The god Serapis is said to have been introduced to Egypt by the ancient Greek ruler Ptolemy I. Serapis was aimed at Greeks living in Egypt and his worship developed where Greek presence was prevailing, notably in Alexandria and Canopus. The popularity of this universal god also flourished outside Egypt in the Greek Mediterranean world, then later in the Roman Empire.

Colossal head of Serapis. Canopus, c. 200 BC. On loan from Maritime Museum, Alexandria. Photo: Christoph Gerigk.
© Franck Goddio/Hilti Foundation.

On August 2, 1100, the English king William Rufus was killed by an arrow while hunting in the New Forest, probably by assassination, possibly by genuine accident. We really don’t know. In modern times, though, that story developed an unexpected afterlife through the work of a bizarre scholar called Margaret A. Murray (1863-1963). Murray was a distinguished Egyptologist, who developed a grand unified theory of European witchcraft. She argued that the records of witch-trials were not simply fictitious, but actually contained accounts of genuine underground pagan cults that flourished within a notionally Christian Europe.

This theory was not wholly original to her, and she had plenty of predecessors over the previous decades, including feminists like Matilda Joslyn Gage. However, Murray brought the idea to a mass audience. That theory was expressed in Murray’s enormously influential 1921 book The Witch-Cult in Western Europe, and in The God of the Witches (1931). These books inspired countless horror novels, and Murray’s writings are even cited in H. P. Lovecraft’s The Call of Cthulhu. They also largely inspired the actual creation of the Wicca movement, the supposedly revived witchcraft created by Gerald Gardner in the 1950s. Notionally, that too was a revival of an ancient pagan cult.

Dark Majesty and Folk Horror. A fantastic post by my colleague Philip Jenkins, with a follow-up here. Philip should write a book about these matters, the bogus academic roots of neopaganism.

The Gospel of Life

Today is the Feast of Saints Basil & Gregory Nazianzen, which reminds me that some years ago I wanted to write a book for the church that took the Cappadocians as our models. I never got beyond the proposal, in part because the editors I talked to weren’t enthusiastic; my idea seemed neither fish nor fowl, too churchy for an intellectual audience and too intellectual for a churchy audience. So I set the idea aside and am not likely, now, to return to it. In lieu of that I’m posting the proposal here.


Heroes. One of my heroes is Paul Farmer. Farmer is a physician who teaches at Harvard, co-edits a journal called Health and Human Rights, and leads an organization he founded called Partners in Health. That last role leads him to spend much of his time in Haiti, Rwanda, and other parts of the world where health care for the poor has traditionally been poor or nonexistent. For a couple of decades now Paul Farmer has done as much as anyone in the world to save the lives of people whom the world in general thinks not worth saving. And key to his devotion is his lifelong Catholic Christian faith.

I have other heroes. Some of them work for Care Net, an organization founded in 1975 by evangelical Christians to provide pregnant women with alternatives to abortion — and to provide counseling and compassionate attention to women who have had abortions. The people of Care Net also share the good news of the Christian gospel with the women whom they serve.

I have at times been in groups of people who know and respect the work of Care Net, but if in those contexts I mention my admiration for the work of Paul Farmer and Partners in Health, I am liable to get some suspicious looks. Paul Farmer’s theology is on the liberal end of the spectrum, as are his politics: for instance, he is a vocal admirer of the Cuban government’s health care system. He has had a book written about him by Tracy Kidder, also a political liberal and a thoroughly secular writer; even 60 Minutes has done an admiring segment on him. He’s the liberal elite’s ideal do-gooder. What does that have to do with Christianity?

In other groups, people join enthusiastically in my praise for Paul Farmer — but become nervous when I mention my admiration for Care Net. They hear of an organization trying to provide women with alternatives to abortion and they think of large photographs of bloody fetuses held aloft by abortion-clinic protestors; they think of reactionaries who want to control women’s bodies and keep them barefoot and pregnant; they think of a conservative evangelical church in thrall to the Republican Party. If they saw the institutional history Care Net provides on its website they would be even more worried: “Care Net was influenced by the evangelical leadership of former U.S. Surgeon General Dr. C. Everett Koop and Christian apologist, Dr. Francis Schaeffer.”

And yet both Partners in Health and Care Net are pursuing the Biblical mandate to care for the weakest and most helpless among us. In so many ways they are doing the same work, and even are dedicated to the same goal — the preservation and healing of the lives of people made in the image of God. Why must we see them in opposition to each other? (And for all I know they may even see themselves in opposition to each other.) Such an attitude is simply tragic.

And the cause of the tragedy is this: that the categories of American politics determine the way that many American Christians think about ministry, mission, and service. The talking points and platform statements of the two major political parties provide the guidelines that many Christians use to judge things of the Gospel. Simply put, many American Christians have been intellectually formed by our political debates — especially as they are digested and interpreted on television news programs — far more than by immersion in Scripture or the great movements and figures of Christian tradition.

There have of course been attempts to bridge this political gap, primarily through Catholic social teaching. The late Joseph Cardinal Bernardin of Chicago was particularly associated with the notion of the “seamless garment” of life, the necessarily interwoven character of all attempts to promote human survival and human flourishing. Others have adopted the phrase “consistent life ethic” to indicate much the same emphasis. But often these voices have been ignored because they have been perceived as coming from a particular “side” in the already-existing political debate. (Similar criticisms have been directed against Jim Wallis’s God’s Politics: many a reviewer has commented that God’s politics seems to be indistinguishable from that of recent Democratic Party platforms.) Moreover, these movements tend to be grounded very specifically in Catholic theology, in ways that can be daunting or confusing for people who do not know that tradition well.

What is needed at this moment is a way of approaching these immensely complex yet utterly essential issues that evades our usual and comfortable political categories. We need to be taken out of our own time and place, so that we might see what a real Gospel of Life looks like to Christians who didn’t look much like us, think much like us, or live much like us. We need to travel to Cappadocia during the time of the Church Fathers. There we will find one family who, among themselves, practiced the Gospel of Life in all its fullness.

The Cappadocians. Cappadocia is a region in what is now central Turkey. In the time with which we are concerned, it was an economically and culturally vibrant place. To the north is the district of Pontus and the Black Sea. To the west are the shores of Asia Minor, then dotted with great cities — Ephesus particularly well known to us because of Paul’s visit and letter to the churches there. Due south one would have found Tarsus, Paul’s home town, whose great Mediterranean harbor was even then silting up and landlocking it. To the southeast is Palestine. In the late Roman world, Cappadocia was a great crossroads.

Sometime around the year 270, probably in the Pontus region, a woman was born named Macrina. Later she married — we don’t know her husband’s name — and had children. She was a faithful Christian, which it was not good to be in that time and place, for in the year 303, when Macrina’s son Basil would have been around eight years old, the Roman Emperor Diocletian and his colleagues Maximian, Galerius, and Constantius began issuing a series of repressive edicts aimed directly at the Christians of the empire. Eusebius, the great historian of the early church, who was around forty when the persecutions began, described the situation in this way:

This was the nineteenth year of the reign of Diocletian, in March, when the feast of the Saviour’s passion was near at hand, and royal edicts were published everywhere commanding that the churches should be razed to the ground, the Scriptures destroyed by fire, those who held positions of honor degraded, and the household servants, if they persisted in the Christian profession, be deprived of their liberty.

And such was the first decree against us. But issuing other decrees not long after, the Emperor commanded that all the rulers of the churches in every place should be first put in prison and afterwards compelled by every device to offer sacrifice [to the old Roman gods].

Elsewhere in the Empire, particularly in the far west and north, these edicts were enforced half-heartedly or not at all; but in Cappadocia and Pontus the officials pursued Christians with real, and frightening, enthusiasm. Macrina and her family fled into the countryside, probably to the shores of the Black Sea, and waited out the persecutions.

The attempts to suppress Christianity waned over the next few years and were officially ended in 313, when Constantine the Great and his co-emperor Licinius issued the Edict of Milan, which guaranteed religious freedom for Christians. Pontus and Cappadocia were safe again, and Macrina raised her children in the Pontine city of Neocaesarea. Later her son Basil would marry a devout Christian woman named Emmelia — her father had been martyred in the persecutions — and move to Caesarea, which would become the center of this family’s life. Basil established himself as a teacher of rhetoric and, more important, a lay teacher of the Christian faith, celebrated throughout the city for his wisdom and piety.

Basil the Elder, as he came to be known, and Emmelia went on to have ten children. We know the names of five of them: Macrina, Basil, Gregory, Naucratius, and Peter. We know those five names because all of them became recognized as saints of the church — as did Basil the Elder, Emmelia, and the matriarch Macrina the Elder.

Let us pause to contemplate this for a moment: one family; three generations; eight saints.

What did these people do to earn such lasting fame and praise? Here I will offer just a brief account:

  • Macrina the Elder was canonized for her faithfulness in time of persecution;
  • Basil the Elder for his powerful and influential teaching;
  • Emmelia largely for the extraordinary family she raised, though she may also have had an active role in the ministries of some of her children;
  • Macrina the Younger for her teaching of her younger siblings, and as the founder of a community for widows, abandoned women, and orphans;
  • Basil, later Basil the Great, for his work as priest, bishop, scholar, defender of Trinitarian orthodoxy, champion of the poor and hungry, and founder of the Basileion, the first Christian hospital;
  • Gregory, later Gregory of Nyssa, for much the same work as his older brother Basil did, though on a slightly less titanic scale;
  • Naucratius, for his holy life as a hermit and as a servant of, especially, the elderly poor;
  • Peter, later Peter of Sebaste, for his work as priest, monk, bishop, defender of Trinitarian orthodoxy, and as partner with his sister Macrina on behalf of the poor.

It’s all really quite overwhelming, and, when related in detail (as this book will do), deeply moving and encouraging. What’s especially noteworthy for anyone interested in the Gospel of Life is how seamless the family’s garment of service is. This can be seen with particular clarity in the career of Basil, who moved so easily from advocating for the poor in the pulpit to building the first hospital to combatting Arianism and other heresies of the time. It was one Gospel to him — and, I think, to the rest of his family — one unified model of human flourishing in Christ.

In order to understand why the way of life offered by this single Cappadocian family is so important an example for us, we might turn to the thirty-fourth chapter of the book of the prophet Ezekiel. Here Ezekiel is lamenting the circumstances that led to the collapse of Israel and the carrying off of its people, including Ezekiel himself, into captivity in Babylon. The fault, he says — or the Lord says through him — lies with the “shepherds,” the elders of Israel. The Lord denounces the shepherds thus:

The weak you have not strengthened, the sick you have not healed, the injured you have not bound up, the strayed you have not brought back, the lost you have not sought, and with force and harshness you have ruled them. So they were scattered, because there was no shepherd, and they became food for all the wild beasts. . . . My sheep were scattered over all the face of the earth, with none to search or seek for them.

Several things are noteworthy about this passage: the powerful condemnation of sins of omission, for which most of us tend to excuse ourselves; the opposition of ruling “in force and harshness” — domination — to compassionate care; and especially the bringing together of very different forms of suffering.

There are five categories of sufferers in this passage: the weak, the sick, the injured, the strayed, and the lost. It’s pretty clear that the first three apply in equally literal ways to sheep and to people: the Lord is accusing the elders of Israel of neglecting the physical sufferings of the people. But the remaining two categories have an equally clear metaphorical meaning when applied to people, as they do throughout the prophets (“All we like sheep have gone astray”). The Lord clearly says that the elders of the nation have neglected the spiritual as well as the physical needs of the people — they have been shamefully negligent across the board, obsessed instead with maintaining their own power to rule.

The behavior of the shepherds of Israel is a lamentable inversion, then, of the gospel that our Cappadocian family practiced. What the shepherds “did not” the Cappadocians did, like the two groups Jesus contrasts to each other (“those on his right” and “those on his left”) in Matthew 25, a passage that strongly echoes Ezekiel 34. It is by striving to see ourselves in these two distant mirrors — and not by situating ourselves or others on today’s American political map — that we 21st-century Christians can find our proper roles in Christ’s Kingdom, can find the genuine Gospel of Life.

I plan to narrate the story of this great Cappadocian family immediately after the book’s introduction (which will say in more detail many of the things I say in the opening paragraphs of this proposal). The family’s story is a powerful one, and will provide a host of examples that I can draw on throughout the book.

The Image of God. The next stage will start from a question: What was the biblical-theological model that our Cappadocian family was living out? What were their key principles, their core convictions? (Note that for us, as for Basil and his fellow Catholic bishops, sound orthodox theology and reverence for Scripture are indispensable to our lives in the Kingdom, and properly inseparable from the kind of work usually called “social ministry” or “compassionate ministry” — as though it is neither social nor compassionate to preach to all the story of Christ crucified.) My answer will follow a sequence of propositions that follow from the belief that all of us are made in the image of God. As Gregory of Nazianzus said in his great sermon “On the Love of the Poor,” “Do not despise those who are stretched out on the ground as if they merit no respect…. They bear the countenance of our Savior.”

  • To be made in the image of God is to have dominion over creation.
  • To be made in the image of God is to be in communion with God and with one another.
  • Because we are fallen beings, the image of God is defaced in us, but it is not erased. We still show it imperfectly.
  • We sin against the image of God in our neighbors when we fail to see that “they bear the countenance of our Savior,” and try to exert dominion over them instead of embracing communion.
  • To follow Christ, to live in him and be conformed to his image, is to have the image of God in us restored.
  • We are made in the image of God and yet — sometimes because of sin and sometimes because of the variety of created beings — we are all in some sense weak.
  • The Biblical picture of God centers on his compassion for all forms of human weakness.
  • Insofar as we are conformed to the image of Christ, we too will exhibit that compassion for weakness.
  • And we will do so all the more because we too experience weakness and hope that people who are stronger will have compassion on us.
  • To be in communion with one another is to give and receive help appropriate to our weakness — to be blessed through giving and receiving compassionate love.
  • To be in communion with one another is to acknowledge that we are many members (organs) of the body of Christ and will therefore pursue different aspects of the one Gospel of Life.
  • We are not the primary agents in the economy of love: as the Lord says later in Ezekiel 34, “I myself will be the shepherd of my sheep, and I myself will make them lie down, declares the Lord God. I will seek the lost, and I will bring back the strayed, and I will bind up the injured, and I will strengthen the weak, and the fat and the strong I will destroy. I will feed them in justice.”

The unpacking of those propositions, and their relation to one another, will occupy more than half of this book.

What to do? The last major stage of the argument will simply ask “What do we do now?” The answer will involve five categories of action:

  • Discern the weak among us;
  • Identify the different forms of weakness;
  • Know our own weaknesses;
  • Pray to see the image of God in all other human beings;
  • Trust in the strength of the “great shepherd of the sheep” (Hebrews 13).

With our hearts encouraged by the example of one great ancient family, and our minds fortified by the theological principles they embraced, and our wills committed to the necessary forms of action, we twenty-first-century Christians will be prepared to live out the real, the full, Gospel of Life.

britishmuseum:

The ancient Greeks saw the Celts as warlike peoples whose strange customs set them apart from the civilised Mediterranean world. Writing around 60–30 BC, Greek historian Diodorus Siculus described Celtic peoples wearing horned helmets into battle.

This helmet was cast into the River Thames over 2,000 years ago, perhaps as an offering to the gods. It was dredged from the River Thames at Waterloo Bridge in the early 1860s. It is the only Iron Age helmet to have ever been found in southern England, and it is the only Iron Age helmet with horns ever to have been found anywhere in Europe. Horns were often a symbol of the gods in different parts of the ancient world. This might suggest the person who wore this was a special person, or that the helmet was made for a god to wear. The helmet is made from sheet bronze pieces held together with many carefully placed bronze rivets. Its swirling decoration may have carried hidden meanings.

Ancient Greek warriors wore less elaborate headgear, like this helmet. Greek writing can still be understood, unlike the enigmatic Celtic designs on the horned helmet.

Horned helmet. River Thames near Waterloo, London, England, 200–100 BC.

Greek helmet. Olympia, south-western Greece, around 460 BC.

See these amazing objects in our exhibition Celts: art and identity, until 31 January 2016.

Scripture and slavery

The inability of evangelicals to agree on how slavery should be construed according to Scripture, which all treated as their ultimate religious norm, was in fact connected to the economic individualism of American society. The recourse to arms for civil war did reflect, at the very least, a glaring weakness in republican and democratic polity. From the outside [i.e. in Europe] it was clear that American material interests exerted a strong influence on American theological conclusions. …

Foreign commentary makes clear how tightly American religious convictions were bound to general patterns of American life. Only because religious belief and practice had grown so strong before the [Civil War and Slavery] conflict, only because they had done so much to create the nation that went to war, did that conflict result in such a great challenge to religious belief and practice after the war. The theological crisis of the Civil War was that while voluntary reliance on the Bible had contributed greatly to the creation of American national culture, that same voluntary reliance on Scripture led only to deadlock over what should be done about slavery. …

The issue for American history was that only two courses of action seemed open when confronting such a deadlock. The first was the course taken in the Civil War, which effectively handed the business of the theologians over to the generals to decide by ordeal what the Bible meant. … The second [course of action], though never self-consciously adopted by all Americans in all circumstances, has been followed since the Civil War. That course is an implicit national agreement not to base public policy of any consequence on interpretations of scripture. The result of following that second course since the Civil War has been ambiguous. In helping to provoke the war and greatly increase its intensity, the serious commitment to Scripture rendered itself ineffective for shaping broad policy in the public arena. In other words, even before there existed a secularization in the United States brought on by new immigrants, scientific acceptance of evolution, the higher criticism of scripture, and urban industrialization, Protestants during the Civil War had marginalized themselves as bearers of religious perspective in the body politic.

— Mark Noll, The Civil War as a Theological Crisis. An incredibly important passage for the current moment, as pointed out at agreatercourage — see Derrick’s further reflections there.

To all of us, I believe, in the middle of the twentieth century, the Roman Empire is like a mirror in which we see reflected the brutal, vulgar, powerful yet despairing image of our technological civilization, an imperium which now covers the entire globe, for all nations, capitalist, socialist, and communist, are united in their worship of mass, technique and temporal power. What fascinates and terrifies us about the Roman Empire is not that it finally went smash but that … it managed to last for four centuries without creativity, warmth or hope.

— W. H. Auden, 1952

Erasmus and Machiavelli

(What follows is a post I wrote this morning for one of my class blogs, which are private.)

Erasmus is sometimes referred to as Erasmus of Rotterdam, but that’s primarily to distinguish him from his namesake, St. Erasmus of Formia, also known as St. Elmo, patron saint of sailors. (That Wikipedia page also says he’s the patron saint of abdominal pain, whatever that means.) But our Erasmus really wasn’t of Rotterdam at all, even if he was born there. He considered himself to belong to that international group of scholars and writers that would later come to be called the Republic of Letters. His “countrymen” were Thomas More, who happened to live in England, and Aldus Manutius, who happened to live in Venice, and Johannes Frobenius, who happened to live in Switzerland.

Centuries later, as World War II was breaking out, the poet W. H. Auden would write of that same sense of connection with people separated by citizenship in the modern nation-states:

Defenceless under the night
Our world in stupor lies;
Yet, dotted everywhere,
Ironic points of light
Flash out wherever the Just
Exchange their messages:
May I, composed like them
Of Eros and of dust,
Beleaguered by the same
Negation and despair,
Show an affirming flame.

In contrast to this highly cosmopolitan vision, Machiavelli is much more aware of himself as a citizen of one particular city, Florence, whose greatness he wishes to restore. He is as local and particular as Erasmus is universal and general.

And yet, when Machiavelli was in exile from Florence he wrote of how central to his life was his reading of, his conversation with, the great writers of the past. Here he sounds like the truest possible humanist:

When evening has come, I return to my house and go into my study. At the door I take off my clothes of the day, covered with mud and mire, and I put on my regal and courtly garments; and decently reclothed, I enter the ancient courts of ancient men, where, received by them lovingly, I feed on the food that alone is mine and that I was born for. There I am not ashamed to speak with them and to ask them the reason for their actions; and they in their humanity reply to me. And for the space of four hours I feel no boredom, I forget every pain, I do not fear poverty, death does not frighten me.

Machiavelli was a complicated guy.
