Stagger onward rejoicing

patriotic effusion for Independence Day

I have always, I feel, been somewhat deficient in patriotism — I just don’t have the instinct for it, somehow — but listening to the recent Rest Is History series on the American Revolution got my red-white-and-blue blood up. (I say this, by the way, as a fully paid-up Wang — i.e., member of The Rest Is History Club.) George Washington had wooden teeth (ha ha ha) — Benjamin Franklin went around London “dispensing his wisdom” (ha ha ha) — Aren’t the colonists’ complaints obviously bogus? (ha ha ha) — Why do Americans have a holiday celebrating a press release? (ha ha ha) — Thomas Jefferson is a “phrasemaker” and the American Robespierre … wait, what? I don’t seem to recall Jefferson’s presiding over a Reign of Terror.

In general, I think Brits are the least trustworthy commenters on the United States — in any venue, from the left or the right, I avoid such commentary, because I know it will be filled with overconfident generalizations and smug condescension. Both of those faults arise from the writers’ belief that they have as it were a Special Relationship with the U.S.A. and can therefore interpret it authoritatively. (People from other nations might be even more critical but they are less likely to write or speak from that particular variety of smugness.) I think it telling that when Tom and Dominic did a series on the Irish quest for Home Rule they got an Irishman to guide them — Paul Rouse, who was great — and were highly deferential to his judgments, but when it came time to cover the American Revolution they saw no need for an American perspective.

At the very end there was a brief acknowledgment that George Washington did not become a tyrant when he had the opportunity to do so, and that that’s admirable, but then they immediately went on to talk about his owning of slaves. The overall tone of the episodes converged on a kind of ironic mockery, which I found disappointing not just because I’m an American (I think) but mainly because of its inaccuracy. The leaders of that Revolutionary generation were extraordinary men — it’s almost unimaginable that one moment in history, in so small a nation, would produce figures as prodigiously and variously gifted as Franklin, Washington, Adams, Jefferson, and the ever-underrated Madison. (Though Madison was a little too young to have much of a role in the Revolution proper, he was essential thereafter.)

For a better, clearer, juster view, I would recommend Joseph J. Ellis’s excellent Founding Brothers, but even more than that an extraordinary book that is unaccountably out of print: Garry Wills’s Cincinnatus: George Washington and the Enlightenment. No other work that I know of so fascinatingly illuminates the character of George Washington – and the way he was understood by his most thoughtful contemporaries. Thus this statue on the Lawn of the University of Virginia:

Note the fasces, which Washington is about to set aside — you can’t quite see it, but behind him is a plow. If, looking at that statue, you were to turn 180°, you’d see across the Lawn a companion statue, this of a seated Jefferson, gazing contemplatively at Washington — perhaps to admire, but perhaps to make sure Washington does indeed follow the example of Cincinnatus. For it is a model easier to invoke than to imitate.

In this spirit, we might also read an essay from 2021 by my friend Rick Gibson on Washington’s Farewell Address.

Let’s conclude by delivering to these unrepentant monarchists some home truths, as articulated by Jefferson in a letter to John Langdon (1810):

When I observed however that the king of England was a cypher, I did not mean to confine the observation to the mere individual now on that throne. The practice of kings marrying only into the families of kings, has been that of Europe for some centuries. Now, take any race of animals, confine them in idleness & inaction whether in a stye, a stable, or a stateroom, pamper them with high diet, gratify all their sexual appetites, immerse them in sensualities, nourish their passions, let every thing bend before them, & banish whatever might lead them to think, & in a few generations they become all body & no mind: & this too by a law of nature, by that very law by which we are in the constant practice of changing the characters & propensities of the animals we raise for our own purposes. Such is the regimen in raising kings, & in this way they had gone on for centuries. While in Europe, I often amused myself with contemplating the characters of the then reigning sovereigns of Europe. Louis the XVIth was a fool, of my own knowledge, & in despite of the answers made for him at his trial. The king of Spain was a fool, & of Naples the same. They passed their lives in hunting, & dispatched two couriers a week, 1000 miles, to let each other know what game they had killed the preceding days. The king of Sardinia was a fool. All these were Bourbons. The Queen of Portugal, a Braganza, was an idiot by nature & so was the king of Denmark. Their sons, as regents, exercised the powers of government. The king of Prussia, successor to the great Frederic, was a mere hog in body as well as in mind. Gustavus of Sweden, & Joseph of Austria were really crazy, & George of England you know was in a straight waistcoat.

And so on that note: All praise to the Founders, and confusion to degenerate monarchies!

P.S. This is unrelated to the diatribe above, but I can’t resist adding it. Adam Smith, the scholarly guest on the podcast, comments at one point that Benjamin Franklin’s only interest in George Whitefield was in determining the range at which he could project his voice. Franklin was certainly interested in that, but not in that only. Here’s my favorite passage in the whole of Franklin’s Autobiography:

Mr. Whitefield, in leaving us, went preaching all the way thro’ the colonies to Georgia. The settlement of that province had lately been begun, but, instead of being made with hardy, industrious husbandmen, accustomed to labour, the only people fit for such an enterprise, it was with families of broken shop-keepers and other insolvent debtors, many of indolent and idle habits, taken out of the jails, who, being set down in the woods, unqualified for clearing land, and unable to endure the hardships of a new settlement, perished in numbers, leaving many helpless children unprovided for. The sight of their miserable situation inspir’d the benevolent heart of Mr. Whitefield with the idea of building an Orphan House there, in which they might be supported and educated. Returning northward, he preach’d up this charity, and made large collections, for his eloquence had a wonderful power over the hearts and purses of his hearers, of which I myself was an instance.

I did not disapprove of the design, but, as Georgia was then destitute of materials and workmen, and it was proposed to send them from Philadelphia at a great expense, I thought it would have been better to have built the house here, and brought the children to it. This I advis’d; but he was resolute in his first project, rejected my counsel, and I therefore refus’d to contribute. I happened soon after to attend one of his sermons, in the course of which I perceived he intended to finish with a collection, and I silently resolved he should get nothing from me. I had in my pocket a handful of copper money, three or four silver dollars, and five pistoles in gold. As he proceeded I began to soften, and concluded to give the coppers. Another stroke of his oratory made me asham’d of that, and determin’d me to give the silver; and he finish’d so admirably, that I empty’d my pocket wholly into the collector’s dish, gold and all.

Cities 4: Secondary Epic

My previous post discussed the way Augustine sets up his City of God as antithetical to the Aeneid. Auden’s witty poem “Secondary Epic” may be seen as a kind of pendant to Augustine’s critique. It focuses not on the prophetic narration of Anchises in Book VI, but rather on a complementary moment, the description in Book VIII of the Shield of Aeneas. About this description Auden has some questions:    

How was your shield-making god to explain
Why his masterpiece, his grand panorama
Of scenes from the coming historical drama
Of an unborn nation, war after war,
All the birthdays needed to pre-ordain
The Octavius the world was waiting for,
Should so abruptly, mysteriously stop,
What cause could he show why he didn’t foresee
The future beyond 31 B.C.,
Why a curtain of darkness should finally drop
On Carians, Morini, Gelonians with quivers,
Converging Romeward in abject file,
Euphrates, Araxes and similar rivers
Learning to flow in a latinate style,
And Caesar be left where prophecy ends,
Inspecting troops and gifts for ever?
Wouldn’t Aeneas have asked: — ‘What next?
After this triumph, what portends?’ 

And then the poem concludes, returning to Anchises: 

No, Virgil, no:
Behind your verse so masterfully made
We hear the weeping of a Muse betrayed.
Your Anchises isn’t convincing at all:
It’s asking too much of us to be told
A shade so long-sighted, a father who knows
That Romulus will build a wall,
Augustus found an Age of Gold,
And is trying to teach a dutiful son
The love of what will be in the long run,
Would mention them both but not disclose
(Surely no prophet could afford to miss,
No man of destiny fail to enjoy
So clear a proof of Providence as this)
The names predestined for the Catholic boy
Whom Arian Odovacer will depose. 

The names of that “Catholic boy”? Romulus Augustulus. What poet could resist the irony?

Auden borrows the title of his poem from A Preface to Paradise Lost, in which C. S. Lewis distinguishes primary epic — poems like the Iliad and Beowulf that show no obvious awareness that what they’re doing is, you know, epic — from secondary epic, which is always aware of its tradition, its inheritance. Poems like the Aeneid and Paradise Lost are always gesturing towards their predecessors to make sure you know they are indeed epics. Secondary epics tend therefore to be at least somewhat polemical, in tension with their predecessors, because after all if those predecessors had said everything and said it perfectly there would be no need for later poems. Virgil has therefore set himself up to make an argument through his narrative, an argument about the destiny of Rome and the nature of heroism, and Auden joins Augustine in pointing out that the argument doesn’t work: No poet writing in the midst of history can plausibly convince us that a historical city is eternal and that heroic service to it can therefore have eternal consequences. The Pax Romana is not a telos, it’s merely an event among other events, subject to varying interpretations and to the power of change. “No, Virgil, no.” 

Cities 3: hypothesis

Here’s the hypothesis I’m working with now: The problem with every theology of culture is that “culture” isn’t a biblical concept — isn’t clearly rooted in salvation history. And that is why I’m turning to Augustine. The idea of the two cities is deeply rooted in the biblical story and may be generative of certain important ideas that we can’t get through the use of a term like “culture.”

I think this is especially true because, as David Knowles points out, Augustine really isn’t interested in political theology, or for that matter in ecclesiology. In Book XV he says, “I classify the human race into two branches [generis]: the one consists of those who live by human standards, the other of those who live according to God’s will. I also call these two classes the two cities, speaking allegorically [mystice]. By two cities I mean two societies of human beings [duas societates hominum].” Two societies — this is what we might call a sociological or an ethnographic inquiry, and that’s much of what we’re after, or anyway I’m after, in a theology of culture. But, as James Davison Hunter says, with an emphasis on the symbols by which a given society is constituted and sustained. This is also where — see my previous post — Augustine’s application of rhetorical strategies to salvation history is especially imaginative and potent. I find remarkable and stimulating the idea that God’s providential shaping of history is a rhetorical act. For one thing, it implies that cities are in a sense rhetorical acts, saturated with symbolic and even archetypal meaning. 

Also: it’s somehow typical of Augustine that when he’s trying to think sociologically he looks first at the city that Cain founded and then at the City of God in Revelation 21, and hangs his whole inquiry on a line suspended between the two. What a peculiar and fascinating mind, and that’s why, I suppose, we keep returning to him. 

P.S. I wrote a bit about why I’m pursuing this project over at my Buy Me a Coffee page.

Jenkins

Forthcoming from my friend and colleague Philip Jenkins. A kind of intro or overview here. I’m excited that this is coming. 

Dominic Sandbrook:

There’s no way our podcast, presented by two white Oxbridge-educated middle-aged men, would be commissioned by the BBC these days. Instead of just letting us talk, they would bring in some alleged comedian to make it ‘accessible’ to younger audiences.

And not a single week would pass without the appearance of some ultra-woke U.S. academic to lecture us about slavery or to flagellate us about the imagined sins of the British Empire.

The great irony is that while the average age of Radio 4 listeners is 56, more than half of our listeners are under 34. So if the BBC want to know what’s happened to its younger audience, the answer is that they have signed up to The Rest Is History.

That’s about the size of it.

ark head

Venkatesh Rao:

One mental model for this condition is what I call ark head, as in Noah’s Ark. We’ve given up on the prospect of actually solving or managing most of the snowballing global problems and crises we’re hurtling towards. Or even meaningfully comprehending the gestalt. We’ve accepted that some large fraction of those problems will go unsolved and unmanaged, and result in a drastic but unevenly distributed reduction in quality of life for most of humanity over the next few decades. We’ve concluded that the rational response is to restrict our concerns to a small subset of local reality — an ark — and compete for a shrinking set of resources with others doing the same. We’re content to find and inhabit just one zone of positivity, large enough for ourselves and some friends. We cross our fingers and hope our little ark is outside the fallout radius of the next unmanaged crisis, whether it is a nuclear attack, aliens landing, a big hurricane, or (here in California), a big wildfire or earthquake. […] 

… it’s gotten significantly harder to care about the state of the world at large. A decade of culture warring and developing a mild-to-medium hatred for at least 2/3 of humanity will do that to you. General misanthropy is not a state conducive to productive thinking about global problems. Why should you care about the state of the world beyond your ark? It’s mostly full of all those other assholes, who are the wrong kind of deranged and insane. At least you and I, in this ark, are the right kind of deranged and insane. It’s worth saving ourselves from the flood, but those other guys can look out for themselves.

I think this is largely true, but I think some other things as well — primarily that any such retreat-to-the-ark is an inevitable response to the inflexible limits of our Dunbar’s Number minds. 

“That’s perhaps the way out — keep trying to tell stories beyond ark-scale until one succeeds in expanding your horizons again.” Nope. We don’t need our horizons expanded, we need our attention narrowed and focused. 

Ken Burns’s ‘The U.S. and the Holocaust’ – Dara Horn:

Burns has a soft spot for Franklin and Eleanor, the subjects of one of his prior films, and here he treats them with kid gloves, blaming most of the missteps on State Department antagonists. The series makes a point of establishing the bigoted, racist atmosphere of the U.S. at the time, showing Nazi rallies in New York, clips of the popular anti-Semitic broadcaster Father Charles Coughlin, and colorized footage of a Nazi-themed summer camp in New Jersey. But the film goes out of its way to outline the pros and cons of Roosevelt’s decisions, leaving his reputation intact. To be clear, Roosevelt is an American icon and deserves to remain one. The problem with this approach is less about Roosevelt (there are plenty of convincing arguments in his favor, not least that he won the war) than about how it contradicts the rest of the film’s premise. The goal of the series is seemingly to reset America’s moral compass, using hindsight to expose the costs of being a bystander. But every bystander, including Roosevelt, can explain his choices. The film’s refusal to judge the commander in chief plays into a larger political pattern: offering generosity only toward those we admire.

Or whom we perceive to be on Our Team. The whole essay is excellent, but I especially appreciate the unpacking of this point: “Democracies, for all their strengths, are ill-equipped for identifying and responding to evil.” 

how history doesn’t work

This is great from Freddie:

The bigger thing for me, beyond the death of art and criticism I mean, is just how easy it is to inspire identitarians, just what they’re willing to consider a major political success. They are the cheapest dates imaginable.

Then he quotes the headline of an article: “Disney’s black Ariel isn’t just about diverse representation. It’s also about undoing past wrongs” — and asks: 

Is it? Is it really? The article is profoundly unconvincing on this score. Yes, Disney did some racist portrayals in the past. That’s bad. I don’t see how you’re evening up the score by putting more Black people in your films, really; history doesn’t work that way. 

This is the key: “history doesn’t work that way.” History doesn’t work that way. History doesn’t work that way! Can we just grasp this point? 

“One Manner of Law,” by Marilynne Robinson:

Hugh Peters, most disparaged of Puritans, wanted to exclude poor artists from taxation. He proposed that there be peacemakers appointed to settle disputes before anyone could be arrested or imprisoned. Writing as someone who was forced to flee England under the threat of persecution, and whose fellow dissenters had experienced prison and worse, he does not call for any equivalent punishment or any punishment at all for his (temporarily) defeated persecutors, but instead for an alleviation of the punitive bent in the assertion of public authority. 

A fascinating historical essay. 

The New York Times:

Born in Birmingham, Ala., Dr. [Freeman] Hrabowski came of age in the thick of the Jim Crow era. The notion that Black children didn’t deserve a quality education brought out the fighter in the self-described “fat, nerdy kid who could only attack a math problem” at a very young age.

He was 12 when he participated in the historic Children’s March inspired by the Rev. Dr. Martin Luther King. He was among the hundreds of boys and girls arrested while they marched for equal rights, and spent five days in jail.

Dr. Hrabowski has largely declined to discuss the details of what he saw and experienced in the Birmingham jail. Some of it will forever remain unspeakable, he said. But in an interview, he recalled a visit from Dr. King.

“What you do this day will have an impact on children not yet born,” Dr. Hrabowski remembered him telling the jailed children. 

I was five at the time, living about two miles from the site of the march. Children just a few years older than me were thrown in jail. God bless Dr. Hrabowski. He has fought the good fight for a long time. 

R.I.P. Bill Russell, one of the greatest Americans of our era — the best team athlete in American history, and an icon of Black Americans’ quest for full civil rights. One not-so-random fact worth remembering: Bill Russell’s father Charlie — raised in Louisiana, as his son would be until age eight — in his childhood knew people who had been enslaved. As I keep saying: The past is not dead etc. 

The Best and the Brightest

I’ve been reading David Halberstam’s The Best and the Brightest for the first time in 40 years or more, and returning to it after all this time my primary response is that it has been somewhat overrated.

To be fair, Halberstam has an incredibly difficult task, because in order to pursue his main goal, which is to explain how it is that “the best and the brightest” of American society, the intellectual elite of the nation – highly educated, exceptionally intelligent, shrewdly perceptive – nevertheless managed to immerse us in a quagmire in Vietnam, he has to give us huge chunks of the history of Southeast Asia in the 20th century. He chose to do this by introducing each chunk of history only when it appears to be necessary to the stage of the narrative that he is in. The relevant history gets doled out in bits and pieces – a little bit when we’re hearing about LBJ, a little bit when we’re told about the appointment of an ambassador, a little bit when we’re learning about the relevant figures in the State Department or the Department of Defense. We hear a lot about the French in Indochina and how that involvement shaped the later American involvement before we hear about the Communist takeover in China, which was the very event that made the U.S. so willing to intervene in Southeast Asia. This kind of historical mosaic can be an effective technique – it’s what Rebecca West does in what I have said many times is the best book of the twentieth century, Black Lamb and Grey Falcon – but it is exceptionally difficult to pull off, and I don’t think Halberstam does it well at all. As I read I struggled to assemble all his little chunks of historical narrative into a coherent arc or structure.

There are also a good many errors, mostly minor. For instance, Halberstam writes of “the great English novelist Joyce Carey,” but his name is spelled Cary, he was born and mostly raised in Ireland and is therefore better described as Anglo-Irish, and I doubt anyone ever called him great. Another problem is repetition: we are twice told, in detail, the story of how General Maxwell Taylor was thought by the Kennedys to have resigned from the Eisenhower administration when in fact he retired at the end of his term.

This is a book whose thesis is strong, original, and highly significant, and whose weaknesses in exposition and development have therefore been perhaps too readily excused. I don’t totally quarrel with that perspective. The great strength of the book is the same as that of Breaking Bad: it shows how you can get from one moral condition to another radically different moral condition without ever planning or even wanting to go there. Halberstam is especially good at character sketches: he shows how people of vastly different personality types – Dean Rusk, Robert McNamara, McGeorge Bundy, JFK himself – can nevertheless all be caught up, in their different ways, in a situation that seems to have its own momentum, a momentum that could only be arrested by people of exceptional self-awareness and even more exceptional courage and decisiveness.

That’s a lesson worth learning, though, of course, no one in politics ever learns it. 

McGeorge Bundy and John F. Kennedy, 1962

Friday 4 July 1662 (The Diary of Samuel Pepys):

Up by five o’clock, and after my journall put in order, to my office about my business, which I am resolved to follow, for every day I see what ground I get by it. By and by comes Mr. Cooper, mate of the Royall Charles, of whom I intend to learn mathematiques, and do begin with him to-day, he being a very able man, and no great matter, I suppose, will content him. After an hour’s being with him at arithmetique (my first attempt being to learn the multiplication-table); then we parted till tomorrow.

That’s right: Pepys — a graduate of St. Paul’s School in London and a Bachelor of Arts from Cambridge University (Magdalene College) — when he began his career as a civil servant, had to hire a tutor to teach him the times tables. 

a story

In my first years at Wheaton College I had a colleague named Julius Scott. (He retired in 2000 and died in 2020. R.I.P.) Julius was a New Testament scholar, but earlier in life, in the 1960s, had been a Presbyterian pastor in Mississippi. He was raised in rural Georgia and loved the South, but he knew a good deal about our native region’s habitual sins also, and as the Civil Rights movement grew stronger and stronger, he understood that he had a reckoning to make. So he did. 

After much prayer and study of Scripture, he decided that nothing could be more clear in Scripture, and nothing more foundational to Christian anthropology, than the belief that each and every human being is made in the image of God; that every human being is my neighbor; and that to “love your neighbor as yourself” is required of us all. Julius could not, therefore, avoid the conclusion that the Jim Crow laws common to the Southern states were incompatible with the Christian understanding of what human beings are and who our neighbors are; but even if those laws proved impossible to dislodge, and even if his pastoral colleagues thought them defensible, it was surely, certainly, indubitably necessary for all churches to welcome every one of God’s children who entered their doors, and to welcome them with open arms, making no distinction on the basis of race. When his presbytery — gathering of pastors in his region — next met, Julius felt that he had to speak up and say what he believed about these matters. 

He did; and thus he entered into a lengthy season of hellish misery. He was prepared for the condemnation and shunning he received from almost every other member of the presbytery; what he wasn’t prepared for was what happened when word of his speech got out to the general public, I believe through a newspaper article: an ongoing barrage of threats against his life and the lives of his wife and children. For years, he told me, he had to sleep — and sleep came hard — with a loaded gun under his bed; the fear for his family didn’t wholly abate until he left Mississippi. (“I was afraid for my babies,” Julius said, and with those words the tears filled his eyes.) Of course he remained a pariah to most of his colleagues — and even the ones who respected him told him so in private, expressing their agreement with his theological conclusions only on condition that Julius never share their views with anyone else. 

Think about that story for a while. Please understand that it’s not an uncommon one; and please understand, further, that Julius escaped with no worse than shunning and terror because he was white. (If you want to know more about Christians in Mississippi in that era, the persecuted and the persecutors alike, I recommend Charles Marsh’s book God’s Long Summer. And if you want to know what life in that era was like in Birmingham, Alabama, where I grew up, read Diane McWhorter’s Carry Me Home.) 

Now: the next time you’re tempted to say that American Christians today experience hostility unprecedented in our nation’s history, and can escape condemnation only if they bow their knee to the dominant cultural norms; that it didn’t used to be like that, that decades ago no American Christian had to be hesitant about affirming the most elementary truths of the Christian faith — the next time you’re tempted to say all that, please, before you speak, remember Julius Scott. 

John Gallagher:

The judges asked Thiess why he had become a werewolf – what benefits did it bring to a poor man like him? He explained that many decades ago he had accepted an enchanted drink offered to him in a tavern by a ‘scoundrel’ from Marienburg. The same man had taught him a blessing invoking the sun and the moon and how to cure sick people and animals. Thiess’s knowledge of herbs and charms had made him well known among his neighbours. He recalled that he took on his new identity calmly, but ‘hadn’t thought it would involve so much evil’.

I know, right? Same with me and being a professor.

from the TLS:

On my first visit to Moscow, I met one of Lenin’s embalmers. “When I began, the body was in a poor state”, said Styopa, whose expertise was the use of electricity. Skin grafts and a new partial-vacuum glass sarcophagus had helped to inhibit decay, but Styopa’s shock treatment had reversed it. “Once every two or three months, a high-voltage charge was applied to keep up the tone. But the first time we tried it I overestimated the power needed. Lenin suddenly sat up from the table, his arms shook, and his lips started to quiver. I thought he was going to speak. It was quite a shock. After that, we reduced the voltage.”

Sidney was right

Sir Philip Sidney:

I conclude, therefore, that [the poet] excels history, not only in furnishing the mind with knowledge, but in setting it forward to that which deserves to be called and accounted good; which setting forward, and moving to well-doing, indeed sets the laurel crown upon the poet as victorious, not only of the historian, but over the philosopher, howsoever in teaching it may be questionable. For suppose it be granted — that which I suppose with great reason may be denied — that the philosopher, in respect of his methodical proceeding, teach more perfectly than the poet, yet do I think that no man is so much Philophilosophos as to compare the philosopher in moving with the poet. And that moving is of a higher degree than teaching, it may by this appear, that it is well nigh both the cause and the effect of teaching; for who will be taught, if he be not moved with desire to be taught? And what so much good doth that teaching bring forth — I speak still of moral doctrine — as that it moves one to do that which it doth teach? For, as Aristotle says, it is not Gnosis but Praxis must be the fruit; and how Praxis cannot be, without being moved to practice, it is no hard matter to consider. The philosopher shows you the way, he informs you of the particularities, as well of the tediousness of the way, as of the pleasant lodging you shall have when your journey is ended, as of the many by-turnings that may divert you from your way; but this is to no man but to him that will read him, and read him with attentive, studious painfulness; which constant desire whosoever has in him, has already passed half the hardness of the way, and therefore is beholding to the philosopher but for the other half. 
Nay, truly, learned men have learnedly thought, that where once reason has so much overmastered passion as that the mind has a free desire to do well, the inward light each mind has in itself is as good as a philosopher’s book; since in nature we know it is well to do well, and what is well and what is evil, although not in the words of art which philosophers bestow upon us; for out of natural conceit the philosophers drew it. But to be moved to do that which we know, or to be moved with desire to know, hoc opus, hic labor est.

361 years ago today

Later they “affirmed to the last that if they had been deceived, the Lord himself was their deceiver.” Let the reader understand.

the mirror

The good folks at Plough have produced an e-book featuring two early Christian texts, and Rowan Williams has written an introduction to it that I believe essential reading for Christians in our moment. I love this kind of piece — a clear and patient exposition of ideas from the past that never once mentions current events but brilliantly illuminates the questions that face us. Please do read it all, but here are some choice nuggets: 

  • “If you look at the eyewitness accounts of martyrdom in these early centuries – ­documents like the wonderful record of the martyrs of Scilli in North Africa in AD 180 – you can see what the real issue was. These Christians, most of them probably domestic slaves, had to explain to the magistrate that they were quite happy to pray for the imperial state, and even to pay taxes, but that they could not grant the state their absolute allegiance…. What made their demand new and shocking was that it was not made on the basis of ethnic identity, but on the bare fact of conviction and conscience. For the first time in human history, individuals claimed the liberty to define the limits of their political loyalty, and to test that loyalty by spiritual and ethical standards.”  
  • “The early Christian movement … was not revolutionary in the sense that it was trying to change the government. Its challenge was more serious: it was the claim to hold any and every government to account, to test its integrity, and to give and withhold compliance accordingly. But it would be wrong to think of this, as we are tempted to do in our era, in terms of individual conscience. It was about the right of a community to set its own standards and to form its members in the light of what had been given to them by an authority higher than the empire. The early Christians believed that if Jesus of Nazareth was ‘Lord,’ no one else could be lord over him, and therefore no one could overrule his authority.” 
  • “The theology of the early centuries thus comes very directly out of this one great central conviction about political authority: if Jesus is Lord, no one else ultimately is, and so those who belong with Jesus, who share his life through the common life of the worshiping community, have a solidarity and a loyalty that goes beyond the chance identity of national or political life…. Humans love largely because of fellow-feeling, but God’s love is such that it never depends on having something in common. The creator has in one sense nothing in common with his creation – how could he? But he is completely free to exercise his essential being, which is love, wherever he wills. And this teaches us that we too must learn to love beyond the boundaries of common interest and natural sympathy and, like God, love those who don’t seem to have anything in common with us.” 
  • “One of the lasting legacies of the early church, then, is the recognition that doctrine, prayer, and ethics don’t exist in tidy separate compartments: each one shapes the others. And in the church in any age, we should not be surprised if we become hazy about our doctrine at a time when we are less clear about our priorities as a community, or if we become less passionate about service, forgiveness, and peace when we have stopped thinking clearly about the true and eternal character of God.” 

against the state

Justin E. H. Smith:

Who among these groups is “Indigenous”? We might in this case feel this is the wrong question to ask, but this feeling may in turn help to prime us for the further realization that the encounter zone of the Slavic, Turkic, Tungusic, and Paleo-Siberian peoples is in fact fairly representative of every corner of the inhabited globe, even those we take to be the most hermetic and (therefore?) the most pristinely representative of humanity in its original state. In their half-posthumous new book, the anthropologist David Graeber (1961-2020) and the archeologist David Wengrow (1972-) suggest that “even” the pre-contact Amazonian groups we generally take to conform most closely to the definition of “tribe” or “band” were likely aware of the Andean empires to their west, and may also have had, at an earlier time, relatively complex state structures that they consciously abandoned because they were lucid enough to come to see these as inimical to human thriving. The groups Europeans first encountered in the rainforest, in other words, may also have been splinters that broke away from tyrannies, just like the Sakha fleeing the Mongols, and to some extent also like the Mountain Time Zone libertarians grumbling about the tax agents from the mythical city of Washington.

It may be that more or less all societies that appear to us as “pre-state” would be more accurately described as “post-state” — even if the people who constitute them are not in fact fleeing from the center to the margins of a real tyranny, they are nonetheless living out their statelessness as a conscious implementation of an ideal of the human good. […] 

This is the sense of Pierre Clastres’s “society against the state”: societies that lack state structures are not in the “pre-” stage of anything, but are in fact actively working to keep such structures from rising up and taking permanent hold.

I’ll be starting Graeber and Wengrow’s book soon, but I suspect that — like much I have been reading in the last couple of years — it will further incline me to the suspicion that there is no remedy for technocracy that does not rely heavily and consistently on the best practices of the anarchist tradition. It’s time for a renewal of Christian anarchism — one that begins by accepting the clear fact that Anarchy and Christianity is Jacques Ellul’s worst book, by miles. 

bad dispensations

The idea that we must choose between two intolerant illiberalisms, one on the Right and one on the Left, is, it seems to me, increasingly common today. It was also quite common in the 1930s. For instance, in 1937 the British House of Commons was debating whether or how to intervene in the Spanish Civil War, and a number of M.P.s insisted that it was necessary to choose between the Fascists and the Communists. But one Member of the House replied,

I will not pretend that, if I had to choose between Communism and Nazi-ism, I would choose Communism. I hope not to be called upon to survive in the world under a Government of either of those dispensations…. It is not a question of opposing Nazi-ism or Communism; it is a question of opposing tyranny in whatever form it presents itself; and, having a strong feeling in regard to the preservation of individual rights as against Governments, and as I do not find in either of these two Spanish factions which are at war any satisfactory guarantee that the ideas which I personally care about, and to which I have been brought up in this House to attach some importance, would be preserved, I am not able to throw myself in this headlong fashion into the risk of having to fire cannon immediately on the one side or the other of this trouble…. I cannot feel any enthusiasm for these rival creeds. 

The Member who so refused to make that choice was Winston Churchill. When many thought that liberalism and democracy were unsustainable, were not long for this world, he stood up for liberalism and democracy anyway. That was the wise course then, and it’s the wise course today. 

taverns and churches

Nicholas Orme:

Looking at the rows and rows of seats in an English church, some of them dating back to the 15th century, invites questions. Why so many? Were they ever all filled, apart from an occasional wedding or funeral? The assumption is that they were full on Sundays, at least up to 1689, while parish-church attendance was compulsory. We tend to visualise an age of faith, especially up to the Reformation: a “world we have lost.”

There were, indeed, larger medieval congregations than today. Churchgoing was a valued social occasion when, especially in the countryside, there were few others. But the rows of seats are also misleading. They were put in so that people would have their own seats rather than take whatever was available. The congregation was laid out in an order of social precedence: gentry or merchants in the chancel or side chapels, yeomanry or citizens in the front of the nave, and lesser folk behind them.

They were almost all there on Easter Day, which, up to 1549, was a compulsory day of attendance to receive one’s single annual communion. Christmas and Whit Sunday were also obligatory days, although their congregations seem to have been a little smaller.

Attendance on an ordinary Sunday in medieval England was another matter, however. Contemporaries were clear that many people were absent. A succession of archbishops and bishops raged about the fact. The poet Alexander Barclay wrote in 1508: “the stalls of the tavern are stuffed with drinkers when in the church stalls [you] shall see few or none.”

waiting

I made an interesting discovery yesterday. (I’m sure others have already noticed it, but the insight is new to me.) Many readers will know this famous passage from Martin Luther King’s “Letter from the Birmingham Jail”: 

We have waited for more than 340 years for our constitutional and God given rights. The nations of Asia and Africa are moving with jetlike speed toward gaining political independence, but we still creep at horse and buggy pace toward gaining a cup of coffee at a lunch counter. Perhaps it is easy for those who have never felt the stinging darts of segregation to say, “Wait.” But when you have seen vicious mobs lynch your mothers and fathers at will and drown your sisters and brothers at whim; when you have seen hate-filled policemen curse, kick and even kill your black brothers and sisters; when you see the vast majority of your twenty million Negro brothers smothering in an airtight cage of poverty in the midst of an affluent society; when you suddenly find your tongue twisted and your speech stammering as you seek to explain to your six year old daughter why she can’t go to the public amusement park that has just been advertised on television, and see tears welling up in her eyes when she is told that Funtown is closed to colored children, and see ominous clouds of inferiority beginning to form in her little mental sky, and see her beginning to distort her personality by developing an unconscious bitterness toward white people; when you have to concoct an answer for a five year old son who is asking: “Daddy, why do white people treat colored people so mean?”; when you take a cross county drive and find it necessary to sleep night after night in the uncomfortable corners of your automobile because no motel will accept you; when you are humiliated day in and day out by nagging signs reading “white” and “colored”; when your first name becomes “n****r,” your middle name becomes “boy” (however old you are) and your last name becomes “John,” and your wife and mother are never given the respected title “Mrs.”; when you are harried by day and haunted by night by the fact that you are a Negro, living constantly at tiptoe stance, never quite knowing what to expect next, and are plagued with inner fears and outer resentments; when you are forever fighting a degenerating sense of “nobodiness”–then you will understand why we find it difficult to wait. There comes a time when the cup of endurance runs over, and men are no longer willing to be plunged into the abyss of despair. I hope, sirs, you can understand our legitimate and unavoidable impatience.

I have taught this essay many, many times over the years, and I have always zeroed in on this passage as an excellent illustration of the use of imitative form. King wants his (largely white) readers to know what it feels like to wait … and wait … and wait … — so he makes those readers wait … and wait … and wait … for the conclusion of the 316-word sentence that’s at the heart of this paragraph.  

Here’s my discovery. Right now I’m teaching Frederick Douglass’s Narrative, and in the final chapter of that mesmerizing book he writes this, an account of his experience as an escaped slave in the North when the Fugitive Slave Act was in effect: 

I have been frequently asked how I felt when I found myself in a free State. I have never been able to answer the question with any satisfaction to myself. It was a moment of the highest excitement I ever experienced. I suppose I felt as one may imagine the unarmed mariner to feel when he is rescued by a friendly man-of-war from the pursuit of a pirate. In writing to a dear friend, immediately after my arrival at New York, I said I felt like one who had escaped a den of hungry lions. This state of mind, however, very soon subsided; and I was again seized with a feeling of great insecurity and loneliness. I was yet liable to be taken back, and subjected to all the tortures of slavery. This in itself was enough to damp the ardor of my enthusiasm. But the loneliness overcame me. There I was in the midst of thousands, and yet a perfect stranger; without home and without friends, in the midst of thousands of my own brethren — children of a common Father, and yet I dared not to unfold to any one of them my sad condition. I was afraid to speak to any one for fear of speaking to the wrong one, and thereby falling into the hands of money-loving kidnappers, whose business it was to lie in wait for the panting fugitive, as the ferocious beasts of the forest lie in wait for their prey. The motto which I adopted when I started from slavery was this — “Trust no man!” I saw in every white man an enemy, and in almost every colored man cause for distrust. It was a most painful situation; and, to understand it, one must needs experience it, or imagine himself in similar circumstances. Let him be a fugitive slave in a strange land — a land given up to be the hunting-ground for slaveholders — whose inhabitants are legalized kidnappers — where he is every moment subjected to the terrible liability of being seized upon by his fellowmen, as the hideous crocodile seizes upon his prey! — I say, let him place himself in my situation — without home or friends — without money or credit — wanting shelter, and no one to give it — wanting bread, and no money to buy it, — and at the same time let him feel that he is pursued by merciless men-hunters, and in total darkness as to what to do, where to go, or where to stay, — perfectly helpless both as to the means of defence and means of escape, — in the midst of plenty, yet suffering the terrible gnawings of hunger, — in the midst of houses, yet having no home, — among fellow-men, yet feeling as if in the midst of wild beasts, whose greediness to swallow up the trembling and half-famished fugitive is only equalled by that with which the monsters of the deep swallow up the helpless fish upon which they subsist, — I say, let him be placed in this most trying situation, — the situation in which I was placed, — then, and not till then, will he fully appreciate the hardships of, and know how to sympathize with, the toil-worn and whip-scarred fugitive slave. 

A brilliant paragraph ending with a 239-word sentence that does exactly what MLK’s still-longer sentence would do more than a century later. I can’t help but think that MLK had this passage from Douglass in mind, if only unconsciously. Where Douglass uses dashes MLK uses semicolons; where Douglass uses “let him” MLK uses “when” — but the strategy and the effect are the same: holding the reader at a point of tension that the writer will offer release from only when the point is well-made. (The ultimate example of this strategy is Wagner’s use of the Tristan chord, which he resolves after four hours or so, but only a madman would take the business that far.) “Notice how tense you were as you were waiting for the conclusion to that sentence? Imagine that intensified and prolonged by a factor of ten thousand.”   

unknown unknowns

In the first printing of my biography of the Book of Common Prayer, I say that Thomas Cranmer was a Fellow of Jesus College, Oxford. This is incorrect. It was Jesus College, Cambridge. 

Now, I knew perfectly well that Cranmer was a Cambridge man. I just had a brain fart when I put him in Oxford. But the error has still nagged at me. I keep thinking: Would I have done that if I were English? That is, would the difference between Oxford and Cambridge be so vivid in an English person’s mind, especially an educated English person, that such a brain fart would be impossible? Or was it just a brain fart? Maybe in other circumstances I could with equal ease write that, say, Clarence Thomas’s J.D. is from Harvard or that FDR attended Yale.   

I can’t be sure. But the whole episode has made me more aware of all the things natives of a country know that foreigners, even affectionate and well-informed foreigners, have no clue about. My Cranmer error has had me musing about the fact that, while my academic speciality is 20th-century British literature, I may be completely, blissfully (or not so blissfully) ignorant about all sorts of matters concerning the world my writers grew up in that would be obvious to natives of their country.

Indeed I know I have such blanks in my knowledge. When I produced a critical edition of Auden’s long poem The Age of Anxiety, I annotated this passage: 

Listen courteously to us
Four reformers who have founded — why not? —
The Gung-Ho Group, the Ganymede Club
For homesick young angels, the Arctic League
Of Tropical Fish, the Tomboy Fund
For Blushing Brides and the Bide-a-wees
Of Sans-Souci, assembled again
For a Think-Fest … 

Here’s what I wrote: 

These titles are only partly explicable but are meant to suggest, ironically, that the four new acquaintances are the sort of people who would create social organizations devoted to good cheer and moral improvement. Ganymede was a beautiful young mortal who was abducted by Zeus to serve as the gods’ cupbearer; Auden may also have remembered the Junior Ganymede Club frequented by Jeeves and his fellow valets in the novels of P. G. Wodehouse. “Bide-a-wee” is a Scots phrase meaning “stay a while.” “Sans-souci” means “without care.” 

None of this is wrong … but: that excellent writer (and biographer of Auden) Richard Davenport-Hines wrote me to say that 

Sans Souci (in addition to being Frederick the Great’s summer palace at Potsdam) was together with Bide-a-Wee a common name, snobbishly mocked, given to cheap bungalows at down-market English seaside resorts to which lower-middle-class people might retire after a working life as a bank-teller, clerk in a town hall, supervisor in a small workshop, station-master on a small railroad, etc. This would be an immediate association to English readers of the 1940s, or to anyone of my generation. 

I wanted to smack myself for forgetting, or neglecting, Frederick’s summer palace, but that other stuff? I had no idea. And that is extremely distressing to me. 

Which leads me to my recently completed summer reading project: 


Five thousand nine hundred and seventy-nine pages later, I know so much more than I did about the social history of postwar Britain: random people of brief notoriety, appliances, food products, radio and TV shows, catchphrases etc. etc. The dark question remaining, though, is: How much of it will I be able to remember?

Indeed: Will I even realize, when coming across an item unfamiliar to me, that I could look it up in these books? I am often haunted by a shrewd point C. S. Lewis makes in his Studies in Words: sometimes words change their meanings in ways that don’t call themselves to our attention. Using the current meanings of those words, we can make sense of old sentences — just not the sense that the authors intended, or that readers of their era would have readily identified. 

Still, I am making progress, and Sandbrook’s books, while perhaps less scholarly than Kynaston’s in some respects, are wonderfully well-written and perfectly paced. They were a joy to read. 

There’s one more problem, though. Sandbrook’s project is ongoing — he wants to keep drawing closer to the present day. But … 

  • The first book in the series appeared in 2005 and covers seven years in 892 pages; 
  • The second book in the series appeared in 2006 and covers six years in 954 pages;
  • The third book in the series appeared in 2010 and covers four years in 755 pages;
  • The fourth book in the series appeared in 2012 and covers five years in 970 pages;
  • The fifth book in the series appeared in 2019 and covers three years in 940 pages. 

At that rate of progression I will be long dead by the time Sandbrook gets to Tony Blair. 

giving breath back to the dead

Justin E. H. Smith:

History in general is easily manipulable, and can always be applied for the pursuit of present goals, whatever these may be. It has long seemed to me that one of the more noble uses of history is to help us convince ourselves of the contingency of our present categories and practices. And it is for this reason, principally, that I am not satisfied with seeing history-of-philosophy curricula and conferences “diversified” as if seventeenth-century Europe were itself subject to our current DEI directives.

One particularly undesirable consequence of such use of history for the present is that it invites and encourages your political opponents likewise to marshall it for their own present ends. And in this way history becomes just another forked node of presentist Discourse — the foreign and unassimilable lives of all of those who actually lived in 1619 or 1776 are covered over. But history, when done most rigorously and imaginatively, gives breath back to the dead, and honors them in their humanity, not least by acknowledging and respecting the things they cared about, rather than imposing our own fleeting cares on them. Eventually, moreover, a thorough and comprehensive survey of the many expressions of otherness of which human cultures are capable in turn enables us, to speak with Seamus Heaney in his elegant translation of Beowulf, to “assay the hoard”: that is, to take stock of the full range of the human, and to begin to discern the commonalities behind the differences. 

Anyone who happens to know what my most recent book was about will not be surprised at how vigorously I nod my head at this. 

Berlins


One of the best stories in Michael Ignatieff’s biography of Isaiah Berlin involves a luncheon hosted at 10 Downing Street in 1944. Berlin had been for almost the whole of the war living in the USA, socializing and schmoozing and conspiring in his inimitable fashion and then sending back briskly incisive weekly reports on American attitudes towards the war effort and towards Great Britain in general. Churchill appreciated those reports very much and was pleased to have the opportunity to meet the man who had written them. 

But when he started asking his guest questions about America, he was surprised and puzzled by the vagueness and diffidence of the answers. Eventually, having gotten nowhere and feeling a bit desperate, the Prime Minister asked him what he thought was the best thing he had written. 

Came the reply: “White Christmas.” 

Isaiah Berlin was in Washington. The P.M.’s luncheon guest, it turned out, was Irving.  


Palaiphobia

I’ve been thinking a lot lately about the idea of the present moment as a Power, a power in the Pauline sense of a massively distributed, massively influential, universal agency directing the course of this world. It seems to me that the Present is a jealous God: it wants us to think only of itself, and never of the past or the future except insofar as images of them serve this instant.

But if we must think of either the past or the future, the Present prefers for us to think of the future, for two reasons: one, any future we imagine is just that, imaginary, and is really a kind of projection of our hopes and fears for our own moment. And two, thinking about the future produces anxiety, which has the effect of driving us back towards the Present, where we can be distracted from that anxiety. That is to say, the future is potentially useful to the Present in a way that the past is not.

Now, to be sure, we can interpret the past in such a way that we reinforce our current habits and attitudes and prejudices. (I have written about this in Breaking Bread with the Dead.) But this is of limited usefulness to the Present and is not really worth the risks. From the perspective of the Present, any genuine immersion in the past is likely to complicate our understanding of this moment and make it harder for us to know precisely what to do, because of all the complexity, good and bad, of the behavior of those who lived and fought and prayed and loved before us. If we learn to have compassion for those people, we just might translate that into compassion for the people who share this world with us but do not think just as we think. And that the Present cannot have.

Why does the Present not want this? Because present-mindedness is instantaneousness, it is automatic response, it is the gratification of whatever emotion happens to arise. As the poet Craig Raine has said, “all emotion is pleasurable” — this fact is the constant pole star of the Present.

These thoughts, though they’ve occupied me for a long time, were recently brought to the forefront of my mind by an essay on Harper Lee by Casey Cep, which contains this passage:

There is an important and interesting conversation happening now about the relevance of To Kill a Mockingbird to our country’s pursuit of racial justice and how we teach civic virtues like tolerance. For a long time, Lee’s novel has been one of the most banned books in the country, first criticized by conservatives who disapproved of its integrationist politics, then by liberals who disapproved of its use of racial slurs, and all along by censors of all persuasions who object to its depiction of rape and incest. Lately, though, the novel’s detractors are not calling for a ban or censorship, just retirement: taking it off of syllabi in order to make room for books by a more diverse group of authors, offering students work written with an eye to the current fight for racial justice, not one from the last century.

I don’t really care whether people keep reading To Kill a Mockingbird. What interests me about this paragraph is the idea — and it’s not necessarily Cep’s idea, just one that she rightly discerns as common — that there is a “current fight for racial justice” that’s different from “one from the last century.” But, you know, Dorothy Counts is still alive.


And Ruby Bridges is still alive — indeed, just now reaching retirement age.


John Lewis, the last of the Big Six, just died a few months ago. I myself remember quite vividly the integration of Birmingham’s schools. This isn’t ancient history we’re talking about, and we shouldn’t allow the artificial convention of “centuries” to deceive us into believing that it is. The story of Dorothy Counts and Ruby Bridges and John Lewis and all the rest of those amazing people who now get lumped into that comforting abstraction we call the Civil Rights Movement is our story, though the Present wants us to forget that, wants to separate us from our brothers and sisters, wants to break all chains that link us to one another — so that we can be wholly absorbed into Now and indulge our instantaneous emotions rather than reflect thoughtfully on the ways that the past is not dead, it is not even past.

The Present wants to infect us with what I have decided to call palaiphobia, from παλαιός, palaiós, old, worn out. That’s how it alienates us from one another, makes us wholly dependent on what it can offer: sentimentality and rage.


UPDATE: My friend Adam Roberts has, quite justifiably, wondered whether my coinage uses the right word. Here’s what I wrote in reply to him:

I have to say something about my decision to write of palaiphobia rather than archephobia. It was an agonizing one, I assure you. In these matters I take my bearings primarily from New Testament Greek, as you know, and of course there’s considerable overlap between the two words. When Paul writes in 2 Corinthians 5 that the old has gone and the new is here, he uses ἀρχαῖα; but when he talks about the “old leaven” in 1 Corinthians 5 he uses παλαιὰν ζύμην. As far as I can tell ἀρχαῖα and παλαιὰν would be interchangeable in those contexts. Both words can be neutral in their valence. But if you look at the overall patterns of usage, it seems that there’s something more generally disparaging about παλαιὸς, whereas there’s at least a potential dignity in ἀρχαῖα. When Paul talks, as he often does, about the “old man” that we must put off, he says παλαιὸς ἄνθρωπος. And παλαιὸς often has the connotation of something worn out, as when Jesus talks (in Matthew 9:16) about patching old clothes, ἱματίῳ παλαιῷ. I wanted to capture that disparagement in my coinage, the sense that the past is worn out, useless, of no value. I wonder if you think that makes sense? 


UPDATE 2: after a conversation with Robin Sloan:

an intractable problem

For anyone who thinks about the relations between the past and the present, intellectual maturity consists in understanding that (a) the world is always getting better in some ways and worse in other ways, and that (b) we have no reliable calculus by which we might assign a single composite value to those changes.

If Then

There are many books that I admire and love that I never for a moment dream I could have written. Right now I’m reading an old favorite, Susanna Clarke’s Jonathan Strange and Mr. Norrell, to my wife, and as I do so I am every minute aware that I could not write this book if you gave me a million years in which to do so.

But every now and then I encounter an admirable book that I wish I had written, that (if I squint just right) I see that I could have written. I had that experience a few years ago with Alexandra Harris’s Weatherland, and I’m having it again right now as I read Jill Lepore’s If Then. What a magnificent narrative. What a brilliant evocation of a moment in American history that is in one sense long gone and in another sense a complete anticipation of our own moment. Oh, the envy! (Especially since if I had written the book it wouldn’t be nearly as good as it is.)

We are afflicted by our ignorance of history in multiple ways, and one of my great themes for the past few years has been the damage that our presentism does to our ability to make political and moral judgments. It damages us in multiple ways. One of them, and this is the theme of my book Breaking Bread with the Dead, is that it makes us agitated and angry. When we, day by day and hour by hour, turn a firehose of distortion and misinformation directly into our own faces, we lose the ability to make measured judgments. We lash out against those we perceive to be our enemies and celebrate with an equally unreasonable passion those we deem to be our allies. We lack the tranquility and the “personal density” needed to make wise and balanced judgments about our fellow citizens and about the challenges we face.

But there is another and still simpler problem with our presentism: we have no idea whether we have been through anything like what we are currently going through. Some years ago I wrote about how comprehensively the great moral panic of the 1980s – the belief held by tens of millions of Americans that the childcare centers of America were run by Satan worshipers who sexually abused their charges – has been flushed down the memory hole. In this case, I think the amnesia has happened because a true reckoning with the situation would tell us so much about ourselves that we don’t want to know. It would teach us how credulous we are, and how when faced with lurid stories we lose our ability to make the most elementary factual and evidentiary discriminations. But of course our studied refusal to remember that particular event simply makes us more vulnerable to such panics today, especially given our unprecedentedly widespread self-induced exposure to misinformation.

Even more serious, perhaps, is our ignorance – in this case not so obviously motivated but the product rather of casual neglect — of the violent upheavals that rocked this nation in the 1960s and 1970s. Politicians and pastors and podcasters and bloggers can confidently assert that we are experiencing unprecedented levels of social mistrust and unrest, having conveniently allowed themselves to remain ignorant of what this country was like fifty years ago. (And let’s leave aside the Civil War altogether, since that happened in a prehistoric era.) Rick Perlstein is very good on this point, as I noted in this post.

All of this brings us back to Jill Lepore’s tale of the rise and fall of a company called Simulmatics, and the rise and rise and rise, in the subsequent half-century, of what Simulmatics was created to bring into being. Everything that our current boosters of digital technology claim for their machines was claimed by their predecessors sixty years ago. The worries that we currently have about the power of technocratic overlords began to be uttered in congressional hearings and in the pages of magazines and newspapers fifty years ago. Postwar technophilia, Cold War terror, technological solutionism, racial unrest, counterculture liberationism, and free-market libertarianism — these are the key ingredients of the mixture in which our current moment has been brewed.

Let me wrap up this post with three quotations from If Then that deserve a great deal of reflection. The first comes from early in the book:

The Cold War altered the history of knowledge by distorting the aims and ends of American universities. This began in 1947, with the passage of the National Security Act, which established the Joint Chiefs of Staff, created the Central Intelligence Agency and the National Security Council, and turned the War Department into what would soon be called the Department of Defense, on the back of the belief that defending the nation’s security required massive, unprecedented military spending in peacetime. The Defense Department’s research and development budget skyrocketed. Most of that money went to research universities — the modern research university was built by the federal government — and the rest went to think tanks, including RAND, the institute of the future. There would be new planes, new bombs, and new missiles. And there would be new tools of psychological warfare: the behavioral science of mass communications.

The second quotation describes the influence of Ithiel de Sola Pool — perhaps the central figure in If Then, one of the inventors of behavioral data science and a man dedicated to using that data to fight the Cold War, win elections, predict and forestall race riots by black people, and end communism in Vietnam — on that hero of the counterculture Stewart Brand:

Few people read Pool’s words more avidly than Stewart Brand. “With each passing year the value of this 1983 book becomes more evident,” he wrote. Pool died at the age of sixty-six in the Orwellian year of 1984, the year Apple launched its first Macintosh, the year MIT was establishing a new lab, the Media Lab. Two years later, Brand moved to Cambridge to take a job at the Media Lab, a six-story, $45 million building designed by I. M. Pei and named after Jerome Wiesner, a building that represented nothing so much as a newer version of Buckminster Fuller’s geodesic dome. Brand didn’t so much conduct research at the Media Lab as promote its agenda, as in his best-selling 1987 book, The Media Lab: Inventing the Future at M.I.T. The entire book, Brand said, bore the influence of Ithiel de Sola Pool, especially his Technologies of Freedom. “His book was the single most helpful text in preparing the book you’re reading and the one I would most recommend for following up issues raised here,” Brand wrote. “His interpretations of what’s really going on with the new communications technologies are the best in print.” Brand cited Pool’s work on page after page after page, treating him as the Media Lab’s founding father, which he was.

And finally: One of Lepore’s recurrent themes is the fervent commitment of the Simulmatics crew to a worldview in which nothing in the past matters and all we need is to study the Now in order to predict and control the Future:

Behavioral data science presented itself as if it had sprung out of nowhere or as if, like Athena, it had sprung from the head of Zeus. The method Ed Greenfield dubbed “simulmatics” in 1959 was rebranded a half century later as “predictive analytics,” a field with a market size of $4.6 billion in 2017, expected to grow to $12.4 billion by 2022. It was as if Simulmatics’ scientists, first called the “What-If Men” in 1961, had never existed, as if they represented not the past but the future. “Data without what-if modeling may be the database community’s past,” according to a 2011 journal article, “but data with what-if modeling must be its future.” A 2018 encyclopedia defined “what-if analysis” as “a data-intensive simulation,” describing it as “a relatively recent discipline.” What if, what if, what if: What if the future forgets its past?

Which of course it has. Which of course it must, else it loses its raison d’être. Thus the people who most desperately need to read Lepore’s book almost certainly never will. It’s hard to imagine a better case for the distinctive intellectual disciplines of the humanities than the one Lepore made just by writing If Then. But how to get people to confront that case who are debarred by their core convictions from taking it seriously, from even considering it?

the Great Crumping revisited

A surprising number of readers of my previous post have written out of concern for my state of mind, which is kind of them, but I think they have read as a cri de coeur what was meant as a simple summary of the facts. Not pleasant facts, I freely admit, but surely uncontroversial ones. Stating them so bluntly is just one element of my current period of reflection.

The primary reason I am not in despair is simply this: I know some history. I think we will probably see, in the coming decades, the dramatic reduction or elimination of humanities requirements and the closure of whole humanities departments in many American universities, but that will not mean the death of the humanities. Humane learning, literature, art, music have all thrived in places where they were altogether without institutional support. Indeed, I have suggested that it is in times of the breaking of institutions that poetry becomes as necessary as bread.

Similarly, while attendance at Episcopalian and other Anglican churches has been dropping at a steep rate for decades, and I expect will in my lifetime dwindle to nearly nothing, there will still be people worshipping with the Book of Common Prayer as long as … well, as long as there are people, I think. And if evangelicalism completely collapses as a movement — for what it’s worth, I think it already has — that will simply mean a return to an earlier state of affairs. The various flagship institutions of American evangelicalism are (in their current form at least) about as old as I am. The collapse I speak of is, or will be, simply a return to a status quo ante bellum, the bellum in question being World War II, more or less. And goodness, it’s not as if even the Great Awakening had the kind of impact on its culture, all things demographically considered, that one might suspect from its name and from its place in historians’ imaginations.

This doesn’t mean I don’t regret the collapse of the institutions that have helped to sustain me throughout my adult life. I do, very much. And I will do my best to help them survive, if in somewhat constrained and diminished form. But Put not your trust in institutions, as no wise man has ever quite said, even as you work to sustain them. It’s what (or Who) stands behind those institutions and gives them their purpose that I believe in and ultimately trust.

The Great Crumping is going on all around me. But if there’s one thing that as a Christian and a student of history I know, it’s this: Crump happens.

Proudhon to Marx

Lyon, 17 May 1846:

Let us seek together, if you wish, the laws of society, the manner in which these laws are realized, the process by which we shall succeed in discovering them; but, for God’s sake, after having demolished all the a priori dogmatisms, do not let us in our turn dream of indoctrinating the people; do not let us fall into the contradiction of your compatriot Martin Luther, who, having overthrown Catholic theology, at once set about, with excommunication and anathema, the foundation of a Protestant theology. For the last three centuries Germany has been mainly occupied in undoing Luther’s shoddy work; do not let us leave humanity with a similar mess to clear up as a result of our efforts. I applaud with all my heart your thought of bringing all opinions to light; let us carry on a good and loyal polemic; let us give the world an example of learned and far-sighted tolerance, but let us not, merely because we are at the head of a movement, make ourselves the leaders of a new intolerance, let us not pose as the apostles of a new religion, even if it be the religion of logic, the religion of reason. Let us gather together and encourage all protests, let us brand all exclusiveness, all mysticism; let us never regard a question as exhausted, and when we have used our last argument, let us begin again, if need be, with eloquence and irony. On that condition, I will gladly enter your association. Otherwise — no!

Emphases mine. Edmund Wilson: “The result of this incident was that Marx was to set upon Proudhon’s new book with a ferocity entirely inconsonant with the opinion of the value of Proudhon’s earlier work which he had expressed and which he was to reiterate later” (To the Finland Station, p. 154).

readings

Gary Dorrien:

Here is where Temple still matters as a theorist of guild socialism. In the early 1940s, both before and after he became Archbishop of Canterbury, Temple got very specific about how to democratize economic power. He was incredulous that modern democracies tolerated big private banks, lamented that Christian socialists turned away in the 1890s from the land issue, and proposed a new form of guild socialism. The banks, he argued, should be turned into utilities or socialized; otherwise the rich controlled the process of investment. God made the land for everyone, and society creates the unearned increment in the value of land; therefore the increment should go to society. Above all, though Temple took for granted that certain natural monopolies must be nationalized, the centerpiece of his proposal was an excess-profits tax payable in the form of shares to worker funds. These funds, over time, would gain democratic control over enterprises. Economic democracy, he argued, can be achieved gradually, peaceably, and on decentralized terms, without abolishing economic markets or making heroic demands on the political system.


Randall Kennedy:

The ultimatum complains that, in its view, past initiatives aimed at enlarging the number of faculty of color at Princeton have “failed” because in 2019–20 “among 814 faculty, there were 30 Black, 31 Latinx, and 0 Indigenous persons. That’s 7%.” According to the ultimatum, this “is not progress by any standard; it falls woefully short of U.S. demographics as estimated by the U.S. Census Bureau, which reports Black and Hispanic persons at 32% of the total population.”

The suggestion that these statistics show racial unfairness in hiring at Princeton is misleading. According to the Journal of Blacks in Higher Education, African Americans in recent years earned only around 7 percent of all doctoral degrees. In engineering it was around 4 percent. In physics around 2 percent. Care must be taken to look for talent in places other than the familiar haunts of Ivy League searches. But even when such care is taken, the resultant catch is almost invariably quite small.

The reasons behind the small numbers are familiar and heart-breaking. They include a legacy of deprivation in education, housing, employment, and health care, not to mention increased vulnerability to crime and incarceration. The perpetuation of injuries from past discrimination as well as the imposition of new wrongs cut like scythes into the ranks of racial minorities, cruelly winnowing the number who are even in the running to teach at Princeton.

The racial demographics of its faculty does not reflect a situation in which the university is putting a thumb on the scale against racial-minority candidates. To the contrary, the university is rightly putting a thumb on the scale in favor of racial-minority candidates. That the numbers remain small reflects the terrible social problems that hinder so many racial minorities before they even have a fighting chance to enter into the elite competitions from which Princeton selects its instructors. The ultimatum denies or minimizes this pipeline problem.


Peter Brown:

Many of Ambrose’s contemporaries were quietly convinced that the ills of Roman society had a supernatural origin. Many of the sharpest critics of their age were not Christians; they were pagans. For them, bad times had begun with the “national apostasy” of Constantine. The rampant avarice denounced by pagan authors was thought to go hand in hand with the spoliation of the temples and the abandonment of the old religion.

Ambrose had to answer such views. He did so by subtly secularizing the contemporary discourse on decline. He turned what many thinking persons considered a religious crisis into a crisis of social relations. We moderns tend to applaud Ambrose for the perspicacity of his diagnosis of the weaknesses of Roman society. But pagans such as Symmachus would have regarded Ambrose’s criticisms of society as mere whistling in the dark. Symmachus knew why things had gone wrong. The moment that the first fruits of the fields of Italy that had fed the Vestal Virgins for 1,200 years were withdrawn (in 382), the link between the land and the gods was broken.

advice for journalists

Andrew Sullivan writes,

Online is increasingly where people live. My average screen time this past week was close to ten hours a day. Yes, a lot of that is work-related. But the idea that I have any real conscious life outside this virtual portal is delusional. And if you live in such a madhouse all the time, you will become mad. You don’t go down a rabbit-hole; your mind increasingly is the rabbit hole — rewired that way by algorithmic practice. And you cannot get out, unless you fight the algorithms to a draw, or manage to exert superhuman discipline and end social media use altogether. […]

In the past, we might have turned to more reliable media for context and perspective. But the journalists and reporters and editors who are supposed to perform this function are human as well. And they are perhaps the ones most trapped in the social media hellscape. You can read them on Twitter, where they live and posture and rank themselves, or on their Slack channels, where they gang up on and smear any waverers. They’ve created an insulated world where any small dissent from groupthink is professional death. Watch Fox, CNN or MSNBC, and it’s the same story.

Point out missing facts or context, exercise some independence of judgment, push back against the narrative — and you’ll be first subject to ostracism and denunciation by your newsroom peers, and then, if you persist, you’ll be fired. The press could have been the antidote to the social media trap. Instead they chose to become the profitable pusher of the poison.

This is precisely and tragically correct.

I immediately wrote to Andrew to tell him that he needs my new book, stat. But even Andrew, who writes on a weekly basis, who has stepped back from the moment-by-moment insanity of journalistic Twitter (and from the hour-by-hour insanity of the old Dish), probably doesn’t have time to step back a bit further still over the next few weeks and read some old books.

Or doesn’t believe he has time. Maybe, and maybe for journalists more than for anyone else, this is in fact the perfect, the ideal, the necessary moment to recover “real conscious life outside this virtual portal.” One might begin with the epistles of Horace, a man who in exile from Rome learned to love the countryside. Just a thought.

lessons for activists from Edmund Burke

We do not draw the moral lessons we might from history. On the contrary, without care it may be used to vitiate our minds and to destroy our happiness. In history a great volume is unrolled for our instruction, drawing the materials of future wisdom from the past errors and infirmities of mankind…. History consists, for the greater part, of the miseries brought upon the world by pride, ambition, avarice, revenge, lust, sedition, hypocrisy, ungoverned zeal, and all the train of disorderly appetites, which shake the public with the same “troublous storms that toss / The private state, and render life unsweet.” These vices are the causes of those storms. Religion, morals, laws, prerogatives, privileges, liberties, rights of men, are the pretexts. The pretexts are always found in some specious appearance of a real good…. As these are the pretexts, so the ordinary actors and instruments in great public evils are kings, priests, magistrates, senates, parliaments, national assemblies, judges, and captains. You would not cure the evil by resolving, that there should be no more monarchs, nor ministers of state, nor of the gospel; no interpreters of law; no general officers; no public councils. You might change the names. The things in some shape must remain. A certain quantum of power must always exist in the community, in some hands, and under some appellation. Wise men will apply their remedies to vices, not to names; to the causes of evil which are permanent, not to the occasional organs by which they act, and the transitory modes in which they appear. Otherwise you will be wise historically, a fool in practice.

— Burke, Reflections on the Revolution in France (1790)

Mammon

Eugene McCarraher’s The Enchantments of Mammon is a remarkable book, beautifully written and narrated with great verve. However, I do not find its argument persuasive. You can only find the argument persuasive if you accept the book’s two key axioms, and I don’t. I say “axioms” because McCarraher doesn’t argue either of them, he takes them as … well, axiomatic.

The first axiom is that the Middle Ages in Europe, especially western Europe, constituted a great coherent sacramental order — McCarraher loves the word “sacramental” and uses it to designate pretty much everything he approves of — a “moral economy,” a “theological economy,” in which “preparation for [God’s] kingdom was the point of economic life.” Even the vast waves of rebellion and protest that occurred throughout the Middle Ages weren’t protests against that order as such, only complaints that the political and ecclesial authorities weren’t properly fulfilling their duties. McCarraher cites as characteristic those rebellions (e.g. Wat Tyler’s) of which this could plausibly be said, and ignores all the more radical protests. (See Chapter 3 of Norman Cohn’s The Pursuit of the Millennium for examples.) Similarly, he presents the political-eschatological vision of Piers Plowman as though it’s typical of its age, when it isn’t typical at all, any more than William Blake’s poems are typical of England circa 1800. A fair look at what we know suggests that the Middle Ages was as governed by greed and lust and the libido dominandi as any other era, and if God had spoken to the rulers of that age he would have asked them, “What mean ye that ye beat my people to pieces, and grind the faces of the poor?” I think the great preponderance of the evidence suggests that McCarraher’s portrait of the Middle Ages is a pious fiction.

The second governing axiom of The Enchantments of Mammon is that modern capitalism, which was, he says, created by Protestantism, sacralizes wealth in a way that no previous system did. But I don’t think that’s true. In all times and all places that I know of, wealth is generally perceived as a sign of divine favor. The peculiarity of the Axial Age is that in that era there arise, in various places around the world, philosophical and theological traditions suggesting that human flourishing cannot be measured in money and possessions. (After all, one of the key features of the land that Yahweh promises to the wandering Israelites is that it flows with milk and honey, is a place in which the people can prosper economically. It’s only when we get to the book of Proverbs that we discern a serious ambivalence about all this. Proverbs 22:4: “The reward for humility and fear of the Lord is riches and honor and life.” But also Proverbs 11:4: “Riches do not profit in the day of wrath, but righteousness delivers from death.”)

At one point early on McCarraher writes, “Talking about consumerism is a way of not talking about capitalism.” This is true. However, it could equally well be said that talking about capitalism is a way of not talking about money. “The love of money is the root of all evil” was written long, long before the rise of the capitalist order that McCarraher denounces. Moreover, talking about money can be a way of not talking about human acquisitiveness, about the sin of greed. The tension between acquisitiveness and a dawning awareness that acquisitiveness isn’t everything goes as far back as the Odyssey, the great predecessor (or maybe great initiator) of the Axial Age sensibility, the first major work of art against greed, which suggests that only tragic suffering — like that of Odysseus, exiled and imprisoned far from his homeland, or Menelaus, whose family is destroyed as he gains godlike riches — is sufficient to break the hold of greed upon us. It seems to me that McCarraher is blaming capitalism for something that is a permanent feature of human life.

Anyway: If you believe that the Middle Ages were a veritable paradise of sacramental order and that the sacralizing of greed is an invention of modern capitalism — which is to say, if you believe that pretty much everything bad in the world is the result of Protestantism — then you may well find McCarraher’s case compelling. But even if you don’t accept those axioms, it’s a terrific read full of fascinating anecdotes. I’ll be thinking about it for a long time.

lessons from the past

Once I saw how things were going I got away from Twitter and stopped checking my RSS reader. I didn’t want to hear any news, mainly because I knew that there wouldn’t be much news as such — though there would be vast tracts of reason-bereft, emotionally-incontinent opinionating. And I don’t need that. Ever.

Instead, I’ve just read Ron Chernow’s massive and quite excellent biography of George Washington. It’s been a useful as well as an enjoyable experience. For one thing, it reminds me what actual leadership looks like; and for another, it reminds me of how deeply flawed even the best leaders can be, and how profoundly wrong. As is always the case when I spend some serious time with the past, I get perspective. I see the good and the bad of my own time with more clarity and accuracy. And if anything vital has happened over the past few days while I’ve been reading, I’ve got plenty of time to catch up. As I’ve said before, I prefer to take my news a week at a time anyway, not minute by minute.

One theme in Chernow’s biography particularly sticks with me. It concerns the end of Washington’s second administration and his brief period as an ex-President (he lived only two-and-a-half years after departing the office). That was a time when political parties in something like the modern sense of the term — though many of the Founders referred to the power and danger of “Factions” — dramatically strengthened. It was also a time when journalists who supported one faction would say pretty much anything to discredit the other one, would make up any sort of tale. John Adams, violently angered by all the lies told about him and his colleagues — and there really were lies, outrageous lies, about him, just as there were about Washington near the end of his second term — strongly supported and oversaw the passage of the Alien and Sedition Acts to punish journalistic bias and fake news, as well as to deter immigration — though Adams and his fellow in-power Federalists were only concerned to punish the lies that worked against their policies, not the lies that worked in their favor. (Plenty of those circulated also.) Washington, rather surprisingly and disappointingly, thought these laws good ones.

I draw certain lessons from this whole sordid history. Habitual dishonesty, on the part of politicians and journalists alike, inflames partisanship. You exaggerate the nastiness of your political opponents, and that leads people in your camp to think of the other side not as fellow citizens with whom you disagree on policy, but rather something close to moral monsters. Gratified by your increasingly tight bond with the like-minded, you stretch your exaggerations into outright lies, which gain even more rapturous agreement within your Ingroup and more incandescent anger against the Outgroup. Dishonesty begets partisanship, and partisanship begets further dishonesty. It’s a classic vicious circle.

And one more phenomenon is begotten by all this malice. Wheeled around in this accelerating circle of your own devising, you become incapable of comprehending, or even seeing, something that it is absolutely necessary for every wise social actor to know: In some situations all parties act badly. Sometimes there are no good guys, just fools and knaves, blinded to the true character of their own behavior by their preening partisan self-righteousness.

Lucy Ellmann and old books

We’re at the copy-editing stage of my next book, so it’s too late to add anything, but goodness, I wish I could squeeze in this from Lucy Ellmann:

Some time ago I pretty much decided to read only books written before the atom bomb was dropped, when everything changed for all life on Earth. The industrial revolution’s bad enough, but nuclear weapons really are party-poopers.

I don’t stick strictly to this policy, but I often find it more rewarding to read what people thought about, and what they did with literature, before we were reduced by war and capitalism to mere monetary units, bomb fodder and password generators. And before the natural world became a depository for plastics and nuclear waste.

Anger and alienation have resulted, and they’re fine subjects, but there are times when you’d like to remember some of the higher points in the history of civilisation as well, and the natural world before we learned to view it all as tainted. The intense humour, innocence, sexiness and play of The Life and Opinions of Tristram Shandy, Gentleman, for instance – could this have been written after Hiroshima? Could Gargantua and Pantagruel? Don Quixote? Emma? I don’t see how. Thanks to the offences of patriarchy, a lot of the fun has gone out of being human, and I like books that look at life in less constricted ways.

I love this statement because of its provocations — provocations that should be assessed with some care. For one thing, I don’t see that the variety of books has especially diminished since Hiroshima — I mean, Chicken Soup for the Soul and The Shack and a whole bunch of adult coloring books have appeared since the end of that war — and if we’re missing the distinctive (and very funny) kind of “sexiness” in Tristram Shandy I think that has a lot more to do with the sexual revolution and its unanticipated consequences than with the atom bomb. 

But the idea that before the industrial revolution, with its accompanying “war and capitalism” and reduction of persons to “mere monetary units,” the natural world was perceived in a radically different way — that’s promising. Though I think the main thing that should be said about pre-modern nature is not that it was untainted but that it was scary as shit. Which is just as much worthy of our interest and reflection. 

We could debate such matters all day — and should! Because there’s no question that the past really is another country, though they don’t do everything differently there. Trying to understand the continuities as well as the discontinuities is what reading books from the past is all about.

Anyway, my book is going to be great on all that stuff. Make sure it’s the only book published in 2020 that you buy in 2020, okay? Otherwise, stick with the old stuff. You’ll be glad you did. 

Automata, Animal-Machines, and Us

What follows is a review of The Restless Clock: A History of the Centuries-Long Argument over What Makes Living Things Tick, by Jessica Riskin (University of Chicago Press, 2016). The review appeared in the sadly short-lived online journal Education and Culture, and has disappeared from the web without having been saved by the Wayback Machine. So I’m reposting it here. My thanks to that paragon of editors John Wilson for having commissioned it. 

1.

The last few decades have seen, in a wide range of disciplines, a strenuous rethinking of what the material world is all about, and especially what within it has agency. For proponents of the “Gaia hypothesis,” the whole world is a single self-regulating system, a sort of mega-organism with its own distinctive character. By contrast, Richard Dawkins dramatically shifted the landscape of evolutionary biology in his 1976 book The Selfish Gene (and later in The Extended Phenotype of 1982) by arguing that agency happens not at the level of the organism but at the level of the gene: an organism is a device by means of which genes replicate themselves.

Meanwhile, in other intellectual arenas, proponents of the interdisciplinary movement known as “object-oriented ontology” (OOO) and the movement typically linked with it or seen as its predecessor, “actor-network theory” (ANT), want to reconsider a contrast that underlies most of our thinking about the world we live in. That contrast is between humans, who act, and the rest of the known cosmos, which either behaves or is merely passive, acted upon. Proponents of OOO and ANT tend to doubt whether humans really are unique “actors,” but they don’t spend a lot of time trying to refute the assumption. Instead, they try to see what the world looks like if we just don’t make it.

In very general terms we may say that ANT wants to see everything as an actor and OOO wants to see everything as having agency. (The terms are related but, I think, not identical.) So when Bruno Latour, the leading figure in ANT, describes a seventeenth-century scene in which Robert Hooke demonstrates the working of a vacuum pump before the gathered worthies of the Royal Society, he sees Hooke as an actor within a network of power and knowledge. But so is the King, who granted to the Society a royal charter. And so is the laboratory, a particularly complex creation composed of persons and things, one that generates certain types of behavior and discourages or wholly prevents others. So even is the vacuum itself — indeed it is the status of the vacuum as actor that the whole scene revolves around.

For the object-oriented ontologist, similarly, the old line that “to a man with a hammer, everything looks like a nail” is true, but not primarily because of certain human traits, but rather because the hammer wants to pound nails. For the proponent of OOO, there are no good reasons for believing that the statement “This man wants to use his hammer to pound nails” makes any more sense than “This hammer wants to pound nails.”

Some thinkers are completely convinced by this account of the agency of things; others believe it’s nonsense. But very few on either side know that the very debates they’re conducting have been at the heart of Western thought for five hundred years. Indeed, much of the intellectual energy of the modern era has been devoted to figuring out whether the non-human world is alive or dead, active or passive, full of agency or wholly without it. Jessica Riskin’s extraordinary book The Restless Clock tells that vital and almost wholly neglected story.

It is a wildly ambitious book, and even its 500 pages are probably half as many as a thorough telling of its story would require. (The second half of the book, covering events since the Romantic era, seems particularly rushed.) But a much longer book would have struggled to find a readership, and this is a book that needs to be read by people with a wide range of interests and disciplinary allegiances. Riskin and her editors made the right choice to condense the exceptionally complex story, which I will not even try to summarize here; the task would be impossible. I can do little more than point to some of the book’s highlights and suggest some of its argument’s many implications.

2.

Riskin’s story focuses wholly on philosophers — including thinkers that today we would call “scientists” but earlier were known as “natural philosophers” — but the issues she explores have been perceived, and their implications considered, throughout society. For this reason a wonderful companion to The Restless Clock is the 1983 book by Keith Thomas, Man and the Natural World: A History of the Modern Sensibility 1500–1800, one of the most illuminating works of social history I have ever read. In that book, Thomas cites a passage from Fabulous Histories Designed for the Instruction of Children (1788), an enormously popular treatise by the English educational reformer Sarah Trimmer:

‘I have,’ said a lady who was present, ‘been for a long time accustomed to consider animals as mere machines, actuated by the unerring hand of Providence, to do those things which are necessary for the preservation of themselves and their offspring; but the sight of the Learned Pig, which has lately been shewn in London, has deranged these ideas and I know not what to think.’

This lady was not the only one so accustomed, or so perplexed; and much of the story Riskin has to tell is summarized, in an odd way, in this statement. First of all, there is the prevalence in the early modern and Enlightenment eras of automata: complex machines designed to counterfeit biological organisms that then provided many people the vocabulary they felt they needed to explain those organisms. Thus the widespread belief that a dog makes a noise when you kick it in precisely the same way, and for precisely the same reasons, that a horn makes a noise when you blow into it. These are “mere machines.” (And yes, it occurred to more than a few people that one could extend the logic to humans, who also make noises when kicked. The belief in animals as automata was widespread but by no means universal.)

The second point to be extracted from Trimmer’s anecdote is the lady’s belief that these natural automata are “actuated by the unerring hand of Providence” — that their efficient working is a testimony to the Intelligent Design of the world.

And the third point is that phenomena may be brought to public attention — animals trained to do what we had thought possible only by humans, or automata whose workings are especially inscrutable, like the famous chess-playing Mechanical Turk — that call into question the basic assumptions that separate the world into useful categories: the human and the non-human, the animal and all that lies “below” the animal, the animate and the inanimate. Maybe none of these categories are stable after all.

These three points generate puzzlement not only for society ladies, but also (and perhaps even more) for philosophers. This is what The Restless Clock is about.

3.

One way to think of this book — Riskin herself does not make so strong a claim, but I think it warranted — is as a re-narration of the philosophical history of modernity as a series of attempts to reckon with the increasing sophistication of machines. Philosophy on this account becomes an accidental by-product of engineering. Consider, for instance, an argument made in Diderot’s philosophical dialogue D’Alembert’s Dream, as summarized by Riskin:

During the conversation, “Diderot” introduces “d’Alembert” to such principles as the idea that there is no essential difference between a canary and a bird-automaton, other than complexity of organization and degree of sensitivity. Indeed, the nerves of a human being, even those of a philosopher, are but “sensitive vibrating strings,” so that the difference between “the philosopher-instrument and the clavichord-instrument” is just the greater sensitivity of the philosopher-instrument and its ability to play itself. A philosopher is essentially a keenly sensitive, self-playing clavichord.

But why does Diderot even begin to think in such terms? Largely because of the enormous notoriety of Jacques de Vaucanson, the brilliant designer and builder of various automata. Vaucanson is best known today for his wondrous mechanical duck, which quacked, snuffled its snout in water, flapped its wings, ate food placed before it, and — most astonishing of all to the general populace — defecated what it had consumed. (Though in fact Vaucanson had, ahead of any given exhibition of the duck’s prowess, placed some appropriately textured material in the cabinet on which the duck stood so the automaton could be made to defecate on command.) Voltaire famously wrote that without “Vaucanson’s duck, you would have nothing to remind you of the glory of France,” but he genuinely admired the engineer, as did almost everyone who encountered his projects, the most impressive of which, aside perhaps from the duck, were human-shaped automata — androids, as they were called, thanks to a coinage by the seventeenth-century librarian Gabriel Naudé — one of which played a flute, the other a pipe and drum. These devices, and the awe they produced in observers, led quite directly to Diderot’s speculations about the philosopher as a less harmonious variety of clavichord.

And if machines could prompt philosophy, do they not have theological implications as well? Indeed they do, though, if certain stories are to be believed, the machine/theology relationship can be rather tense. In the process of coining the term “android,” Naudé relates (with commendable skepticism) the claim of a 15th-century writer that Albertus Magnus, the great medieval bishop and theologian, had built a metal man. This automaton answered any question put to it and even, some said, dictated to Albertus hundreds of pages of theology he later claimed as his own. But the mechanical theologian met a sad end when one of Albertus’s students grew exasperated by “its great babbling and chattering” and smashed it to pieces. This student’s name was Thomas Aquinas.

The story is far too good to be true, though its potential uses are so many and varied that I am going to try to believe it. The image of Thomas, the apostle of human thought and of the limits of human thought, who wrote the greatest body of theology ever composed and then at the end of his life dismissed it all as “straw,” smashing this simulacrum of philosophy, this Meccano idol — this is too perfect an exemplum not to reward our contemplation. By ending the android’s “babbling and chattering” and replacing it with patient, careful, and rigorous dialectical disputation, Thomas restored human beings to their rightful place atop the visible part of the Great Chain of Being, and refuted, before they even arose, Diderot’s claims that humans are just immensely sophisticated machines.

Yet one of the more fascinating elements of Riskin’s narrative is the revelation that the fully-worked-out idea of the human being as a kind of machine was introduced, and became commonplace, late in the 17th century by thinkers who employed it as an aid to Christian apologetics — as a way of proving that we and all of Creation are, as Sarah Trimmer’s puzzled lady put it, “actuated by the unerring hand of Providence.” Thus Riskin:

“Man is a meer piece of Mechanism,” wrote the English doctor and polemicist William Coward in 1702, “a curious Frame of Clock-Work, and only a Reasoning Engine.” To any potential critic of such a view, Coward added, “I must remind my adversary, that Man is such a curious piece of Mechanism, as shews only an Almighty Power could be the first and sole Artificer, viz., to make a Reasoning Engine out of dead matter, a Lump of Insensible Earth to live, to be able to Discourse, to pry and search into the very nature of Heaven and Earth.”

Since Max Weber in the early 20th century it has been a commonplace that Protestant theology “disenchants” the world, purging the animistic cosmos of medieval Catholicism of its panoply of energetic spirits, good, evil, and ambiguous. But Riskin demonstrates convincingly that this purging was done in the name of a sovereign God in whom all spiritual power was believed to dwell. This is not simply a story of the rise of a materialist science that triumphed over and marginalized religion; rather, the world seen as a passive mechanical artifact “relied upon a supernatural, divine intelligence. It was inseparably and in equal parts a scientific and a theological model.” So when Richard Dawkins wrote, in 2006,

People who think that robots are by definition more ‘deterministic’ than human beings are muddled (unless they are religious, in which case they might consistently hold that humans have some divine gift of free will denied to mere machines). If, like most of the critics of my ‘lumbering robot’ passage, you are not religious, then face up to the following question. What on earth do you think you are, if not a robot, albeit a very complicated one?

— he had no idea that he was echoing the argument of an early-18th-century apologist for Protestant Christianity.

Again, the mechanist position was by no means universally held, and Riskin gives attention throughout to figures who either rejected this model or modified it in interesting ways: here the German philosopher Leibniz (1646–1716) is a particularly fascinating figure, in that he admired the desire to preserve and celebrate the sovereignty of God even as he doubted that a mechanistic model of the cosmos was the way to do it. But the mechanistic model which drained agency from the world, except (perhaps) for human beings, eventually carried the day.

One of the most important sections of The Restless Clock comes near the end, where Riskin demonstrates (and laments) the consequences of modern scientists’ ignorance of the history she relates. For instance, almost all evolutionary theorists today deride Jean-Baptiste Lamarck (1744–1829), one of the first major figures to argue that evolutionary change happens throughout the world of living things, because he believed in a causal mechanism that Darwin would later reject: the ability of creatures to transmit acquired traits to their offspring. Richard Dawkins calls Lamarck a “miracle-monger,” but Lamarck didn’t believe he was doing any such thing: rather, by attributing to living things a vital power to alter themselves from within, he was actually making it possible to explain evolutionary change without (as the astronomer Laplace put it in a different context) having recourse to the hypothesis of God. The real challenge for Lamarck and other thinkers of the Romantic era, Riskin argues, was this:

How to revoke the monopoly on agency that mechanist science had assigned to God? How to bring the inanimate, clockwork cosmos of classical mechanist science back to life while remaining as faithful as possible to the core principles of the scientific tradition? A whole movement of poets, physiologists, novelists, chemists, philosophers and experimental physicists — roles often combined in the same person — struggled with this question. Their struggles brought the natural machinery of contemporary science from inanimate to dead to alive once more. The dead matter of the Romantics became animate, not at the hands of an external Designer, but through the action of a vital agency, an organic power, an all-embracing energy intrinsic to nature’s machinery.

This “vitalist” tradition was one with which Charles Darwin struggled, in ways that are often puzzling to his strongest proponents today because they do not know this tradition. They think that Darwin was exhibiting some kind of post-religious hangover when he adopted, or considered adopting, Lamarckian ideas, but, Riskin convincingly demonstrates, “insofar as Darwin adopted Lamarck’s forces of change, he did so not out of a failure of nerve or an inability to carry his own revolution all the way, but on the contrary because he too sought a rigorously naturalist theory and was determined to avoid the mechanist solution of externalizing purpose and agency to a supernatural god.”

Similarly, certain 20th-century intellectual trends, most notably the cybernetics movement (associated primarily with Norbert Wiener) and the behaviorism of B. F. Skinner, developed ideas about behavior and agency that their proponents believed no one had dared think before, when in fact those very ideas had been widely debated for the previous four hundred years. Such thinkers were either, like Skinner, proudly ignorant of what had gone before them or, like Wiener and in a different way Richard Dawkins, reliant on a history that is effectively, as Riskin puts it, “upside-down.” (Riskin has a fascinating section on the concept of “robot” that sheds much light on the Dawkins claim about human robots quoted above.)

4.

At both the beginning and end of her book, Riskin mentions a conversation with a friend of hers, a biologist, who agreed “that biologists continually attribute agency — intentions, desires, will — to the objects they study (for example cells, molecules),” but denied that this kind of language signifies anything in particular: it “was only a manner of speaking, a kind of placeholder that biologists use to stand in for explanations they can’t yet give.” When I read this anecdote, I was immediately reminded that Richard Dawkins says the same in a footnote to the 30th anniversary edition of The Selfish Gene:

This strategic way of talking about an animal or plant, or a gene, as if it were consciously working out how best to increase its success … has become commonplace among working biologists. It is a language of convenience which is harmless unless it happens to fall into the hands of those ill-equipped to understand it.

Riskin, who misses very little that is relevant to her vast argument, cites this very passage. The question which Dawkins waves aside so contemptuously, but which Riskin rightly believes vital, is this: Why do biologists find that particular language so convenient? Why is it so easy for them to fall into the “merely linguistic” attribution of agency even when they so strenuously refuse agency to the biological world? We might also return to the movements I mention at the outset of this review and ask why the ideas of OOO and ANT seem so evidently right to many people, and so evidently wrong to others.

The body of powerful, influential, but now utterly neglected ideas explored in The Restless Clock might illuminate for many scientists and many philosophers a few of their pervasive blind spots. “By recognizing the historical roots of the almost unanimous conviction (in principle if not in practice) that science must not attribute any kind of agency to natural phenomena, and recuperating the historical presence of a competing tradition, we can measure the limits of current scientific discussion, and possibly, think beyond them.”

terror and history

This excellent post by my colleague Philip Jenkins reminds us of an earlier era — just 25 years ago! — when America was worried about right-wing terrorists. As I have often pointed out — see here and here — it’s not just the distant past we’ve forgotten, it’s the very recent past. And that forgetfulness makes it very difficult for us to come up with appropriate and proportionate responses to our current problems.

chronological snobbery

The novelist Hannah Beckerman was asked, “I’m an English lit postgraduate who’s slipped into a reading rut since my final exams – what are some good books to get me back into loving literature?” Here are the first publication dates of the books she recommended:

  • 2006
  • 2016
  • 2013
  • 2010
  • 2014
  • 2015
  • 2015
  • 1995
  • 1997
  • 2000
  • 2002
  • 2017
  • 1959–1994 (Paley’s stories)
  • 1937
  • 2017
  • 2019
  • 1988
  • 1926
  • 1989
  • 1999
  • 2015

Also, all of them are written in English and by people from England, Scotland, Ireland, and the USA. The two major temporal outliers (1937 and 1926) are both children’s books, and there are no adults-only (or -primarily) novels from before 2002.

Is it really likely that all the books that might be recommended to someone who wants to “get … back into loving literature” are from our culture, our language, our time? And that none of them are poems or plays?

a road not taken

Lately I have been reading some of the wartime letters of Dorothy Sayers — who, I have just learned, pronounced her name to rhyme with “stairs” — and have been constantly reminded of something that I wrote about a bit in my Year of Our Lord 1943: the complex network, centered of course in London, of Christians working outside of standard ecclesiastical channels to bring a vibrant Christian faith before the minds of the people of England in the midst of war. People like J. H. Oldham and Philip Mairet and, perhaps above all, James Welch of the BBC — who convinced Dorothy Sayers to write the radio plays that came to be called The Man Born to be King, recruited C. S. Lewis to give the broadcast talks that became Mere Christianity, and commissioned music from Ralph Vaughan Williams — ended up having an impact on the public face of English Christianity that was enormous but is now almost completely unknown.

At one point in researching my book I thought seriously about throwing out my plans and writing this story instead — but I couldn’t bear to let go of the fascinating interplay between ideas being articulated in England and their close siblings arising in the U.S., especially in New York City.

I can’t remember whether I’ve mentioned it here before — a quick search suggests not — but I have long dreamed of writing a book called Christian London: a history of the distinctive and often profoundly influential role that London has played in the history of Christianity. However, no one I have spoken to about this project — my agent, various editors, friends — has shared my enthusiasm. I might write it one day anyhow, and if I do, people like Oldham and Mairet and Welch will be major characters in one chapter.

the most literary decade

Popular radio shows featured literary critics talking about recent poetry and fiction. The New Yorker’s book-review editor hosted one of the most popular radio shows in America, and his anthology, Reading I’ve Liked, ranked seventh on the bestseller list for nonfiction in 1941. At the Democratic Primary Convention in 1948, F. O. Matthiessen, a professor of American literature, delivered a nominating speech for Henry Wallace, the late FDR’s vice president. Writers were celebrities. Literature was popular. The 1940s was the most intensely literary decade in American history, perhaps in world history. Books symbolized freedom.

Posters of 1942 quoted the president: “Books cannot be killed by fire. People die, but books never die. No man and no force can put thought in a concentration camp forever. No man and no force can take from the world the books that embody man’s eternal fight against tyranny. In this war, we know, books are weapons.” During the Blitz, Muriel Rukeyser recalled, “newspapers in America carried full-page advertisements for The Oxford Book of English Verse, announced as ‘all that is imperishable of England.’ ” For the first and only time in history, protecting books in war zones became an official aim of armed forces.

— George Hutchinson, Facing the Abyss: American Literature and Culture in the 1940s. Compare the argument I made in my essay “The Watchmen.”

the greatest of the Wedgwoods

CVW
National Portrait Gallery

Cicely Veronica Wedgwood (1910–1997) was the most distinguished of the Wedgwoods — with the possible exception of her great-great-great-grandfather Josiah. In the estimation of Anthony Grafton — himself one of the most distinguished historians of our time — she is “the greatest narrative historian of the twentieth century.” And that counts for a lot, in my book.

By the adjective “narrative” Grafton means to distinguish Wedgwood’s way of writing history from what has become the standard academic one, which is less concerned with telling a story than providing an analytical and often data-driven framework for understanding historical events and patterns. That model has become sufficiently dominant that most of us these days think of the writing of history as something very different than the writing of “literature,” but ’twas not always so. When Gibbon wrote his great Decline and Fall of the Roman Empire he was understood to be just as literary as Alexander Pope or Henry Fielding (probably more so than Fielding). In the next century Lord Macaulay’s History of England was every bit as literary as his Lays of Ancient Rome. George Steiner, writing decades ago, understood that Wedgwood was working in this tradition, and celebrates it even as he denounces what has replaced it:

The ambitions of scientific rigour and prophecy have seduced much historical writing from its veritable nature, which is art. Much of what passes for history at present is scarcely literate…. The illusion of science and the fashions of the academic tend to transform the young historian into a ferret gnawing at the minute fact or figure. He dwells in footnotes and writes monographs in as illiterate a style as possible to demonstrate the scientific bias of his craft. One of the few contemporary historians prepared to defend openly the poetic nature of all historical imagining is C. V. Wedgwood. She fully concedes that all style brings with it the possibility of distortion: ‘There is no literary style which may not at some point take away something from the ascertainable outline of truth, which it is the task of scholarship to excavate and re-establish.’ But where such excavation abandons style altogether, or harbours the illusion of impartial exactitude, it will light only on dust.

“The poetic nature of all historical imagining” — preach it, sir! But few historians today, even those rare birds who make an effort to tell a good story, can hold a candle to Wedgwood. Peter Ackroyd, for instance, who has shifted from being primarily a novelist to being primarily a historian, presumably because of his skills at narration, is a mechanical plodder in comparison to Wedgwood. (And he’s getting worse. His ongoing history of England is coma-inducing.)

Wedgwood didn’t describe what she did as more literary or artful than the work of academic historians, even though, of course, it is. Here’s how she understood her work:

The application of modern methods of research, together with modern knowledge and prejudice, can make the past merely the subject of our own analytical ingenuity or our own illusions. With scholarly precision we can build up theories as to why and how things happened which are convincing to us, which may even be true, but which those who live through the epoch would neither recognize nor accept. It is legitimate for the historian to pierce the surface and bring to light motives and influences not known at the time; but it is equally legitimate to accept the motives and explanations which satisfied contemporaries…. This book is not a defence of one side [in the English Civil War] or the other, not an economic analysis, not a social study; it is an attempt to understand how these men felt and why, in their own estimation, they acted as they did.

That from the Introduction to The King’s Peace, the first volume of her history of the fall of King Charles I. This interest in representing people and events by employing categories that they themselves would have recognized may be seen in this brilliant brief portrait of Charles:

He had never had the painful experience from which his father, as a young man, had learned so much; he had never confronted insolent opponents face-to-face and had the worst of the argument. No national danger had compelled him to go out among his people and share their perils. He was, at this time, not only the most formal but the most remote and sheltered of all European kings.

Less virtuous monarchs escaped from formality in the arms of lowborn mistresses, but for the chaste Charles, no Nell Gwynne, prattling cockney anecdotes, opened a window into the lives of his humbler subjects. Like many shy, meticulous men, he was fond of aphorisms, and would write in the margins of books, in a delicate, beautiful, deliberate script, such maxims as “Few great talkers are great doers” or “None but cowards are cruel.” He trusted more to such distilled and bottled essence of other men’s wisdom than to his own experience, which was, in truth, limited; his daily contact with the world was confined within the artificial circle of his Court and the hunting field. He was to say, much later, in tragic circumstances, that he knew as much law as any gentleman in England. It was true; but he had little conception of what the laws meant to those who lived under them.

That last sentence is simply devastating. (Whenever Wedgwood pauses in her narrative for a character sketch of one of her actors, you are always in for something superb.)

Wedgwood pauses from time to time in The King’s Peace, especially in its early pages, to sketch not personalities but social orders and patterns and beliefs, including the experiences of those very people whom Charles never knew. Here’s a sample:

In the wilds of Lochaber from time to time a green man could be seen, one that had been killed between daylight and starlight and belonged neither to earth nor heaven. Some thought these apparitions were only the unhappy dead but others thought them one of the many forms taken by the devil. The strong forces of nature, with the advent of Christianity, had become confused with the devil, and after the lapse of centuries witchcraft had become indivisibly compact of pagan and Christian beliefs. The devil, in many forms, bestrode the islands from end to end. Sometimes he was “a proper gentleman with a laced band,” as when he came to Elizabeth Clarke at Chelmsford; at other times you might know him, as Rebecca Jones of St. Osyth did, by his great glaring eyes. He was cold and sensual and rather mean: he offered Priscilla Collit of Dunwich only ten shillings for her immortal soul; she gave it to him and off he went without paying. Respectably dressed in “brown clothes and a little black hat,” he spoke in friendly terms to Margaret Duchill of Alloa in Scotland; “Maggie, will you be my servant?” he asked, and when she agreed he told her to call him John and gave her five shillings and powers of life and death over her neighbours.

I could quote paragraph after paragraph, page after page, of this kind of thing. And her ability to immerse you in a battle, or a court scene, or a back-room political argument, is unexcelled. Moreover, all this is based (as Grafton notes) on exceptionally thorough archival research, sometimes in multiple languages. I am writing a book that attempts to convince people that the past can be, and should be, a living world to us. That case would be much easier to make if there were more historians like C. V. Wedgwood.

She should be remembered as one of the best storytellers writing in English in the twentieth century. And yet her major works — with the sole exception of the early but masterful history of the Thirty Years War — are out of print. That is a travesty.

Editors! Publishers! Let’s redress this injustice. If you need someone to edit and/or write an introduction to a new edition of Wedgwood’s work, just call on me. I am here for you.

And readers! You can of course start with The Thirty Years War, which is as good as advertised. But I prefer her work on English history. For those, AbeBooks is your friend. If you’re not sure you want to commit to one of the big books, her third volume on the English Civil War, A Coffin for King Charles, works perfectly well as an independent story and comes in at around 250 pages. It is absolutely riveting.

this is your mind on presentism

As a person writing a book about the need to cultivate temporal bandwidth, I am so pleased when various prominent cultural outlets do advance publicity on my behalf. Consider for instance this piece in the New Yorker on the decline in the study of history:

“Yes, we have a responsibility to train for the world of employment, but are we educating for life, and without historical knowledge you are not ready for life,” Blight told me. As our political discourse is increasingly dominated by sources who care nothing for truth or credibility, we come closer and closer to the situation that Walter Lippmann warned about a century ago, in his seminal “Liberty and the News.” “Men who have lost their grip upon the relevant facts of their environment are the inevitable victims of agitation and propaganda. The quack, the charlatan, the jingo … can flourish only where the audience is deprived of independent access to information,” he wrote. A nation whose citizens have no knowledge of history is asking to be led by quacks, charlatans, and jingos. As he has proved ever since he rode to political prominence on the lie of Barack Obama’s birthplace, Trump is all three. And, without more history majors, we are doomed to repeat him.

I would give a big Amen to this but with one caveat: it’s not more history majors we need, it’s a more general, more widespread, acquaintance with history. Without that we are fully at the mercy of our now-habitual and increasingly tyrannical presentism.

Consider, as an exemplum, this Farhad Manjoo column, in which he deplores the “prison” of being referred to by gendered pronouns. Damon Linker’s response zeroes in on a key point:

But what is this freedom that Manjoo and so many others suddenly crave for themselves and their children? That’s more than a little mysterious. Slaves everywhere presumably know that they are unfree, even if they accept the legitimacy of the system and the master that keeps them enslaved. But what is this bondage we couldn’t even begin to perceive in 2009 that in under a decade has become a burden so onerous that it produces a demand for the overturning of well-settled rules and assumptions, some of which (“the gender binary”) go all the way back to the earliest origins of human civilization?

I think Linker could have, with equal appositeness, referred to 2014: If you got in a time machine and showed the Farhad Manjoo of 2014 a copy of his 2019 column, he almost certainly would not believe that he had written it. A stance that in 2014 was so uncontroversial that it didn’t rise to the level of consciousness — that it’s okay for us to refer to ourselves by gendered pronouns — is now the unmistakable sign of “a ubiquitous prison for the mind.” And yet so thoroughly is Manjoo immersed in the imperatives of the moment that he’s not even aware of the discontinuity. That is the real prison for the mind.

grass

James Madison

Last week when I was in Virginia I got to visit James Madison’s home Montpelier. Madison has long been my favorite of the Founders, but during my visit I realized that I had never read a complete biography of him. I have now remedied that by reading Richard Brookhiser’s concise and vigorous narrative, and I am moved to contemplate the extraordinary success that Madison had at guiding groups of politicians towards his preferred ends. Though he spent eight years as President and, before that, eight years as Jefferson’s Secretary of State, he belonged by temperament and character to the legislative rather than the executive branch. He was an unprepossessing figure, at just over five feet tall and a hundred pounds, and had a weak voice, but no greater committee man has ever lived. In a later era he would surely have been the greatest of American Senators.

It seems to me that there are three traits that, in combination, set Madison apart from his contemporaries and from almost every leading political figure before or since.

First, he simply worked harder than anyone else. When he was chosen a delegate to the Constitutional Convention he arrived several days early to scope out the area and make relevant connections; each day of the convention he arrived early, got a choice seat, and took incredibly extensive shorthand notes to document every single thing that happened in each of those meetings — and unlike many other delegates, who came and went, some of whom took lengthy vacations from the proceedings when the weather got hot, he was there every damned day. (He would even check with other delegates to make sure that he had taken down their words accurately. This gave him a well-earned reputation for scrupulousness, which he later made good use of: he would always quote with absolute faithfulness from his notes — but was also shrewdly selective in what he chose to share.)

Second, Madison made himself the best informed person at every meeting. Even people who hated Madison acknowledged that he always had more information at his disposal than anyone else. Long before the Convention began he wrote to Jefferson, who was in Paris, to ask him for books on government and political history. Jefferson sent two hundred volumes, which Madison devoted months to reading, annotating, and sifting. This was simply characteristic.

Third, he didn’t care who got credit. Madison was happy to let other people stand up to make noble speeches on behalf of some cause that he advocated, and to receive great applause — as long as he determined the content of those speeches.

These are all lessons worth learning, it seems to me.

Finally, I was taken by this passage from the end of Brookhiser’s biography:

Madison lies in the family cemetery, a five-minute walk from the front door of Montpelier; the graveyard was more convenient to the original house on the property, which the Madisons vacated when he was a boy. His grave is in a corner of the plot, marked by an obelisk; the shaft surmounts a blocky base, simply inscribed MADISON, along with his dates.

When it was first shown to me, I learned that the stone was not contemporary with his burial, but had been put up in 1857, twenty-one years later. What was his original marker, I asked. There was none, I was told; your marker was your family plot. Your dead relatives indicated who you were, and your living ones would remember where you were.

futurists and historians

Martin E. P. Seligman and John Tierney:

What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future. Our singular foresight created civilization and sustains society. It usually lifts our spirits, but it’s also the source of most depression and anxiety, whether we’re evaluating our own lives or worrying about the nation. Other animals have springtime rituals for educating the young, but only we subject them to “commencement” speeches grandly informing them that today is the first day of the rest of their lives.

A more apt name for our species would be Homo prospectus, because we thrive by considering our prospects. The power of prospection is what makes us wise. Looking into the future, consciously and unconsciously, is a central function of our large brain, as psychologists and neuroscientists have discovered — rather belatedly, because for the past century most researchers have assumed that we’re prisoners of the past and the present.

I wonder what evidence exists for the claim that “What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future.” What if we are more clever and resourceful readers of the past than other species? What if it was our singular power of retrospection that “created civilization and sustains society”? After all, while it’s true that we Homo sapiens alone give commencement speeches, it’s also true that we Homo sapiens alone build things like the Lincoln Memorial and inter our distinguished dead in places like Westminster Abbey. Why should the former count for more than the latter?

In the preface to his translation of Thucydides (1629), Thomas Hobbes wrote that “the principal and proper work of history [is] to instruct and enable men, by the knowledge of actions past, to bear themselves prudently in the present and providently towards the future.” That is to say, validity of prospection depends upon accuracy of retrospection. Those who do not understand the past will not prepare themselves well for the future. Even if they read fizzy opinion pieces in the New York Times.

Roberts’s Churchill

All my adult life I have had a strong appetite for books about Winston Churchill. It began, I suppose, when I read the first volume of William Manchester’s biography, which was frankly hagiographical but vividly told. I have read much since then, including much work highly critical of the man. I can readily understand people disliking or even hating Churchill; I could never understand someone who doesn’t find him fascinating. So, unsurprisingly, I have very much looked forward to the new biography by Andrew Roberts.

I haven’t finished it yet — so much else to do right now — but I think it’s fair to say that like Manchester’s book it is a very strong narrative, and like Manchester’s, it is largely hagiographical. I have been particularly struck by Roberts’s approach to Churchill’s eventful experience in the Great War: his chief principle seems to be that Churchill’s judgment may be questioned or even condemned, but that his character must be vindicated at all costs. Roberts accepts ungrudgingly Churchill’s many mistakes in advocating for and overseeing the Gallipoli campaign, but he will not countenance the suggestion that Churchill fell into those mistakes because of his character flaws. Personality flaws, perhaps — impetuousness, for instance — but not flaws of moral character.

Consider this passage:

Lloyd George needed no persuasion to throw over his old friend and ally. The price of the Conservatives joining a national government was that Churchill should be sent to a sinecure post with no executive portfolio attached. ‘It is the Nemesis of the man who has fought for this war for years,’ Lloyd George told Frances Stevenson that day. ‘When the war came he saw in it the chance of glory for himself, and has accordingly entered on a risky campaign without caring a straw for the misery and hardship it would bring to thousands, in the hope that he would be the outstanding man in this war.’ There was bitterness and jealousy in that remark, but little factual accuracy.

But Roberts quotes Churchill on many occasions not only expressing excitement for the war, but demonstrating constant awareness of how his performance in it could pave the way for him to become Prime Minister. He told Violet Asquith at a dinner party in early 1915, “I know a curse should rest on me – because I love this war. I know it’s smashing and shattering the lives of thousands every moment – and yet – I can’t help it – I enjoy every second of it.” A man who could not only think this but say it out loud is a man who might legitimately be suspected of pursuing “a risky campaign without caring a straw for the misery and hardship it would bring to thousands.” He said himself that that suffering didn’t disrupt his delight in the war even for a second.

Similarly, Roberts quotes the Prince of Wales — the future King Edward VIII and, after his abdication, Duke of Windsor — saying, “It is a great relief to know that Winston is leaving the Admiralty … one does feel that he launches the country on mad ventures which are fearfully expensive both as regards men and munitions and which don’t attain their object.” To which Roberts comments, “It was for this feckless young man that Churchill would later nearly sacrifice his career.” No doubt the Prince was feckless, and would become still more so, but to describe Gallipoli as a “fearfully expensive” venture “both as regards men and munitions” that did not attain its object is to state the simple, inescapable truth.

One more example of Roberts’s habitual attitude: when the government of Prime Minister Herbert Asquith was on the verge of collapse, and some kind of coalition had to be formed, the other parties to the coalition made it clear that they would not participate unless Churchill were sacked as First Lord of the Admiralty. Churchill wrote to Asquith to plead that he be kept on anyway, to which Asquith replied, “You must take it as settled that you are not to remain at the Admiralty … I hope to retain your services as a member of the new Cabinet, being, as I am, sincerely grateful for the splendid work you have done both before and since the war.” Roberts’s comment on that letter: “Asquith could treat Churchill harshly partly because the First Lord had so few supporters.” Treat him harshly? In no circumstances could Asquith have kept Churchill on — that had been made perfectly clear to him — so he refused to hold out any false hope, while nevertheless offering to keep Churchill in his Cabinet, which the other members of the coalition did not want. Asquith could do that much for Churchill only by spending some valuable political capital, and it would not have been “harsh” had Asquith banished Churchill from the government altogether.

Churchill’s attitude towards the loss of his position? “My conviction that the greatest of my work is still to be done is strong within me: and I ride reposefully along the gale. The hour of Asquith’s punishment and K[itchener]’s exposure draws nearer. The wretched men have nearly wrecked our chances. It may fall to me to strike the blow. I shall do it without compunction.”

Again, Roberts admits that Churchill makes mistakes; but he prefers to identify those mistakes himself, and when he quotes any criticism of Churchill from contemporaries, he tends to (a) dismiss it and (b) indicate his low opinion of the character of the critic. And in all this he faithfully reflects Churchill’s own interpretation of his role in the Great War.

As for me, I am inclined to agree with the journalist A. G. Gardiner, who wrote that Churchill “is always unconsciously playing a part – an heroic part. And he is himself his most astonished spectator.”

Plutarch and the end of the oracles

In my History of Disenchantment class, we’ve been discussing Plutarch’s essay on the cessation or the silence or the failure of the oracles. (The key word there, ekleloipoton, seems to be an odd one — I’m trying to learn more about it.) I read it long ago, but this is the first time I’ve taught it, and goodness, what a fascinating piece of work.

It was widely recognized in Plutarch’s time (late first and early second century A.D.) that the great oracles of the ancient world — the most famous of them being the one at Delphi, of course — had largely ceased to provide useful guidance or had fallen silent altogether. Some of the once famous shrines had been abandoned and had fallen into ruin. But no one understood why this had happened. Plutarch’s “essay” is a fictional dialogue — narrated by one Lamprias, who also takes the leading role in the conversation and may well be Plutarch’s mouthpiece — in which a group of philosophically-inclined men debate the possible reasons for the oracles’ failure.

In the opening pages of the dialogue, some of the participants deny that there is any real problem. They point to the inaccessibility of some of the shrines, and the lack of population in the surrounding areas: for them the issue is merely one of low demand leading to low supply. But this view is not widely accepted; most of the philosophers are uneasy about the oracles and feel that something is up. And after all, if low demand leads to low supply, why is there low demand? Even if the oracles are located in remote places, surely people would take the trouble to make a pilgrimage there if they believed that by doing so they could receive wise guidance for their lives.

To one of the participants, the answer to the whole problem is obvious: The gods are angry at us for our wickedness and have punitively withdrawn their guidance. But, perhaps surprisingly, no one finds this a compelling explanation. For one thing, it’s not clear to them there was any less wickedness in the earlier eras when the oracles flourished; and more to the point, are oracles given by the gods in the first place?

If they are, then shouldn’t they continue forever, since the gods themselves are immortal, unless they are specifically withdrawn? Not necessarily, says one: “the gods do not die, but their gifts do” — a line he says is from Sophocles, though I don’t know its source. Maybe the oracles lived their natural course and have now fallen silent, as one day we all will.

But what if oracles do not come from the gods, but rather from daimons? In that case the oracles might die because the daimons do. This leads to a long discussion about whether daimons are mortal, and if mortal or not whether they are necessarily good. (The one truly famous passage from this essay — in which someone recounts a story about a sailor instructed to pass by an island and cry out “The great god Pan is dead” — assumes that Pan was not a god but rather a daimon, the son of Hermes by Odysseus’s famously loyal wife Penelope.)

And this in turn leads to a very long conversation about the beings that populate the world and whether there might be other worlds populated differently and, now that we think about it, how many worlds are there anyway? (The most popular answer among the discussants: 183.) As I told my students, this is by modern standards a bizarre digression, especially since it takes up about half the dialogue, but our standards were not those of Plutarch’s time; and in any case the discussants might plausibly say that we can’t come up with a reliable solution to the puzzle of the silenced oracles unless we have a good general understanding of the kind of cosmos we live in.

In any event, the discussion eventually circles back around to the initial question, and in the final pages Lamprias gets the chance to develop the argument that he has been hinting at all along. In brief, he contends that oracles are always situated in or near caves because from those caves issue “exhalations of the earth”; and that certain people with natural gifts and excellent training of those gifts may be sensitized to the character of those exhalations, and in that way come to some intuitive and not-easily-verbalized awareness of what the world has in store for people. It’s almost a Gaia hypothesis, this idea that the world as a whole acts in certain fixed ways, and those “exhalations” attest to the more general movements of the planet. But these processes are, like all processes in Nature, subject to change over time. As a spring might dry up, or a river after flooding alter its course, so too the conditions for such exhalations might change so that there is nothing for even the most exquisitely sensitive and perfectly trained priestess to respond to.

The first and overwhelming response to Lamprias’s explanation is: Impiety! One of the interlocutors comments that first we rejected the gods in favor of daimons, and now we’re rejecting daimons in favor of a purely natural process. That is, Lamprias’s position is fundamentally disenchanting. To this Lamprias replies that his position is not impious at all, because they had all agreed earlier that in addition to humans and daimons and gods, none of whom create anything, we also have, above and beyond all, The God, “the Lord and Father of All,” and He is the first cause of all things, including exhalations of the earth and priestesses.

But whether it’s impious or not, Lamprias’s account is disenchanting, because it removes power from spirits and gods and concentrates it in a single transcendent Monad. His monotheism is a big step towards the religion of Israel, which tells us in the very first words of its Scriptures that the sun and moon and stars are not deities at all, but rather things made by YHWH, who alone merits our worship. Lamprias’s position, like that of the Jews, looks, to those accustomed to polytheism, like a kind of atheism. And by their standards that’s just what it is.

how Steve Reich discovered his own Judaism

I was brought up a secular, Reform Jew, which means I didn’t know Aleph from Bet. I knew nothing, and therefore I cared nothing. My father cared culturally, but that’s all. So when I came home from Africa, I thought to myself, there’s this incredible oral tradition in Ghana, passed on from father to son, mother to daughter, for thousands of years. Don’t I have something like that? I’m a member of the oldest group of human beings still known as a group that managed to cohere enough to survive – and I know nothing about it. So I started studying at Lincoln Square Synagogue in midtown Manhattan, an Orthodox temple, that had an incredible adult-education program for the likes of me – and I asked whether they would teach a course in biblical Hebrew, and they said sure, and they brought a professor down from Yeshiva University to teach that, and I studied the weekly portion – I didn’t even know there was such a thing as a weekly portion and commentaries thereon.

So this whole world opened up for me – it was 1975, at about the same time as I met my wife, Beryl, and so all of this sort of came together and it did occur to me – isn’t it curious that I had to go to Ghana to go back to my own traditions because I think if you understand any historical group, or any other religion for that matter, in any detail, then you’ll be able to approach another one with more understanding. So the answer to your question is yes. The longest yes you’ve ever heard.

Steve Reich
