individualism

For the past hundred years or so, we have had a vast multifarious culture industry devoted to the critique of individualism. Individualism, we have been told over and over again, is the acid bath in which all previous forms of social attachment have been dissolved.

Take — as just one example among a thousand possible ones — Allan Bloom’s famous diatribe The Closing of the American Mind (1987): 

Tocqueville describes the tip of the iceberg of advanced egalitarianism when he discusses the difficulty that a man without family lands, or a family tradition for whose continuation he is responsible, will have in avoiding individualism and seeing himself as an integral part of a past and a future, rather than as an anonymous atom in a merely changing continuum. The modern economic principle that private vice makes public virtue has penetrated all aspects of daily life in such a way that there seems to be no reason to be a conscious part of civic existence.

And:

They know the truth of Tocqueville’s dictum that “in democratic societies, each citizen is habitually busy with the contemplation of a very petty object, which is himself,” a contemplation now intensified by a greater indifference to the past and the loss of a national view of the future. The only common project engaging the youthful imagination is the exploration of space, which everyone knows to be empty. The resulting inevitable individualism, endemic to our regime, has been reinforced by another unintended and unexpected development, the decline of the family, which was the intermediary between individual and society, providing quasi-natural attachments beyond the individual, that gave men and women unqualified concern for at least some others and created an entirely different relation to society from that which the isolated individual has. 

And:

The question is whether reasonings really take the place of instincts, whether arguments about the value of tradition or roots can substitute for immediate passions, whether this whole interpretation is not just a reaction unequal to the task of stemming a tide of egalitarian, calculating individualism, which the critics themselves share, and the privileges of which they would be loath to renounce.

Bloom’s diagnosis describes a now-vanished world. Has any dominant social ethos, any regnant model of the self-in-the-world, ever died as quickly as individualism has? For indeed it is gone with the wind. 

The self as monad — the self “lost in the cosmos,” as Walker Percy described it — was blown far away by the derecho of social media. What remained was not, Lord knows, an identification with humanity or even with a body of belief, but rather merger with some amorphous body of people with the same sexual orientation, or gender identity, or race, or ethnicity, or designation as a “deplorable.” Think of how many sentences now begin with “As a [fill in the blank]” — sentences spoken and written by people who do not know how to express ideas of their own but only to begin by attaching themselves to a group and claiming the authority they perceive as intrinsic to that group. 

In this environment, John Danaher and Steve Petersen’s forthcoming essay “In Defence of the Hivemind Society” was inevitable. It’s an articulation in philosophical form of what has already become a felt reality, the dominant felt reality for at least a billion people.  

The question, for me, is whether this increasingly widespread abandonment of individualism in favor of group identities can be leveraged to argue on behalf of the kinds of group identities that individualism discarded, especially the ties of family and membership in religious communities. I have my doubts, but I can’t think of anything more essential for those of a conservative disposition or of Christian faith to think about. 

So long, individualism. It wasn’t all that nice knowing you. 

folly

Folly is a more dangerous enemy to the good than evil. One can protest against evil; it can be unmasked and, if need be, prevented by force. Evil always carries the seeds of its own destruction, as it makes people, at the least, uncomfortable. Against folly we have no defence. Neither protests nor force can touch it; reasoning is no use; facts that contradict personal prejudices can simply be disbelieved; indeed, the fool can counter by criticizing them, and if they are undeniable, they can just be pushed aside as trivial exceptions. So the fool, as distinct from the scoundrel, is completely self-satisfied; in fact, he can easily become dangerous, as it does not take much to make him aggressive. A fool must therefore be treated more cautiously than a scoundrel; we shall never again try to convince a fool by reason, for it is both useless and dangerous.

If we are to deal adequately with folly, we must try to understand its nature. This much is certain, that it is a moral rather than an intellectual defect. There are people who are mentally agile but foolish, and people who are mentally slow but very far from foolish — a discovery that we make to our surprise as a result of particular situations. We thus get the impression that folly is likely to be, not a congenital defect, but one that is acquired in certain circumstances where people make fools of themselves or allow others to make fools of them. We notice further that this defect is less common in the unsociable and solitary than in individuals or groups that are inclined or condemned to sociability. It seems, then, that folly is a sociological rather than a psychological problem, and that it is a special form of the operation of historical circumstances on people, a psychological by-product of definite external factors. If we look more closely, we see that any violent display of power, whether political or religious, produces an outburst of folly in a large part of mankind; indeed, this seems actually to be a psychological and sociological law: the power of some needs the folly of the others.

— Dietrich Bonhoeffer, “After Ten Years” (1943)

René Girard, please call your office

About Alice Flaherty:

Years later, she consulted on a pilot for a television drama based on her life: doctor develops mania after personal catastrophe. Although the show never got off the ground, the experience became a roomful of mirrors and mirror neurons, for who better to teach empathy to a doctor than an actor?

“The actress playing me was trying to pick up my mannerisms. At the same time,” she said, recalling professional lessons learned, “I was trying to pick up hers, because she was much more convincing than I am — she had a little smile that was triumphant, but also just so happy for the patient. I was imitating her imitating me.”

the value of emotional resilience

“Trigger Warning: Empirical Evidence Ahead”:

Participants in the trigger warning group believed themselves and people in general to be more emotionally vulnerable if they were to experience trauma. Participants receiving warnings reported greater anxiety in response to reading potentially distressing passages, but only if they believed that words can cause harm. Warnings did not affect participants’ implicit self-identification as vulnerable, or subsequent anxiety response to less distressing content….

Trigger warnings may inadvertently undermine some aspects of emotional resilience. Further research is needed on the generalizability of our findings, especially to collegiate populations and to those with trauma histories.

Right — but what if you don’t think that being emotionally resilient is desirable? What if emotional resilience is perceived as a failure to feel pain with sufficient intensity?

against humanism

The language that has been developed over the centuries for talking about the mental and spiritual side of life is not some feeble, amateurish “folk-psychology”. It is a highly sophisticated toolbox adapted for just that difficult purpose. The vocabularies of the sciences are also well adapted for their own purposes, but this means they cannot be used anywhere else. Physical truths can only be answers to physical questions. Indeed, the great achievement of Galileo, Newton and their friends consisted in narrowing the scope of those sciences to concentrate them on topics where their methods were wholly suitable. People who are now led by that success to treat them as a panacea for other kinds of problem are being naive.

The moral of all this is, I think, that Hitchens is simply wrong. The poison does not come from religion itself but from political misuses of it. The kinds of idea that we class as religious actually range from the excellent to the awful, from the poisonous to the most nourishing. But there is a general tendency for new imaginative ways of understanding life to emerge from religious thinking – that is, from thoughts which go beyond current human horizons. This is bound to happen simply because they have quite new kinds of truth to convey. Thus the Greeks, when they came to grasp the idea that the earth as a whole bountifully supplied all their needs, worshipped it under the name of Gaia, mother of gods and men. And thus Pythagoras, when he discovered a new mathematical order pervading phenomena from the heavens to the laws of sound, naturally conceived that order as something greater than humanity; something therefore that should be deeply venerated.

In this way many of the moral insights we value highly today – for instance, the coherence of the cosmos and the value of the individual soul, as well as the conviction that All is Number – have originally been shaped in religious contexts. If we decide to drop those contexts as obsolete we lose half the meaning of the ideas themselves.