NOTE: Since the publication of my book The Pleasures of Reading in an Age of Distraction, I have had a good many questions about my own habits of reading, my own history as a reader. I didn’t say a lot about these matters in the book, though at times in the writing I considered it. Instead I decided to tell my story in a long essay that I first offered as a Kindle document and am now (somewhat belatedly) posting here. This account goes a long way, I think, towards explaining why I think about reading the way I do — and why I mistrust the attitudes towards reading taken by many of my fellow English professors. And one more thing: Those of you who have read more than a bit of my work know that I am deeply interested in theology. But theology is absent from this essay, as it was from The Pleasures of Reading. One day I think I will write an account of the role I see theology playing in the issues I have raised here and in my book. But today is not that day.

I grew up in a house filled with books, but by the standards of the cultured they weren’t good books. Also, we had no bookshelves, which meant that you couldn’t set a glass on a table without first moving a cheap paperback of some kind, or, more likely, several of them: I spent much of my childhood making neat stacks of Erle Stanley Gardner, Robert A. Heinlein, Ellery Queen, Grace Livingston Hill, Barbara Cartland, Louis L’Amour, Max Brand, Rex Stout. My father read the Westerns and science fiction, my mother the romances, my grandmother the mysteries. I disdained love stories, but the rest I consumed.

We were perhaps an atypical bunch of readers, not normal even among readers of schlock. I don’t believe any of my grandparents graduated from high school. Both of my parents took equivalency exams instead of earning high school diplomas: my father because he lied about his age and enrolled in the Navy in 1943, when he was sixteen, and stayed in the service for eleven years, during which he found time to take tests by mail; my mother because she was the brightest girl in the little country town in northern Alabama she grew up in, and so was waved through all the grades available by age fifteen. My younger sister ended up earning her high-school diploma but never considered going further. Everyone in my family except me is educated far beneath their intelligence — but this, I must admit, is not because they confronted roadblocks to their ambition. None of my family ever thought education especially valuable, and when it eventually came time for me to tell my parents that I wanted to go to college they just looked at me blankly. My interest was, to them, an incomprehensible eccentricity.

I might also add that in our house the television was on all the time — and I mean quite literally all the time, at least after my father returned from several years in prison. That happened when I was ten or eleven, and I soon learned that his strictest rule was that the TV stayed on. He didn’t switch it off when he went to bed — he was always up later than anyone else because he worked a late shift as a dispatcher for a trucking company — and I distinctly remember our leaving the TV on when we took our invariably brief vacations to the Florida panhandle. I think I worried that the TV would be lonely while we were gone, having only itself to talk to.

Yet no one in our household ever watched the television. We had no favorite shows, and the only program I can remember anyone paying attention to was the Huntley-Brinkley Report on NBC: when it came on — announcing its arrival with the opening of the second movement of Beethoven’s Ninth Symphony — someone would turn up the volume, and then turn it back down again when the news was done. The rest of the time everyone just read. (Everyone except my sister, who probably didn’t read a book all the way through until she was in her forties and has since become a voracious reader.) When I close my eyes and try to remember the house I grew up in, what arises in my mind is an image of several people sitting on shabby old chairs and sofas in the salmon-pink living room, all absorbed in their books, with a cathode-ray tube in a wooden box humming and glowing from a corner.

The motley collection of mass-market paperbacks that littered our house — plus one other I will mention — dominated my childhood. I learned to read early, via the ministrations of Dr. Seuss, and then went straight to the pulp fiction, never encountering any of the classics of children’s literature until I had a child of my own. (Wes and I discovered together The Very Hungry Caterpillar, The Wind in the Willows, Charlotte’s Web; all of them were new to me.) When I was six my favorite writer was Heinlein; not the more ambitious think-books of his later career, like Stranger in a Strange Land, which I picked up and quickly set aside in puzzlement, but rather the space-age boy’s-own-paper potboilers like Tunnel in the Sky and Starship Troopers. For some years Tunnel in the Sky — an interplanetary survivalist tale, a futuristic steroidal Boy Scouts’ Manual — was my favorite story in the whole world.

I cannot be sure, at this late date, whether my interest in science fiction was a cause or a result of my other chief fascination, astronomy. I do remember that my most cherished possession was The Golden Book of Astronomy, in part, I think, because the clerk in the book department of Pizitz in downtown Birmingham, where my grandmother bought it for me, was reluctant to sell it to us, believing that I was too young to read it.  My grandmother handed me the book and ordered me to read aloud from it, which I did, eliciting widened eyes and lifted brows from the clerk, which received in turn a glare of triumph from Grandma. So I must have been quite young, which makes me think that this memorable event may have occurred before I even discovered Heinlein.

At any rate, from that time until my freshman year in college, my answer to anyone who inquired what I wanted to be when I grew up was invariant: I wanted to be an astronomer. It was only when I began to explore calculus that I realized what a grave error God had made in setting the heavenly bodies into motion, and irregular and unpredictable motion at that: no circular orbits, only variously elongated ellipses along which planets moved at different rates, spinning at different speeds. There was no thought then of demoting Pluto from the ranks of planets, but how odd to learn that it isn’t always farther from the sun than Neptune, just usually; and odder still to discover that Neptune’s big moon Triton stubbornly proceeds in its retrograde career, opposite the planet’s own rotation and the rotation of its other moons. Weird.

All this made it difficult, and indeed beyond my mathematical capacities, to plot and grasp the motions of the heavenly bodies. My vision of an astronomical career suddenly and permanently collapsed into itself, leaving a black hole in the galaxy of my mind.

What remained was a pleasure in science fiction — and an increasing interest in good writing about science. I discovered Carl Sagan, who was witty and vivid; but far, far more important, through the epigraphs to some of Sagan’s chapters I discovered Loren Eiseley. From age fourteen to eighteen, perhaps, Eiseley was my favorite writer. I read the essays in The Night Country again and again, and thanks to Eiseley I came early in my apprenticeship as a writer to see the essayist’s craft as a worthwhile one — indeed, the only one that fit my mind, that gave me a real chance to succeed. (I wrote fiction throughout high school but always knew that I was no good at it.)

I believe that the first essay by Eiseley I read was “Barbed Wire and Brown Skulls,” which explains why Eiseley (a paleontologist and archeologist by profession) kept skulls in his office. It ends with these words:

Generally I can’t refuse skulls that are offered to me. It is not that I am morbid, or a true collector, or that I need many of them in my work. It is just that in most cases, people being what they are, I know the skulls are safer with me. Call it a kind of respect for the bones, ingrained through long habit. That, I guess, is the reason I keep those two locked in the filing cabinet — they are delicate, and not in a position to defend themselves. So I look out for them. I’d do as much for you.

When I read those words for the first time, I barked a laugh and felt a chill. I knew immediately that this was the stuff for me.

The style of that essay is more straightforward, and its tone less melancholy, than is typical of Eiseley. His high style — which I first discovered in The Firmament of Time, his account of how geologists exploded our sense of the scope of history, and above all in The Immense Journey — put me off at first but eventually transfixed me.

Once in a lifetime, perhaps, one escapes the actual confines of the flesh. Once in a lifetime, if one is lucky, one so merges with sunlight and air and running water that whole eons, the eons that mountains and deserts know, might pass in a single afternoon without discomfort. The mind has sunk away into its beginnings among old roots and the obscure tricklings and movings that stir inanimate things. Like the charmed fairy circle into which a man once stepped, and upon emergence learned that a whole century had passed in a single night, one can never quite define this secret; but it has something to do, I am sure, with common water. Its substance reaches everywhere; it touches the past and prepares the future; it moves under the poles and wanders thinly in the heights of air. It can assume forms of exquisite perfection in a snowflake, or strip the living to a single shining bone cast up by the sea.

I did not see then that the style could at times be overwrought, that Eiseley occasionally pushed against the boundaries of what (modern) good taste allows. Later, with exposure to more restrained stylists, I began to suspect that my love of Eiseley’s prose was misplaced — but then, in one of my last college English classes, I discovered the baroque extravaganza that is the prose of Sir Thomas Browne, and immediately saw that this was Eiseley’s great model, and that in comparison to Browne Eiseley was a model of restraint.

Oblivion is not to be hired: The greater part must be content to be as though they had not been, to be found in the Register of God, not in the record of man. Twenty-seven Names make up the first story, and the recorded names ever since contain not one living Century. The number of the dead long exceedeth all that shall live. The night of time far surpasseth the day, and who knows when was the Æquinox? Every house addes unto that current Arithmetique, which scarce stands one moment. And since death must be the Lucina of life, and even Pagans could doubt whether thus to live, were to dye. Since our longest Sunne sets at right descensions, and makes but winter arches, and therefore it cannot be long before we lie down in darknesse, and have our lights in ashes. Since the brother of death daily haunts us with dying memento’s, and time that grows old it self, bids us hope no long duration: Diuturnity is a dream and folly of expectation.

I chanted passages like this one to myself over and over, under my breath, smiling. (I still have the battered Penguin paperback of Browne’s selected works that I bought in 1979; more lines in it are underlined than left unmarked.) It was okay for Browne to be so outrageous; he lived long ago, when people didn’t know any better. But from my first encounter with Browne I saw that he — a doctor and naturalist: the book from which I quote begins with the description of some burial urns discovered near Browne’s home in Norfolk in the 1650s — was Eiseley’s master. How Browne wrote is how Eiseley would have written had he lived in more propitious times.

As I consider it today, the discovery of Eiseley feels like the most important one of my adolescent reading. But when I first read him, I had not yet been forced to confront my mathematical limitations, so my chief interests continued to be scientific; Eiseley’s work mattered to me because it made the natural world, and the scientific discovery of that world, dynamic and vigorous in my mind’s eye. I knew nothing of what people call literature or belles lettres, so I did not realize that I was also getting an education in prose style — in the very idea that to write beautifully was a meaningful goal, an achievement worthy of serious pursuit for its own sake. I did not realize that Eiseley was preparing me to appreciate Browne: few if any of my classmates shared my enthusiasm for Religio Medici and Hydriotaphia, or Urn Burial. Nor did I realize that he was preparing me for William Faulkner.

When I was sixteen and about to graduate from high school — I finished a little younger than most people, thanks largely to my head start in youthful reading — I learned that a new shopping mall was about to be built just a couple of miles from my house in the East Lake neighborhood of Birmingham, Alabama. The mall was to be called Century Plaza, and one of its stores was to be B. Dalton, Bookseller. How romantic that sounded to me. When I saw a notice in the classifieds of the Birmingham News that the bookstore’s manager was looking for staff, I interviewed, I was hired, and in the summer between high school and college I started selling books.

One of my co-workers was a man about a decade older than me named Michael Swindle. Michael and I ended up becoming good friends, and I learned a great deal from him, some of it edifying, some not so much; most of the lessons he taught me involved legal activities. But those are stories for another day. Right now I just want to emphasize one debt I owe to Michael. Michael understood my reading preferences, and politely declined when I tried to push really cool SF novels on him; he took his time, listening more than he talked, and then one day he held out a book and told me I ought to try it. It was called The Unvanquished, and it was written by William Faulkner.

It wasn’t until some years later, and much fuller acquaintance with Faulkner’s work, that I understood the shrewdness of Michael’s choice. He didn’t throw me into the deep end of the pool, with The Sound and the Fury or even As I Lay Dying; instead, he gave me one of Faulkner’s more accessible books, a series of linked stories written in relatively straightforward prose, culminating in that small masterpiece, “An Odor of Verbena.”

I was, pretty much immediately, transfixed. I am not sure now how aware I was of the stylistic kinship with Eiseley: they aren’t literary brothers, to be sure, but there’s a real consanguinity, and Browne is the ancestor of both. Anyone who likes how any two of those men write is probably going to like the third as well. What I do remember very clearly is being stunned — and “stunned” is not too strong a word — that the South that I had grown up in and that I had always thought of as a dingy, shabby place could be taken so seriously, written about with such dignity, considered as worthy of the closest attention and most finely-wrought rhetoric. Faulkner was quite an eye-opener.

This first encounter with him — which led eventually, though over a period of years, to my devouring almost his whole body of work — happened probably in my sophomore year of college, at the University of Alabama at Birmingham. (Given my parents’ indifference, I had to pay for my own schooling and didn’t know that student loans existed, so during school terms I worked 24 hours a week in the bookstore, and full-time during breaks. I asked for financial help once: my father said that I should just be grateful that they were letting me live in their house. I was seventeen at the time.) In other words, my discovery of Faulkner was more or less simultaneous with the realization that I wasn’t going to be an astronomer. For the first time in my life I seriously considered the possibility that literature was going to be my thing.

It did become my thing. I transferred to what we thought of as the University of Alabama, the one in Tuscaloosa, largely because it had a better English department. I majored in both English and history, and at some point decided — what considerations went into the decision I no longer remember — that I wanted to go to graduate school to study more literature. So I attended the University of Virginia. I developed a historical sense — my love for Browne’s prose led me to spend most of my time in the seventeenth century, until a relatively late encounter with the poetry of W. H. Auden made a modernist of me — amassed a repertoire of critical gestures, learned to invoke the names and terms of High Theory in the proper ways and at the proper times. I was initiated into the academic guild; I became a professor.

It wasn’t always easy, of course. In my last weeks as an undergraduate one of my professors had taken me aside and whispered to me the sacred names of Barthes and Derrida, and told me I should make fuller acquaintance with them. I dutifully wrote down the names and immediately forgot about them. Since none of this Theory stuff had previously been mentioned to me in my undergraduate career, how important could it be? So when I plunged into my first graduate classes — including a theoretical survey in which we read Marx, Nietzsche, Freud, Jung, Gramsci, Georg Lukács, Horkheimer and Adorno, Husserl, Heidegger, Ricoeur, Jakobson, Althusser, Brooks, Frye, de Beauvoir, Kenneth Burke, and, yes, Barthes and Derrida, among others — I was immediately transformed from a confident critic-in-the-making to a lost lamb, baahing petulantly.

Ten weeks or so into my first semester I decided that I just couldn’t cut it and needed to drop out. But I was a newlywed, and had carried my bride hundreds of miles from her family, set her down in a strange town, and effectively forced her to hunt for comparatively menial jobs, all to support this great academic endeavor of mine. I couldn’t bring myself to tell her how miserable and incompetent and just plain lost I was.

Our apartment in Charlottesville had a small windowless room that I used for a study. One evening after dinner I went in and closed the door and tried to sort through the vast pile of photocopied theoretical essays I had bought at Kinko’s on the first day of class. (We could violate copyright in those days, too.) But it was useless. I could scarcely bear even to look at the stuff. My professor had copied from his own well-used books, and every essay was full of confident underlinings and annotations that seemed by their very presence to judge me and find me wanting. I couldn’t bring myself to read another word.

My eyes wandered to a nearby bookshelf, and were caught for a moment by the glint of a gold cardboard box: it contained the three volumes of the Ballantine mass-market version of The Lord of the Rings, plus The Hobbit. I had never read Tolkien: I was a science-fiction guy, not a fantasy guy. But of course I knew that The Trilogy (as I thought of it) was important, and that someday I ought to get to it. Almost thoughtlessly, I picked up The Fellowship of the Ring and began to read.

When bedtime rolled around I set the book down and emerged from the sanctuary. “How’d it go tonight?” Teri asked.

I said, “It went well.”

The next evening I re-entered the study, under the pretense of continuing my academic labors with all due seriousness, and picked up where I had left off in the story. For the next week or so, though during the days I went to classes and did generally what I was supposed to do, I did none of the reading or writing I was assigned. I got further and further behind. I didn’t care; I was somewhere else and glad to be somewhere else. Teri seemed pleased with my scholarly discipline, as each evening I washed the dishes, gave her a kiss, and closed the study door behind me.

When I finished The Lord of the Rings I drew a deep breath. I felt more sound and whole than I had felt in weeks, maybe months. But, to my own surprise, I did not conclude that all that academic crap was a waste of time and I should do something else with my life, something that gave me time to read lots of fantasy novels. Instead, I experienced a strange refreshment, almost an exhilaration. My confusion and frustration seemed like small afflictions, conquerable adversaries. Barthes and Derrida weren’t so fearsome after all. I could do this.

I don’t believe that I was thinking, “Literary theory is as nothing in comparison to the power of Mordor!” Or, “If Frodo can carry that Ring to the Cracks of Doom I can write this paper on Paul Ricoeur!” Rather, I was just benefiting from spending some time away from my anxieties. We had been too intimate and needed separation. So I resumed my studies in a far better frame of mind; as a result, I did better work. I eventually completed my doctorate and began my career as a teacher, but I didn’t forget the debt I owed to that week I spent in Tolkien’s world.

I am going to skip over the next couple of decades, mostly, in order to get to the story I am primarily concerned to tell. During that period I learned a lot and grew as a teacher and a scholar. Much of my reading was in preparation for teaching; most of the rest was related to the articles and, later, books I was writing. I rarely thought that I was being deprived of the opportunity to read leisurely, because I generally taught and wrote about things that I loved. I wrote more about Auden than anything else; I spent a lot of time studying theology and then trying to write about the intersection of theology and literature. In light of my miserable first semester of graduate school, it may seem surprising that I came to teach and write about literary theory — surprising, but I hope also encouraging to those who struggle with difficult subjects. (Sometimes the things you struggle with the most turn out to be among the most interesting to think about.) I spent much of a summer in Africa — central Nigeria — and devoted a good bit of my attention to African literature.

Then, about a dozen years ago, I developed a new interest that seemed minor at the time, a mere sideline, but ended up changing my life in some fairly important ways. I got interested in Linux.

I am not sure how this happened. I had bought the first Macintosh in the spring of 1985 in order to write my dissertation — taking out a hefty loan to do so, since the machine cost about 15% of my annual salary as an instructor — and had been a faithful Mac user ever since; but not really a power user, and certainly not a coder. But in the late 90s I suspected that Apple might be coming to the end of its road, and I knew that I didn’t want to use Windows; in any event, for the first time I started exploring alternatives to the Mac, and Linux was the one I decided I wanted to learn. (I had no way to know that Steve Jobs’s return to Apple, which happened about this time, would lead to a new operating system for the Macintosh based on Unix. If I had known I probably wouldn’t have looked twice at Linux; but as it turned out, my Linux inquiries taught me to do some stuff from the command line that I never would have encountered otherwise and that worked in the same way on OS X as on Linux. So the Linux detour was worthwhile even though I never ended up abandoning the Mac.)

Linux is still not ideally user-friendly — though the Ubuntu distributions are coming closer and closer to what they need to be — but in those days it was quite rough around the edges and hardly sanded smooth in the middle. Depending on your modem or local network, you stood a chance of being able to plug in an Ethernet cord and get online without impediment, but wireless access and printing were fraught with complexities, and getting photos from a camera onto your Linux box was not for the faint of heart. It was scarcely possible to use the machine full-time without learning how to edit configuration files. I determined to master as many of these skills as I could — though in truth “master” is far too strong a word: I determined to type in commands that manuals and websites told me I should type in. And then later, perhaps, figure out what the commands meant.

I’ve told this story in some detail in some essays I published in Books & Culture, and later reprinted in my collection Shaming the Devil; I don’t want or need to go over it again here. I just want to emphasize that my forays into the world of Linux drew me back towards some of my earlier habits as a reader — as a thinker, as a person. When I entered the geeky cosmos I was reminded of how many geek-like traits I once had: I had liked numbers, I had liked solving problems and puzzles, I had wanted to know how things worked, I had been very interested in almost all the sciences, I had read little but science fiction. Wading through the copious posts of Linux aficionados, looking for very specific answers to very specific questions about matters like how and when to run fsck, I couldn’t help noticing the jokes, the signatures with real or bogus quotations from Tolkien, the ridiculous avatars … these were in some weird way my people, in the sense of that phrase I grew up with in the South: they’re from where I’m from, they’re built out of the same blocks. Over the years I had moved far away from them just as I had moved away from the South; my accent had changed, as had my tastes and preferences; but I discovered that I still recognized my origins. That kid with the Golden Book of Astronomy, a cheap telescope, and a geology set? Yeah, that was me. That is me.

So began the process of reverting to type — of reclaiming an identity that I had misplaced over the years, that I had had to misplace, perhaps, or so I unconsciously thought, in order to succeed at the odd vocation of Professor of Literature.

The fine English writer of detective stories Reginald Hill once wrote:

I still recall with delight as a teen-ager making the earth-shaking discovery that many of the great ‘serious novelists,’ classical and modern, were as entertaining and interesting as the crime-writers I already loved. But it took another decade of maturation to reverse the equation and understand that many of the crime writers I had decided to grow out of were still as interesting and entertaining as the ‘serious novelists’ I now revered.

As I reverted to my original type, I didn’t lose any affection for or interest in the great writers I had been teaching for so many years. Least of all did I turn away from Auden, who was a geeky kid himself, obsessed with the machinery of lead mining and knowledgeable about metallurgy, who grew up to be that rare thing, a scientifically literate poet. (He even wrote some poems about articles he read in Scientific American, to which he subscribed.) Homer remained a great joy to me, as did Dostoevsky, Virginia Woolf, Wole Soyinka, and all the rest I continued to share with my students. I just managed to re-open a few doors I had inadvertently closed.

Some years ago the novelist Neal Stephenson answered a bunch of questions from Slashdot readers and in the process told this story:

A while back, I went to a writers’ conference. I was making chitchat with another writer, a critically acclaimed literary novelist who taught at a university. She had never heard of me. After we’d exchanged a bit of small talk, she asked me “And where do you teach?” just as naturally as one Slashdotter would ask another “And which distro do you use?”

I was taken aback. “I don’t teach anywhere,” I said.

Her turn to be taken aback. “Then what do you do?”

“I’m … a writer,” I said. Which admittedly was a stupid thing to say, since she already knew that.

“Yes, but what do you do?”

I couldn’t think of how to answer the question — I’d already answered it!

“You can’t make a living out of being a writer, so how do you make money?” she tried.

“From … being a writer,” I stammered.

At this point she finally got it, and her whole affect changed. She wasn’t snobbish about it. But it was obvious that, in her mind, the sort of writer who actually made a living from it was an entirely different creature from the sort she generally associated with.

“Because she’d never heard of me,” he later comments, “she made the quite reasonable assumption that I was … so new or obscure that she’d never seen me mentioned in a journal of literary criticism, and never bumped into me at a conference. Therefore, I couldn’t be making any money at it. Therefore, I was most likely teaching somewhere. All perfectly logical. In order to set her straight, I had to let her know that the reason she’d never heard of me was because I was famous.”

It wasn’t inevitable that the writer with whom Stephenson had this conversation would be female, but it wasn’t unlikely either. These are delicate matters, but I think it’s fair to say that one of the beliefs I had unwittingly, or half-wittingly, acquired over the years was that writers like Stephenson are boys’ writers. They write about things that most women aren’t interested in, and that adult men — or rather, highly educated adult men — or rather, adult men highly educated in fields other than mathematics and the sciences — aren’t especially drawn to either. To become a mature adult, I had come to think, is to realize (as Tolkien failed to) that there aren’t any significant female characters in The Lord of the Rings, and that that’s a problem, and that such fictional situations should be avoided. To become a mature adult is to realize (as Neal Stephenson failed to) that a novel whose momentum is driven by gadgetry and badassery rather than nuanced character development is … unsophisticated at best, and immature in a distinctively boyish way at worst.

Such were the thoughts, or pre-thoughts, or gauzy intuitions, that had been going through my mind for some time. They may not be defensible; they may be as insulting to female and “mature adult male” readers as they are to Tolkien and Stephenson. But it seems to me that a great many people think something of the kind, as may be evidenced by the fact that only quite recently have Neal Stephenson’s novels gotten anything like the same kind of attention in newspaper and magazine book review sections as the novels of Joyce Carol Oates. (And of course, novels that are perceived as having a largely or exclusively female audience don’t often get reviewed in such venues either — indeed their authors have more ground to make up than writers like Stephenson and Lev Grossman and Michael Chabon, who in their varying ways cross over between “literary fiction” and “genre fiction.”) (In general I dislike using scare quotes, but if they’re ever appropriate they’re appropriate in that previous sentence.)

There is of course a pile of critical commentary on these matters. Chabon has in recent years written several essays defending and celebrating genre fiction; P. D. James added a postscript to her Death in Holy Orders arguing (against her predecessor Dorothy Sayers) that detective stories can be fully literary; the Reginald Hill quotation I cited comes from a similar apologetic. Ursula K. Le Guin has written, eloquently,

To conflate fantasy with immaturity is a rather sizeable error. Rational yet non-intellectual, moral yet inexplicit, symbolic not allegorical, fantasy is not primitive but primary. Many of its great texts are poetry, and its prose often approaches poetry in density of implication and imagery. The fantastic, the marvellous, the impossible rode the mainstream of literature from the epics and romances of the Middle Ages through Ariosto and Tasso and their imitators, to Rabelais and Spenser and beyond.

And Iain Banks once slyly pointed out that he doesn’t mind talking about “genre fiction” as long as we acknowledge that literary fiction is one of the genres — or was it Iain M. Banks who pointed that out? As Iain, he wrote novels that are regularly called literary; as Iain M., novels that get pegged as SF. Which indicates that the distinction was a live one in his own mind, though I doubt that he made it in order to denigrate either kind of book. (I believe Banks wavered, once or twice, about whether to use the initial, much as Kierkegaard used to agonize about when to use a pseudonym or which one was appropriate.)

As I have returned to the regular reading of science fiction and mystery novels — I haven’t, so far, been drawn back to the Western, though one of my favorite books of criticism is Jane Tompkins’s West of Everything — I have occasionally tried to figure out whether I’m reading the books in a fundamentally different way than I read the kinds of books I teach. I believe the answer is Yes — though I will immediately add that I don’t recommend reflecting on how you’re reading while you’re reading. Some experiences (you know what they are) are better done unreflectively. The mind can turn on itself in some pretty funky ways.

That said, yes, there are differences. I am tempted to say that literary fiction is fiction you can annotate as you read without compromising your understanding — indeed, through annotation you often read better, more deeply — while you aren’t likely to want to underline very often in a detective story or a space opera and probably shouldn’t. And should you annotate a Neal Stephenson book, you’ll be doing so for very different reasons than the ones that motivate you to underline passages in a Joyce Carol Oates book.

A world of other differences is contained within this single practice of manual responsiveness. We can understand a good deal about reading by considering not what our eyes do but rather how our hands are occupied. The reader who uses her hands only to turn pages is engaged in a significantly different activity than the one who holds a pencil; and that reader does something subtly (but significantly) different than the one who holds a highlighter. Similarly, I am convinced that one reason e-readers are helpful to the cause of reading is that they give people addicted to texting something to do with their thumbs while they read.

Because I am a teacher and scholar, I do much of my reading while thinking about what I am going to do with that reading. I consider which passages I should quote in an article or book, or what elements of the text to call my students’ attention to. When reading about some event on page 224 I remember a conversation about a hundred pages back and pause to look it up; and then, if necessary, to cross-link the two passages. This has become so habitual to me that, more than once, I have found myself reading a Harry Potter book or a Nero Wolfe mystery with a pencil in hand. When that happens I toss the instrument aside in annoyance and then try to stop thinking about how empty my hand is.

In many respects, going back to the kinds of books I used to read has also meant going back to the kinds of reading habits I used to have. Just as there was a point in my life when I had to remind myself to grab that pencil, the time eventually came when I had to remind myself to leave it where it was and grasp the book (or the Kindle) in my two otherwise empty hands. The object now was not to prepare for class or develop a scholarly argument, but rather to become lost in a book, as I once was often; to be self-forgetful for a while. Indeed, I wonder whether it’s significant that my reversion to type started happening smack in the middle of middle age, in a period of life when the world is almost always too much with us, when time alone is rare and, let’s face it, rarely seized — especially by people with smartphones.

Such people — a category I belong to — are the intended audience of my book The Pleasures of Reading in an Age of Distraction. I wrote that book largely to encourage people who used to take great pleasure in reading but feel that they can’t any longer and are tempted to despair about the whole business. I won’t rehash the argument of that book here, or repeat its account of how I managed to overcome, mostly, my own distractions. I just want to emphasize that the problem of distraction came home to me as I was reading more and more of the kinds of books I read when I was young: books that ask for long periods of undivided attention. “Page-turners,” we sometimes call them, which means that when you get to the end of a page you should turn it rather than pause to check your phone for messages. Scholarly, critical, pedagogical reading suffers less from an age of distraction than what I call “rapt” reading, “lost-in-a-book” reading, because the scholarly reader is pausing often anyway to underline and annotate. I didn’t realize this until I resumed my interest in SF and mysteries.

But my reversion to type wasn’t just a matter of becoming once more a reader of genre fiction: it involved also a recovery of my fascination with science and mathematics. The two aren’t unrelated, of course. The reader of Stephenson’s Cryptonomicon learns a lot about cryptography, analog computing, and various implications of the principles of hydraulics. Kim Stanley Robinson’s Mars trilogy teaches you a lot about why there could actually be elevators that reach into space and how one might go about taking a cold dry planet and turning it into a warm green one. These are the sorts of topics that send even a mildly curious person running to Wikipedia and from there to who knows what. (Here’s another case of synchronicity: I wonder whether I would have pursued my scientific inquiries so vigorously, or at all, if I hadn’t had Wikipedia to consult at the drop of every quantitative hat.)

I don’t want to overstress what I’ve learned: I remain mathematically hamstrung and scientifically semi-barbaric. But I know a lot more than I did a decade ago. I can write simple Perl and Python scripts; I can explain to children how refrigerators and computers work (“mostly” and “sort of,” respectively); I even staggered through David Berlinski’s A Tour of the Calculus, though that experience offered no encouragement but rather a renewal of my long-ago feelings of abject incompetence. Still, I read the book and did many of the exercises. Don’t ask me how successfully I did them.

I pursued these matters out of relatively pure enthusiasm, delight in stretching parts of my brain that hadn’t been used much in a few decades. But in devoting so much of my leisure reading to books by scientists, I ended up, quite inadvertently, changing my views about my own profession. Let me try to illustrate what I mean by quoting someone who takes a considerably more extreme view of these matters than I do, the science writer Timothy Ferris. A few years ago he wrote a piece for Wired called “The World of the Intellectual Vs. the World of the Engineer”; here’s an excerpt:

Many worthy thinkers fear that “the life of the mind” is being crowded out by the current explosion of scientific information and technological innovation. “We are living in an increasingly post-idea world,” warns Neal Gabler, in a New York Times op-ed essay mourning the loss of an era when “Marx pointed out the relationship between the means of production and our social and political systems [and] Freud taught us to explore our minds.”

But in what sense is this a loss? Freud discovered nothing and cured nobody. Marx was a hypocrite whose theories failed in about as hideously spectacular a way as can be imagined.

What is fading, it seems to me, is not the world of ideas but the celebration of big, pretentious ideas untethered to facts. That world has fallen out of favor because fact-starved ideas, when put into practice, produced indefensible amounts of human suffering, and because we today know a lot more facts than was the case back when a Freud could be ranked with an Einstein.

Perhaps not incidentally, in allowing that engineers, and by extension many scientists, aren’t intellectuals, Ferris is bowing to a usage that seems to have gotten started eighty or so years ago. It was in the 1930s that the great mathematician G. H. Hardy was heard to ask, “Have you noticed how the word ‘intellectual’ is used nowadays? There seems to be a new definition that doesn’t include Rutherford or Eddington or Dirac or Adrian or me. It does seem rather odd.”

In any event, Ferris wants to make a near-absolute distinction between the fact-free, theory-obsessed world of the Intellectual and the fact-rich, theory-indifferent world of the Engineer. But the distinction won’t hold in so strict a form. The history of science is littered with examples of theories that outrun the available evidence; and neither Marx nor Freud was a pure theorist: each made empirical contributions to his field of concern.

Nevertheless, there’s something to what Ferris is saying. As I look back on the history of literary study for the past 75 years or so, it seems that the period from 1940 to 1980 — The Age of the Text, one might call it — was strikingly incurious about all that goes into the making and using of books and other written things. From the New Critics to the deconstructionists, there was a general willingness to take the physical existence of the text for granted and to focus all one’s attention on how it functions or does not function. From this point of view, what’s interesting about the rise of the New Historicism in the 1980s is its reawakened curiosity about a world of facts and things. People write books: who are those people? How were they educated? What did they do in their spare time? What did they wear? What did they see when they went to the theater? The rallying cry of the poet William Carlos Williams was “No ideas but in things,” and one might say that that was the message of the New Historicism as well.

And this line of thought leads, or ought to lead, to a renewal of textual criticism and bibliography as well. Books are things, as are broadsides, magazines, and newspapers. The material history of poems and stories is endlessly fascinating, and there is a great deal more to learn about it than we currently know. People have never received “the text” in some Platonically ideal form, but in material objects. (A Kindle is as material as a codex.) These more historical ways of thinking about literature seem non-literary to people who were educated in The Age of the Text, but to many of us they make literature more interesting by re-connecting it to a dense world of wildly varying people, facts, and objects.

My apologies for the digression into a potted history of my professional world, which non-professors will probably find boring and professors will surely find simplistic. (“You have failed utterly to indicate the epistemological aporias which the New Historicism inherits from deconstruction!”) But what it suggests — and what lots of other evidence suggests — is a general cultural move from the dominance of the Intellectual to the dominance of the Engineer. The icon of that earlier phase is Marshall McLuhan; the icon of the phase we’re in now is Steve Jobs. The guy who talks about ideas yields the stage to the guy who makes things. This cultural shift has both helped to prompt my recovery of old reading practices and made that recovery far easier than it would have been in an earlier era.

Everyone seems to agree that being multilingual is a great boon to one’s thinking. But languages have always been one of my intellectual weak points. I can’t speak any language, other than English, with skill. I can read some German, some French, a little Italian, a little Latin. Over the past few years I have worked more on the Latin than anything else, largely for professional reasons, but it’s hard at my age to memorize all those conjugations and declensions. (I feel about the inflected character of Latin much the way I feel about the elliptical orbits of the planets and moons.) I once read that a person’s ability to learn a new language starts to decline at six years old; so I’m rather past my peak performance now, and have little hope of making serious progress.

But I also have thought that I might be better served to focus my chief energies on learning the languages of science and mathematics — though possibly I will learn them no better than I have learned Latin, since the part of my brain that flounders when confronted with case endings is equally flummoxed by quadratic equations and peptide sequences. But now that I’m in my fifties I can’t pretend that I am going to be able to do very many of the things worth doing that I haven’t already done; I have to be selective. And it seems to me that there is a strong case to be made that attempting to bridge the cultural gap that separates English speakers from French speakers is no more valuable than confronting the chasm that separates the Two Cultures of the sciences and the humanities.

A great many people hear the phrase “The Two Cultures” and think C. P. Snow — rightly enough, of course, because he coined the phrase. But few have read Snow’s whole argument, which was originally delivered as a lecture at Cambridge University and then published as a very small book called The Two Cultures and the Scientific Revolution. (I have an old hardcover copy that weighs in at 58 pages — it would make a great Kindle Single.) It turns out that, once he identifies the two cultures, Snow isn’t very interested in how they differ or how they might be brought together, though he was the person to whom G. H. Hardy made his comment about being excluded from the ranks of intellectuals. Snow’s concerns are purely social and economic. He laments the scientific and technological ignorance of the “traditional culture” — by which he means the people who run the major academic institutions of Great Britain — because he believes that that ignorance is causing the country to fall farther behind in a world where economic success and political power will increasingly be achieved via scientific expertise and technological innovation. If he thinks that the division of the educated world into two cultures is otherwise harmful, he doesn’t say so.

It is harmful, though, not least because it trains people in mutually reinforcing bad habits: simultaneously (a) to sneer when outsiders to their own discipline “get it wrong” and (b) to dismiss the work of the other culture as insignificant. It seems to me that my own professional world, that of the humanities, and of literary study in particular, is especially guilty of the latter. I wonder how many English professors who don’t know the laws of thermodynamics or Newton’s laws of motion have nevertheless gone to the trouble of reading and quoting bits from philosophers of science — Thomas Kuhn, Paul Feyerabend — who seem, they think, to give them plausible justifications for their ignorance. (I’ll omit here mention of the Sokal Affair, because I think that miserable little fiasco is less telling than it is usually taken to be. The general attitudes are more troubling.)

Scientists, of course, can be equally dismissive of the humanities as not producing real knowledge, and can be remarkably harsh when humanists or artists have anything to say about science. Richard Feynman absolutely despised Auden’s poem “After Reading a Child’s Guide to Modern Physics” — though I don’t think he understood it, and he despised poetry and the arts in general — and mathematicians have, in some cases gleefully and in some cases ruefully, pointed out that David Foster Wallace’s Everything and More: A Compact History of Infinity is full of errors. In general, I think the rueful ones take the better approach, though it’s perfectly understandable that people who really understand the mathematics of infinity would be frustrated by the jaunty assurance with which DFW perpetrates his errors. Interestingly, one of the harshest reviews of the book came from a mathematician-novelist, Rudy Rucker, who called the book a “wretched farrago.” Jim Holt, by contrast, acknowledges the errors but is willing to live with them because he hopes that DFW’s evident enthusiasm for mathematics will inspire readers. “What it offers, in the end, is a purely literary experience.”

So there is a genuine problem when writers pretend to greater scientific and literary expertise than they actually possess. (The same is true for readers: the fifteen Stephen Jay Gould and half-a-dozen Richard Dawkins books I’ve read don’t entitle me to pontificate about evolutionary biology.) But the proper response to this problem is not for Intellectuals to shun, or flee in fear from, the realm of the Engineers; the proper response is for those of us who are interested in closing those gaps to make efforts in that direction. Why?

There are several reasons. One of them is anthropological — that is to say, in a quite serious sense is concerned with the differences among intellectual cultures. Let’s return to Auden and Feynman for an illustration. Not only in “After Reading a Child’s Guide to Modern Physics” but in several other poems, Auden meditates on the increasing dominance of scientists and technologists: sometimes with ironic detachment and complexity of feeling, as in “Under Which Lyre” (1946), but increasingly, as he got older, with suspicion and resentment, as in “Moon Landing” (1969). It is easy to get caught up in those suspicions, and to identify with the poet when he identifies himself with the worshippers of “precocious Hermes” rather than the acolytes of “pompous Apollo” — even, or especially, when he admits that the Hermetics shouldn’t be put in charge of anything serious: “The earth would soon, did Hermes run it, / Be like the Balkans.” (We’re too subtly thoughtful to rule!)

All the more reason, then, to read something like James Gleick’s Genius: The Life and Science of Richard Feynman and learn just how varied and complex were the responses made by the scientists at Los Alamos to their own work. Or — even more important — discover this:

Feynman, who had once astonished the Princeton admissions committee with his low scores in every subject but physics and mathematics, did believe in the primacy of science among all the spheres of knowledge. He would not concede that poetry or painting or religion could reach a different kind of truth. The very idea of different, equally valid versions of truth struck him as a modern form of cant, another misunderstanding of uncertainty.

Feynman’s view poses a direct and vital challenge to the belief, expressed by Auden in his Apollo/Hermes distinction and in different language by Stephen Jay Gould in his idea of “non-overlapping magisteria,” that knowledge is not one, that truthfulness is always culturally relative — even when the cultures involved are produced not by language, class, or nationality but by education.

I am not saying that Feynman is right and Auden and Gould wrong; I don’t know whether Feynman is right or not; I am just saying that I needed to hear and consider his argument, and was unlikely to hear it from a poet. Gould’s NOMA idea has a prima facie appeal, but that may not be because it’s intrinsically probable but because it offers an easy way to avoid uncomfortable arguments that humanists are likely to lose. And again: this is a matter of understanding different cultures, not a matter of knowing anything substantive that Feynman knew. Gleick’s book is a work of historical biography, not a science text. At this point in my life I am not likely ever to understand Feynman’s key ideas, which certainly limits my ability to assess his beliefs about the unitary character of knowledge — limits it, but does not eradicate it. I can still think; I can still learn.

The powerfully mathematical character of modern physics — which the physicist Lee Smolin has claimed “favors virtuosity in calculating over reflection on hard conceptual problems” — means that I have zero chance of understanding anything meaningful about string theory or other recent developments. Another anecdote from Gleick: “A physicist once asked [Feynman] to explain in simple terms a standard item of the dogma, why spin-one-half particles obey Fermi-Dirac statistics. Feynman promised to prepare a freshman lecture on it. For once, he failed. ‘I couldn’t reduce it to the freshman level,’ he said a few days later, and added, ‘That means we really don’t understand it.’” Maybe. Or maybe there are some vital truths about the universe that can’t be explained at the freshman level, or the middle-aged-English-professor level. If so, that in itself is worth knowing.

In other fields, especially the biological sciences, I think I can do a little better. For me it has been a great intellectual exercise to read Dawkins’s less polemical books, the ones with limited excursions on his anti-religion hobby-horse, especially Climbing Mount Improbable and The Greatest Show on Earth. This is because Dawkins is really, really good at explaining how evidence for certain theories — or “theorums,” as he has awkwardly chosen to call them — builds over time. In an age of unprecedentedly excellent science writers, I think he does this better than anyone else. How do you climb Mount Improbable? — One step at a time, over millions of years, watching as incremental genetic changes close off certain options while opening up new ones that have tiny immediate effects which prove immensely powerful in the long run. The very long run, longer than we can easily imagine.
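
If you want the flavor of that argument in miniature, here is a toy Python version of the cumulative-selection exercise Dawkins describes in The Blind Watchmaker: my own sketch, not his code, and the fixed target string is a pedagogical cheat he is careful to flag, since real selection has no goal in view. All it shows is that keeping small improvements, generation after generation, reaches a point that pure random shuffling effectively never would.

```python
import random
import string

# Toy illustration of cumulative selection (not Dawkins's own code).
# The fixed target is a teaching cheat: real evolution has no target.
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = string.ascii_uppercase + " "

def mutate(parent, rate=0.04):
    """Copy the parent string, occasionally miscopying a character."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

def score(candidate):
    """Count the characters that already match the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generations = 0
while parent != TARGET:
    # Each generation: many slightly miscopied offspring; keep the fittest.
    # Including the parent means the match can never get worse.
    offspring = [parent] + [mutate(parent) for _ in range(100)]
    parent = max(offspring, key=score)
    generations += 1

print(f"Matched the target after {generations} generations")
```

Run it and it should settle within a few hundred generations; a single-step search over whole random strings would, for practical purposes, never finish. That difference is the gentle back slope of the mountain.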

I have similar thoughts about Danny Hillis’s wonderful little book The Pattern On The Stone: The Simple Ideas That Make Computers Work, which was the book that made me believe I could learn at least a little scripting, if not full-fledged programming. Like Dawkins’s account of how evolutionary biology builds up reliable historical evidence, Hillis’s account of computer design explains how the simplest elements — Boolean logic applied to the most elementary distinction of all, that between on and off, yes and no, 1 and 0 — ultimately yield the staggering power of modern computers and their networks. How cool to think about. How useful to internalize these procedures — or at least to attempt to — in order to structure and buttress one’s own thinking.
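
To make that concrete, here is a toy sketch in Python, my own illustration rather than anything from The Pattern On The Stone, showing how a handful of Boolean operations on single bits compose, layer by layer, into a circuit that adds numbers:

```python
# A toy in the spirit of Hillis's argument (not code from his book):
# a few Boolean operations on single bits compose into a machine that adds.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two one-bit numbers; return (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry bit."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def add(x_bits, y_bits):
    """Ripple-carry addition of two equal-length bit lists, least significant bit first."""
    carry, result = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 6 + 3, bits written least significant first: [0,1,1] + [1,1,0] -> [1,0,0,1], i.e. 9
print(add([0, 1, 1], [1, 1, 0]))
```

Stack enough of these little circuits on top of one another, with memory and a clock, and you are most of the way, in principle, to a computer; Hillis’s gift is making each layer feel as inevitable as this one.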

I don’t want to overstress what I learn through this kind of reading. I do a lot of it, but it remains half-assed, inconsistent, completely unsystematic. Vast tracts of human knowledge remain blanks to me. But it’s important to emphasize that the best writers about science really do communicate valuable information to a lay readership, and pursue a deeply worthwhile calling. As Stephen Jay Gould wrote in the preface to his collection of essays Bully for Brontosaurus,

I deeply deplore the equation of popular writing with pap and distortion for two main reasons. First, such a designation imposes a crushing professional burden on scientists (particularly young scientists without tenure) who might like to try their hand at this expansive style. Second, it denigrates the intelligence of millions of Americans eager for intellectual stimulation without patronization. If we writers assume a crushing mean of mediocrity and incomprehension, then not only do we have contempt for our neighbors, but we also extinguish the light of excellence. The “perceptive and intelligent” layperson is no myth. They exist in millions — a low percentage of Americans perhaps, but a high absolute number with influence beyond their proportion in the population.

Well said — and importantly said. Academics who disdain this kind of writing either haven’t tried it, or have tried it and have discovered just how immensely difficult it is to do well. Writing for the two dozen or so people who share your sub-specialty is strategically and tactically simple by comparison, though clearly necessary for the advancement of knowledge. It would just be nice if more top-flight scholars thought it valuable to share with the public the knowledge so advanced.

A late digression: I don’t want to get all inside baseball here, but one of the oddest and least expected consequences of my little explorations in science and computing has involved my professional work. In recent years I have for the first time become involved with an activity that used to be at the heart of humanistic learning: textual editing. Earlier this year Princeton University Press published my critical edition of Auden’s long poem The Age of Anxiety, and I am now working on an edition of his “Christmas Oratorio” from a few years earlier, For the Time Being. At one time I thought of textual editing as the proper province of old Professor Dryasdust. I was ignorant. Turns out that it’s a great deal of fun, and challenging in ways that exercise different parts of the mind than the parts used to generate interpretations of texts. You have to learn to read a person’s handwriting, which means paying attention to how he forms letters when he makes them properly so you have a better chance of deciphering squiggles; you have to learn that even a person’s typewriting may have identifying quirks (Auden didn’t add a space after a comma). You have to develop a rather complex set of skills for comparing drafts in order to understand their sequence. You have to be disciplined, organized, methodical.
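
The mechanical core of that last skill, lining up two states of a text and seeing exactly where they diverge, is something a few lines of Python can at least gesture at. Here is a toy illustration with invented draft readings, not anything from the Auden materials:

```python
import difflib

# Two invented draft states of the same sentence: placeholders for illustration,
# not actual manuscript readings.
draft_a = "the lamp burned low upon the table and the clock said four".split()
draft_b = "the lamp burnt low on the table and the clock said half past four".split()

# A unified diff of the word sequences shows exactly where the drafts diverge.
for line in difflib.unified_diff(draft_a, draft_b,
                                 fromfile="draft A", tofile="draft B", lineterm=""):
    print(line)
```

The judgment about which reading came first, and why, is of course exactly the part no script will do for you.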

As a textual critic you’re dealing with things: autograph letters, typewritten drafts, publishers’ proofs, printed editions with eccentricities or errors. Textual criticism is highly empirical, and in that sense makes demands on the scholar quite different from the demands made upon the person interpreting a text whose accuracy and stability he or she takes for granted.

This is to say that the world of textual criticism is, in comparison to textual interpretation, several points on the dial closer to the empirical and evidentiary world of the sciences — not so much concerned with Das Ding an sich as with Das Ding in der Welt. That is perhaps why so many people in the digital humanities are also interested in the old-fashioned skills of textual criticism, bibliography, paleography, and typography. You get to deal with material stuff, and there’s something deeply reassuring and comforting about that, not in spite of but because of the intractability of stuff. It does its own thing; it makes you adapt to it; it resists. When the world resists your attempts to manipulate it, you know it’s real; and that realization simultaneously strengthens your awareness of your own realness. It pulls you up short, in the best of ways. In one of his famous lectures, Feynman said,

If it disagrees with experiment it is wrong. In that simple statement is the key to science. It does not make any difference how beautiful your guess is. It does not make any difference how smart you are, who made the guess, or what his name is — if it disagrees with experiment it is wrong.

Feynman was no epistemological naif (few particle physicists have been, for the past century or so); he understood the vagaries of interpretation, the complications inevitably introduced into any endeavor when human beings are involved. But he had also been clearly and unavoidably wrong often enough to know the value of the experience. When you work with things, with stuff, you sometimes get confronted with this distinctive kind of wrongness. It’s bracing.

So ends the digression.

By way of conclusion, a few final thoughts. When my old friend Loren Eiseley responded to C. P. Snow a few years after The Two Cultures came out, he wrote,

It is because these two types of creation — the artistic and the scientific — have sprung from the same being and have their points of contact even in division, that I have the temerity to assert that, in a sense, the “two cultures” are an illusion, that they are a product of unreasoning fear, professionalism, and misunderstanding.

For Eiseley, the two cultures are two modes of human creativity, imaginative responsiveness to the world, and he means to celebrate that responsiveness and encourage the rest of us to do the same — especially if our professional situations incline us to suspicion or resentment of others.

This could scarcely be more right, but even in Eiseley’s time the argument needed to be framed not in terms of two cultures but of many; and this is far more true today. At your local university, the population geneticist in the biology department speaks a very different language than the conservation biologist; anthropologists can be quantitative or cultural; analytic philosophers and Continental ones are often not even on speaking terms. Nobody can get a secure grip on this nearly infinite variety of inquiry and vocabulary, but every attempt to read across the boundaries of one’s own preferred practices is a tonic and a stimulant. We tell ourselves that we don’t have time for this kind of reading, but given the multiple rewards, can we afford not to take that time? Often it’s confusing, sometimes it’s clarifying, usually — if you can find the good, clear writers in the various fields — it’s a great deal of fun.

I see I’ve moved here from narration to exhortation. But that’s because I have seen the great value of heeding my whims and allowing myself to read beyond my professional demands and even my long-established sense of self. My reversion to type has been above all else interesting, and has given me an increasingly broad (and I think more accurate) sense of what reading is, or can be, for.

And that lesson in breadth of purpose applies to fiction reading too. In my return to science fiction and fantasy, I have been reminded that different kinds of stories accomplish different epistemological and ethical and hedonic goals. I thought of this as I read Steven Poole’s review of Haruki Murakami’s 1Q84, which moves towards a meditation on Murakami’s immense popularity:

Murakami’s success speaks to a hunger for what he is doing that is unusual. Most characters in the modern commercial genre called “literary fiction” take for granted a certain unexamined metaphysics and worry exclusively about the higher-level complexities of circumstance and relationships. Throughout Murakami’s oeuvre, on the other hand, his characters never cease to express their bafflement about the nature of time, or change, or consciousness, or moral choice, or the simple fact of finding themselves alive, in this world or another. In this sense, Murakami’s heroes and heroines are all philosophers. It is natural, then, that his work should enchant younger readers, to whom the problems of being are still fresh, as well as others who never grew out of such puzzlements – that his books should seem an outstretched hand of sympathy to anyone who feels that they too have been tossed, without their permission, into a labyrinth.

Fiction is, among other things, an aid to reflection: a means by which we can more vividly and rigorously encounter the world and try to make sense of it. But we vary in our interpretative needs: the questions that absorb some of us never occur to others. Every genre of fiction puts certain questions in brackets, or takes their answers as given, in order to explore others. Not even the greatest of writers can keep all the balls in the air at once: some have to sit still on the ground while the others whirl. People who come to a book by Murakami, or Neal Stephenson, or even Ursula K. Le Guin with the questions they would put to a Marilynne Robinson novel are bound to be disappointed and frustrated. But if we readers attend closely to the kinds of questions a book is asking, the questions it invites from us, then our experience will be more valuable. And the more questions we can put to the books we read — in the most generous and charitable spirit we can manage — the richer becomes our encounter not just with the books themselves but with the world they point to.

In a sense I am only talking here about expanding my repertoire of analogies, my ability to make illuminating and meaningful comparisons. For many years now Douglas Hofstadter, drawing on the work of the mathematician Stanislaw Ulam, has been convinced that the secret to creating artificial intelligence lies in teaching machines to recognize analogies. (Ulam says somewhere that it’s all about “as”: we see marks on a piece of wood pulp as a portrait of a beloved child, a cairn of stones as a monument to a dead chieftain.) Similar principles underlie the methods of Google Translate, which collects an enormous corpus of sentences and then tries to match your input to something in that corpus, and Apple’s “digital personal assistant,” Siri. Siri can’t parse what you say to her unless she can connect to the network, which undertakes a comparison of your utterance to other utterances on record. All this might be called brute-force analogizing, but it seems to me that my own understanding develops as I pursue the same method, though with far less force and (I hope) less brutishness.
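
For what it’s worth, the brute-force version of the idea is simple enough to sketch in a few lines of Python. This is only a crude toy (real translation and speech systems are enormously more sophisticated), but it shows the analogical shape of the thing: hold a corpus, and answer a new utterance by finding what it most resembles.

```python
# A crude sketch of brute-force analogizing: match a new utterance to the
# most similar sentence in a tiny "corpus". Real systems are vastly more
# sophisticated; this only illustrates the shape of the idea.
from difflib import SequenceMatcher

corpus = [
    "What is the weather like today?",
    "Set an alarm for seven in the morning.",
    "How do I get to the nearest library?",
]

def closest_match(utterance, sentences):
    # score each stored sentence by rough textual similarity
    return max(sentences,
               key=lambda s: SequenceMatcher(None, utterance.lower(), s.lower()).ratio())

print(closest_match("what's the weather today", corpus))
# prints: What is the weather like today?
```

Swap in a better similarity measure and a vastly larger corpus and you approach the real thing; the shape of the procedure stays the same.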

In one of his most beautiful poems, Richard Wilbur writes, “Odd that a thing is most itself when likened.” And this is true no matter the thing: a book becomes more fully itself when we see both how it resembles other books and how it differs from them; one discipline of study takes on its proper hues only when we see its relations to other disciplines that stand close to it or very far away. My repertoire of analogies is my toolbox, or my console of instruments, by which I comprehend and navigate the world. It can’t be too large; every addition helps, at least a bit. And that’s why I’m thankful for my gradual recovery of the books I adored, and thoughts I lovingly entertained, when I was forty years younger.