There’s no way we could ever carry out any experiment to test for the multiverse’s existence in the world, because it’s not in our world. It’s an article of faith, and not a very secure one. What’s more likely: a potentially infinite number of useless parallel universes, or one perfectly ordinary God?
But when you divide the brain into bitty bits and make millions of calculations according to a bunch of inferences, there are abundant opportunities for error, particularly when you are relying on software to do much of the work. This was made glaringly apparent back in 2009, when a graduate student conducted an fMRI scan of a dead salmon and found neural activity in its brain when it was shown photographs of humans in social situations. Again, it was a salmon. And it was dead.
An estimated 10 billion people will inhabit that warmer world. Some will become climate refugees—moving away from areas where unbearable temperatures are the norm and where rising water has claimed homes. In most cases, however, policy experts foresee relatively small movement within a country’s borders. Most people—and communities, cities and nations—will adapt in place. We have highlighted roughly a dozen hotspots where climate change will disrupt humanity’s living conditions and livelihoods, along with the strategies those communities are adopting to prepare for such a future.
— What Life Will Be Like on a Much Warmer Planet – Scientific American. This article + infographic from SciAm (paywalled, I think, and if so, sorry) is pretty good, but I’d love to see another post on places that will benefit from climate change.
Now, before I go any further: I think anthropogenic climate change is real and is going to be, overall, enormously destructive. I favor serious global governmental intervention to head off, if possible, the worst of it.
But it’s not going to be bad for everyone, and it would be fascinating to learn who will benefit and how. Yet few journalists or scientists want to tell those stories, for fear that they’ll reduce public concern.
Edward Wilson, an old master.
You need not see what someone is doing
to know if it is his vocation,
you have only to watch his eyes:
a cook mixing a sauce, a surgeon
making a primary incision,
a clerk completing a bill of lading,
wear the same rapt expression,
forgetting themselves in a function.
How beautiful it is,
that eye-on-the-object look.
— W. H. Auden, “Sext”
It follows also that this new vision of ‘natural theology’ is equally concerned, let me also state at the start here, to be flexible in a variety of ways for use in different contexts and genres, for different audiences, and by means of varying forms of communication. The term ‘apologetics’, unfortunately, tends to come with as much bag and baggage from the era of brash modernity as does its cousin ‘natural theology’; but its history of association with rationalistic brow-beating is one we need to live down. The art of giving a reasoned, philosophically- and scientifically-related, account of the ‘hope that is in us’ in a public space is a Christian duty, and it may take a great variety of forms. As discussed last time in relation to Nicholas Wolterstorff’s analysis of Thomas Aquinas’s own variety of uses for his own Five Ways, ‘natural theology’ must indeed at times be used apologetically and even polemically, when the occasion demands it: that is, if one is called to public debate in the university, in public political contestation, or in the press. There is a huge cultural interest in seeing theologians and philosophers of religion perform this undertaking in discussion with secular science, and we undermine our own credibility if we fail to take on this task with grace, clarity and humour.
But more often, I find, I am called to this task as a believer, as an academic, or as a priest, in quieter, less overt, but no less significant public contexts: in being asked intrigued questions about evolution and theology by the seeker who wanders into Ely cathedral looking for something, she knows not what; by the half-believer who wonders if science does indeed render Christianity invalid; by the generation of my children’s age for whom in so many cases the church has seemingly lost all intellectual and moral credibility; for those hoping to deepen their faith spiritually or make it more intellectually mature; and for the doubting amongst the faithful. The disposition, attentive prayerfulness and bodily grace with which these conversations must go on is especially crucial: this task is not about the soap-box, but it’s not for the faint-hearted or defensive either. It has to be as philosophically and scientifically sophisticated as it is spiritually and theologically cogent; in short, it must not merely dazzle; it must more truly invite and allure.
Together with philosopher David Wasserman, Asch wrote in 2005 that using genetic tests to screen out a fetus with a known disability is evidence of pernicious “synecdoche.” Ordinarily, synecdoche is a value-neutral figure of speech, in which some single part stands for the whole—as in the common use of “White House” to stand for the executive branch of government. But Asch and Wasserman’s meaning was more loaded: prenatal genetic tests, they argued, too often let a single trait become the sole characteristic of a fetus, allowing it to “obscure or efface the whole.” In other words, genetic data, once known, generally become the only data in the room. Taking a “synecdochal approach” to prenatal testing, Asch and Wasserman warned—in the era just prior to consumer genetic sequencing—allows one fact about a potential child to “overwhelm and negate all other hoped-for attributes.”
We won’t know what Asch would have made of 23andMe, designer babies, or broader claims for personal genomics. But her intellectual legacy only grows more relevant in the era of ever-cheaper, personalized genetic data. Asch understood that there are plenty of things technologies like prenatal genetic testing can tell us. But the choices and challenges in defining a life worth living, and living well—it may be that these aren’t technological problems at all.
But what was the actual impact of coffeehouses on productivity, education and innovation? Rather than enemies of industry, coffeehouses were in fact crucibles of creativity, because of the way in which they facilitated the mixing of both people and ideas. Members of the Royal Society, England’s pioneering scientific society, frequently retired to coffeehouses to extend their discussions. Scientists often conducted experiments and gave lectures in coffeehouses, and because admission cost just a penny (the price of a single cup), coffeehouses were sometimes referred to as “penny universities.” It was a coffeehouse argument among several fellow scientists that spurred Isaac Newton to write his “Principia Mathematica,” one of the foundational works of modern science.
Coffeehouses were platforms for innovation in the world of business, too. Merchants used coffeehouses as meeting rooms, which gave rise to new companies and new business models. A London coffeehouse called Jonathan’s, where merchants kept particular tables at which they would transact their business, turned into the London Stock Exchange. Edward Lloyd’s coffeehouse, a popular meeting place for ship captains, shipowners and traders, became the famous insurance market Lloyd’s.
And the economist Adam Smith wrote much of his masterpiece “The Wealth of Nations” in the British Coffee House, a popular meeting place for Scottish intellectuals, among whom he circulated early drafts of his book for discussion.
Space colonies. That’s the latest thing you hear, from the heralds of the future. President Gingrich is going to set up a state on the moon. The Dutch company Mars One intends to establish a settlement on the Red Planet by 2023. We’re heading towards a “multi-planetary civilization,” says Elon Musk, the CEO of SpaceX. Our future lies in the stars, we’re even told.
As a species of megalomania, this is hard to top. As an image of technological salvation, it is more plausible than the one where we upload our brains onto our computers, surviving forever in a paradise of circuitry. But not a lot more plausible. The resources required to maintain a colony in space would be, well, astronomical. People would have to be kept alive, indefinitely, in incredibly inhospitable conditions. There may be planets with earthlike conditions, but the nearest ones we know about, as of now, are 20 light years away. That means that a round trip at 10 percent the speed of light, an inconceivable rate (it is several hundred times faster than anything we’ve yet achieved), would take 400 years.
But never mind the logistics. If we live long enough as a species, we might overcome them, or at least some of them: energy from fusion (which always seems to be about 50 years away) and so forth. Think about what life in a space colony would be like: a hermetically sealed, climate-controlled little nothing of a place. Refrigerated air, synthetic materials, and no exit. It would be like living in an airport. An airport in Antarctica. Forever. When I hear someone talking about space colonies, I think, that’s a person who has never studied the humanities. That’s a person who has never stopped to think about what it feels like to go through an average day—what life is about, what makes it worth living, what makes it endurable. A person blessed with a technological imagination and the absence of any other kind.
In English we speak about science in the singular, but both French and German wisely retain the plural. The enterprises that we lump together are remarkably various in their methods, and also in the extent of their successes. The achievements of molecular engineering or of measurements derived from quantum theory do not hold across all of biology, or chemistry, or even physics. Geophysicists struggle to arrive at precise predictions of the risks of earthquakes in particular localities and regions. The difficulties of intervention and prediction are even more vivid in the case of contemporary climate science: although it should be uncontroversial that the Earth’s mean temperature is increasing, and that the warming trend is caused by human activities, and that a lower bound for the rise in temperature by 2200 (even if immediate action is taken) is two degrees Celsius, and that the frequency of extreme weather events will continue to rise, climatology can still issue no accurate predictions about the full range of effects on the various regions of the world. Numerous factors influence the interaction of the modifications of climate with patterns of wind and weather, and this complicates enormously the prediction of which regions will suffer drought, which agricultural sites will be disrupted, what new patterns of disease transmission will emerge, and a lot of other potential consequences about which we might want advance knowledge. (The most successful sciences are those lucky enough to study systems that are relatively simple and orderly. James Clerk Maxwell rightly commented that Galileo would not have redirected the physics of motion if he had begun with turbulence rather than with free fall in a vacuum.)
The emphasis on generality inspires scientific imperialism, conjuring a vision of a completely unified future science, encapsulated in a “theory of everything.” Organisms are aggregates of cells, cells are dynamic molecular systems, the molecules are composed of atoms, which in their turn decompose into fermions and bosons (or maybe into quarks or even strings). From these facts it is tempting to infer that all phenomena—including human actions and interaction—can “in principle” be understood ultimately in the language of physics, although for the moment we might settle for biology or neuroscience. This is a great temptation. We should resist it. Even if a process is constituted by the movements of a large number of constituent parts, this does not mean that it can be adequately explained by tracing those motions.
— The Trouble with Scientism. An incredibly important and provocative essay by Philip Kitcher.
“So we aren’t any closer to unification than we were in Einstein’s time?” the historian asked.
Feynman grew angry. “It’s a crazy question! … We’re certainly closer. We know more. And if there’s a finite amount to be known, we obviously must be closer to having the knowledge, okay? I don’t know how to make this into a sensible question…. It’s all so stupid. All these interviews are always so damned useless.”
He rose from his desk and walked out the door and down the corridor, drumming his knuckles along the wall. The writer heard him shout, just before he disappeared: “It’s goddamned useless to talk about these things! It’s a complete waste of time! The history of these things is nonsense! You’re trying to make something difficult and complicated out of something that’s simple and beautiful.”
Across the hall Murray Gell-Mann looked out of his office. “I see you’ve met Dick,” he said.