Philosophy and the philosophy of science supply us, as residents of a climate-changed and climate-changing world, with four key ideas to help us make the most of an increasingly terrible situation and to help us try to avoid contributing to an even more dreadful one in the future.
The four ideas are “Pascal’s Wager,” the law of unintended consequences, “The Collingridge Dilemma,” and “the precession of the simulacra.” Add to that the “precautionary principle,” and you have the beginnings of a reliable checklist to help you plan for the brave new world ahead. A full explanation of the first four ideas can be found in a 2013 book edited by John Brockman, “This Explains Everything,” a collection of some 200 short essays in which leading scientists and thinkers explain ideas that help them make the most sense of the world. The precautionary principle simply says that if you worry that the costs or potential harm of taking a risk (with a new technology let’s say) outweigh the potential benefits, don’t take the risk. The trick here is to be absolutely honest with yourself, which turns out to be an often insurmountable problem for many “corporate persons.”
In the wake of a 1,000-year deluge from a climate-changed monster monsoon in downtown Santa Fe two weeks ago, an unnaturally warm winter, and many weeks of 90-degree-plus weather in Albuquerque, with more ahead, many of us sense that we could well be forced to make major adjustments in how we live and how we make a living in the not-too-distant future. The specter of a heat wave reaching Albuquerque like the one flattening Waco, Texas, and its surrounds at 114 to 120 degrees is a sobering prospect and might even focus the attention of die-hard climate change deniers.
How does one survive climate change extremes productively, especially when the political world is only now emerging slowly from a cocoon of denial to face the realization that we are radically unprepared for what’s ahead? Part of the answer is to stop relying on the thought processes that got us into this sorry state in the first place. That’s where the four key ideas come in.
Pascal’s Wager refers to the famous logic of the philosopher and mathematician Blaise Pascal who around 1661 laid out a line of thinking by which any concept that couldn’t be proved empirically beyond the shadow of a doubt could be acted upon if the harm/benefit ratio tilted dramatically toward the good. Pascal was concerned with belief in God, who is empirically unknowable.
Pascal reasoned that a belief in God was rational because if there is a God and you believe there is, then you are in a win-win situation. If there isn’t a God and you believe there is, no harm comes to you. If there is a God and you don’t believe there is, then you are in a lose-lose situation, and a hellish one at that. So the reasonable person would operate as if there is a God, faithfully and with diligence, partly because even if God does not exist, you’ve probably made good changes in yourself and are living a morally happier and more contented life.
Pascal’s metaphor is a perfect one for “believing” in climate change. If climate change proves to be real, and you’ve believed it is, chances are that you have made adjustments to limit its impact and help you and your world deal with it productively. If there isn’t climate change, you’ve done no harm to yourself or your world by believing that there is, because you have probably strengthened and diversified your sources of energy (to the detriment of the fossil fuel industry perhaps) and are generally in better shape for whatever other kind of disaster comes your way. If there is climate change and you’ve denied it (presuming you’re in power), then you’re in the hellish situation that we’re all in now, about to be swamped by forces we have no control over even though we’ve created them.
Some readers, I’m sure, are thinking that the metaphor falls apart because of the obvious differences between believing in science and believing in God. True enough. But believing in science isn’t believing in some text that requires us to believe literally in its every pronouncement. Believing in science is believing in the validity of a system of human thought that is continually questioning and doubting the validity of its own conclusions, testing them constantly to see if they survive scrutiny and increase their believability. For me, believing in this kind of science and its foundational humility is a profoundly reasonable belief. It’s a belief in curiosity and the progressive energy of self-doubt and the will to understand — the exact opposite of blind faith and its equivalent in science-for-hire that claims to know the truth and spins it in favor of those who employ it.
Pascal’s Wager concludes that it is perfectly rational to believe in something that is overwhelmingly apparent, if not absolutely provable, because it helps us avoid being blindsided when what is apparent becomes undeniable reality and, as in the present case, dries us up, washes us away, blows our power out and hurls us back to the living conditions of the 18th century, conditions we are not prepared to cope with.
The law of unintended consequences is an idea we see working convincingly throughout our constantly changing, technologically empowered daily lives. The idea says simply that when you add new “things” to complex systems and situations — say, massive amounts of CO2 to the atmosphere, or a new cell phone whose new tricks somehow mess up older ones — you are bound to face consequences you didn’t foresee. Of course, the fossil fuel industry as it grew in its early stages didn’t intend to contribute massively to climate change. But when its company scientists laid down the holy writ that the relationship between burning fossil fuels and a superheating planet was unreal, and that claiming otherwise was almost treasonous, they lied through their teeth to protect their profits. In doing so they colluded in a disastrous unintended consequence for all of us, even themselves and their money. This is where the Collingridge Dilemma comes in.
In 1980, a British academic, David Collingridge, saw a dilemma in technological evolution that only the precautionary principle could solve. Basically, it’s easy to change a technology — like that built around the use of fossil fuels — in its early stages, before you’ve sensed the potential for some nasty unintended consequences down the road, but almost impossible to change a technology when it has become embedded in a culture and its economy. As Collingridge put it, “When change is easy, the need for it cannot be foreseen; when the need for change is apparent, change has become expensive, difficult, and time consuming.”
Applying the precautionary principle when deciding to adopt or reject a new technology is fairly straightforward. If the technology belongs to a family of techniques that rely on hazardous fuels, materials and processes, one should take special precautions with the new technology in question. A good case in point is nuclear energy. Why would fuels with extremely hazardous byproducts and waste be used to boil water when other almost benign processes and fuels are available? The answer is not rational or scientific; it’s political and economic, which are both dog-eat-dog and devil-take-all pursuits.
This leads us to the final key idea, which comes from the French philosopher Jean Baudrillard — what he calls the “precession of the simulacra.” In an essay in “This Explains Everything,” Douglas Rushkoff describes Baudrillard’s meaning like this: “The main idea is that there’s the real world, there’s the maps we use to describe that world, and then there’s all this other activity that occurs on the map — sometimes with little regard for the territory it’s supposed to represent. There’s the real world, there’s the representation of the world, and there’s the mistaking of this simulation for reality.”
When it comes to climate change, we’ve watched as two maps, two representations, have gotten locked in a futile struggle. One map sees the world as it’s always been, based on wishful thinking and protection of investment. The other map sees a radically changing world, one with dire consequences, a new human-generated reality that has been painstakingly observed through the efforts of thousands of scientists with no vested interest in their conclusions.
Faced with a choice much like Pascal’s Wager, our leaders chose not to believe that this dire, though preventable, change was on the way. We made a mistake. We believed vested interests over dispassionate observers with nothing to gain. We chose the wrong map. We could have been in a win-win situation decades ago when technological change was relatively easy. But we didn’t choose reality. We abandoned even thinking about the precautionary principle. We chose instead the lose-lose scenario that looked profitable at the time, but that now proves to be a whirlpool of enormous costs that we may not be able to pay our way out of.
It seems likely that the only option left to us is to adapt. And how do we do that? Next week we’ll explore some possibilities.
*Nullius in verba: take nobody’s word for it