By Patrick Brown
On September 23rd, the New York Times ran a David Wallace-Wells column called “Our Adaptation to Global Warming Is Largely Fictional” featuring “a new, eye-opening paper,” entitled “Are We Adapting to Climate Change?” The paper purports to show that there is limited evidence of human adaptation to climate change across a wide range of sectors, including human deaths, agricultural productivity, crime, conflict, economic output, and damages from flooding and tropical cyclones.
The paper supposedly counters the Pollyannaish view that “adaptation has been a hugely underappreciated success story.”
Count me as someone who thinks that “adaptation has been a hugely underappreciated success story.” However, this is not just some assumption that is “taken for granted” but is instead based on examining large-scale empirical trends in climate-sensitive outcomes. For example, as it has warmed over the past several decades, we have observed almost universally higher crop yields, translating to more calories available per person, and a reduction in death rates from malnutrition and famines. Access to safe drinking water has increased, while the prevalence of climate-sensitive diseases like malaria has decreased. Furthermore, mortality rates due to extreme temperatures—both cold and heat—have declined, as have deaths from natural disasters. Finally, we have seen precipitous declines in damage from natural disasters as a fraction of the value exposed.
So what’s going on? Are we adapting to climate change or not? The “new, eye-opening” paper argues that we are not, but it does so in a misleading way.
First, let me offer what I think are intuitive, common-sense definitions of adapting to climate change:
Adaptation: Climate is changing, and we are adjusting fast enough so that outcomes stay the same over time.
No adaptation: Climate is changing, and we are not adjusting; thus, outcomes are getting worse over time.
Mal-adaptation: Climate is changing, and we are changing in ways that exacerbate climate change problems, so outcomes worsen more than they would have from climate change alone.
Super-adaptation: Climate is changing, and we are changing even faster (not necessarily because of climate change), so outcomes are improving over time.
One important nuance is that society's exposure to various climates and extreme weather hazards is also changing over time. After all, there are three times more people on the planet than there were in 1950. So, a population increase in hot parts of the world might mean more deaths from extreme heat simply because there are more people exposed to extreme heat, and that wouldn't necessarily imply mal-adaptation to climate change. Thus, studies on changes in climate-related outcomes over time should express outcomes as rates, like “deaths per number of people exposed.” When studies define adaptation in this broad (I’d argue common-sense) way, they overwhelmingly show super-adaptation. For example, an empirical assessment of global vulnerability to climate-related hazards showed clearly declining global death rates and economic damage rates for floods, droughts, extreme temperatures, and extreme winds.
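To make this concrete, here is a minimal sketch with hypothetical numbers (my own illustration, not data from any of the studies mentioned above) showing why rates, rather than absolute counts, are the right way to track vulnerability when the exposed population is growing:

```python
# Hypothetical illustration (made-up numbers, not real heat-mortality data) of why
# outcomes should be expressed as rates when the exposed population is growing.

population_exposed = {"1950s": 1_000_000, "2010s": 3_000_000}  # ~3x more people exposed
heat_deaths = {"1950s": 200, "2010s": 300}                     # absolute deaths still rise

for era in population_exposed:
    rate = heat_deaths[era] / population_exposed[era] * 100_000  # deaths per 100,000 exposed
    print(f"{era}: {heat_deaths[era]} deaths, {rate:.0f} per 100,000 exposed")

# Absolute deaths rose (200 -> 300), but the death *rate* fell (20 -> 10 per 100,000),
# which is the measure that actually speaks to adaptation in the broad sense.
```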
However, this broad definition of adaptation is not common in the academic literature, where adaptation tends to be narrowly defined as only those technologies or behaviors that are more beneficial in a changed climate than they are in an unchanged one. For example, according to some papers, if technology increases crop yields equally in the current and future climates, it should not be considered adaptation. Definitions like this allow for striking claims like the IPCC’s that adaptation is supposedly “insufficient to offset the negative impacts of climate change” for crop yields despite the fact that crop yields have been increasing steadily as the climate has warmed.
So is this narrow definition of climate adaptation the reason why the aforementioned study covered by the New York Times finds such a minimal effect of adaptation?
Not quite. In the paper, the authors say they take a broad view of adaptation, yet they manage to present a negative view of our adaptive capacity by focusing on changes in sensitivity measured relative to each decade:
“To quantify the net effect of these and any other adaptive actions that could have taken place, we estimate whether the sensitivity of a range of societal outcomes to a fixed change in climate has changed over time.”
So, in this case, it’s not an intentionally narrow definition that accounts for the small effect of adaptation they calculate. Instead, the apparent discrepancy comes down to the word sensitivity above, which is doing a tremendous amount of work. In the paper, sensitivity refers to the response of an outcome to a given climate fluctuation, but, critically, it is measured relative to the decade average for that outcome.
Let’s illustrate with an example. The paper finds maladaptation in some outcomes, like Brazilian soybean yields. This may sound odd to people familiar with South American agriculture because Brazilian soybean yields have tripled since the 1960s as Brazil has warmed.
This has been driven by growing global demand stimulating productivity improvements and thus enhancing supply. Specifically, Brazilian soybean yields have benefited from improved crop varieties, including genetically modified versions that are more resistant to pests, diseases, and adverse weather; greater fertilizer use; more precise irrigation management; and the mechanization of farming. These technological advances have overwhelmed any negative impact from climate change, so how does the paper find mal-adaptation?
Well, in the 1970s, Brazil produced about 1.5 tonnes/hectare of soybeans in normal growing seasons and about 1.42 tonnes/hectare in growing seasons that were 1°C hotter than normal (26°C), a roughly 5% decline.

By the 2010s, Brazil’s average productivity had doubled to 3.0 tonnes/hectare in normal growing seasons but fell to 2.67 tonnes/hectare in growing seasons that were 1°C hotter than normal (27°C), a roughly 11% decline.

So, in absolute terms, that’s an 88% increase in output for hot (+1°C) growing seasons (2.67 versus 1.42 tonnes/hectare)! However, the paper’s methodology defines sensitivities relative to each decade’s own average, and since an 11% relative decline is larger than a 5% relative decline, it concludes that we have become less adapted to climate change over time.
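To see how the same numbers can support both readings, here is a minimal sketch using the rounded soybean figures above; the simple percentage arithmetic is my own illustration of the two framings, not the paper's actual regression methodology:

```python
# Contrast two ways of defining "sensitivity" using the rounded Brazilian soybean
# numbers above (illustrative arithmetic only, not the paper's regression framework).

# Yields in tonnes/hectare: (normal growing season, season 1°C hotter than normal)
yields = {"1970s": (1.50, 1.42), "2010s": (3.00, 2.67)}

for decade, (normal, hot) in yields.items():
    absolute_loss = normal - hot                   # tonnes/hectare lost in a hot season
    relative_loss = 100 * absolute_loss / normal   # loss as a % of that decade's own baseline
    print(f"{decade}: hot-season yield {hot:.2f} t/ha, "
          f"relative loss {relative_loss:.0f}% of the decade's average")

# Absolute framing: hot-season output nearly doubled across the decades.
print(f"Change in hot-season yield, 1970s -> 2010s: {100 * (2.67 / 1.42 - 1):.0f}%")  # ~88%

# Relative framing (the paper's): 11% > 5%, so sensitivity "worsened" even though
# absolute output in hot seasons rose dramatically.
```

Under the paper's decade-relative framing, only the second comparison counts; under the broader, intuitive definition above, the first one does.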
The New York Times column states that,
“On three-quarters of the impacts they studied, our vulnerability to warming hasn’t improved at all, meaning that a given climate event would be just as damaging as 50 years ago — perhaps more damaging.”
But this statement will mislead the vast majority of readers. The “more damaging” phrase is only true if it is interpreted as “more damaging relative to the constantly rising background level”; it is not “more damaging” than 50 years ago in any absolute sense. The same is true for the other outcomes they study, like death rates from floods and extreme temperatures, or economic damage. When decreases in vulnerability emerge gradually over decades, they are not allowed to count as adaptation and are thus neglected by this study.
Investigating whether relative sensitivities are changing over time is a fine exercise for an academic paper that explicitly acknowledges the long-term improvement in outcomes. But shifting the goalposts of what adaptation means in order to present an overly negative portrayal is something else. Further, the use of the word adaptation only serves to confuse, because a definition of adaptation that is intuitive to most people would simply measure whether we are becoming more or less vulnerable to given weather-related extremes compared to the past.
This common denigration of adaptation does us no service: it misinforms the public and policymakers and diverts attention away from repeatable success stories. Adaptation to the climate is a function of decisions we collectively make and should remain a priority for humanity. The improvements in adaptation over the past decades are repeatable, but only if we understand them as success stories rather than denigrating them to sell a narrative that our society is collapsing under the weight of climate change.