Faith-Based Resilience: What Are We Really Measuring?

How do we measure progress towards resilience? This single question may be the one I have heard most frequently over the past year, which I hope is a sign that many institutions are transitioning from resilience projects to resilience programs. I want to propose five widespread measures that probably do not reveal what we want them to tell us:

  • Efficiency. In many agricultural, energy, and industrial contexts, efficiency is king, especially water-use efficiency (e.g., crop-per-drop). But efficiency is not always resilience, and sometimes efficiency is even the opposite of resilience. Highly efficient systems can fail suddenly, and they often assume only one potential future. Example: The US decision not to stockpile N95 masks and ventilators before the Covid-19 pandemic was very efficient but not resilient, something I thought about a lot while looking for masks a year ago.

  • Access. Especially with the SDGs, we often measure poverty alleviation and health outcomes by looking at access, often measured as a snapshot. Access over time, such as during extreme weather events (when access matters most) or as maintenance and repairs come due, might reveal quite different patterns. Reliability, endurance, and service delivery gaps may be better resilience measures.

  • Volume of money spent. Perhaps the most widespread measure of institutional commitment to climate issues, statistics on climate finance or gross volumes of funding tell us nothing about the quality of those investments. What did we really pay for? What is the actual “ROI”: Resilience Of Investment?

  • Return periods. Cloaked in the solidity of tradition, engineering, and science, return periods are actually the product of layers of policy decisions and assumptions, often made decades ago. A 1:100 return period in the UK could vary by decades, statistically speaking, in another country when evaluating the same data. Worse, these calculations are almost always based on stationary climate assumptions. Without a climate change context, return periods could encourage over- or underplanning. We don’t know if we don’t ask.

  • Cost-Benefit Analysis (CBA). Perhaps the single most important piece of math in the modern global economy, traditional CBAs discount away both uncertainty and interventions that require expense now but deliver returns only later. The “dismal science” of economics appears quite optimistic about climate change! Resilience has a cost, surely, but we often place more emphasis on that cost than on the risks avoided or the benefits gained.
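
To make the return-period point concrete, here is a back-of-the-envelope sketch of the standard exceedance arithmetic. The function name and the 30-year horizon are illustrative choices, not drawn from any particular engineering standard:

```python
# A "1-in-100-year" event has an annual exceedance probability of 1/100.
# Over an n-year horizon, the chance of seeing it at least once is
# 1 - (1 - p)^n, assuming independent years and a stationary climate.
def chance_of_exceedance(return_period_years, horizon_years):
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** horizon_years

# Under the stationary assumption, a 1:100 event has roughly a 26% chance
# of occurring at least once during a 30-year mortgage:
print(round(chance_of_exceedance(100, 30), 2))  # 0.26

# If a changing climate makes the same event effectively a 1:50 event,
# the 30-year risk jumps to roughly 45%:
print(round(chance_of_exceedance(50, 30), 2))  # 0.45
```

The stationarity baked into that formula is exactly the assumption that a changing climate undermines.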

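And to illustrate how discounting shrinks delayed returns, a minimal sketch of exponential discounting. The $1,000,000 figure and the 7% rate are illustrative assumptions, though rates in that range have historically appeared in some government CBA guidance:

```python
# Net present value of a single benefit received t years from now,
# discounted at annual rate r: NPV = benefit / (1 + r)^t.
def present_value(benefit, rate, years):
    return benefit / (1.0 + rate) ** years

# $1,000,000 of avoided climate damage 50 years out, discounted at an
# illustrative 7% rate, is worth only about $34,000 today:
print(round(present_value(1_000_000, 0.07, 50)))
```

At that rate, almost any intervention whose payoff arrives decades from now is discounted into irrelevance, which is the "optimism" the bullet above describes.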
All of these numbers reflect policies, many of which were made in generations (and climates) past. Simply reporting longstanding, unconsidered numbers as progress towards climate adaptation should be called faith-based resilience. Do we know what’s behind those numbers? Are we reporting data to inform ghost policies? We shouldn’t get rid of the numbers we’ve been using, but we should place them in the right context. And we can supplement them with resilience data and resilient policies.

John Matthews

Corvallis, Oregon, USA
