Resiliency & Interdependencies Insights

California Resiliency Alliance briefs contain a Resiliency & Interdependencies Insights section, which provides a short piece to prompt further thinking around a topic or concept. Below is a collection of the insights shared as part of the briefs.


 

April 12, 2022 Insight

Economic Impact Dollars – Year of the Currency: As the current inflation has shown, $1 from one year is not necessarily equal to $1 from another year. Reports on estimated economic impacts from disasters often do not state the year of the currency valuation. That year might not even be the year of the report, as studies often take years to complete and/or may use economic data from earlier reports, which in turn may rely on still earlier data. Sometimes reports even combine data from a wide range of years. Treating a value as if it is current when it is actually older can hide significant differences. For example, a $10 million impact estimate from 2012 is, after adjusting for inflation, about $12.4 million in 2022 dollars. That is an additional 24%, and it does not account for any other changes to communities and infrastructures in the intervening years. When looking into economic impact estimates, you may need to dig through footnotes and citations to try to find information on the year(s) used.
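
To make the adjustment concrete, here is a minimal sketch in Python of converting a historical estimate into current dollars using a price index. The CPI figures below are illustrative assumptions chosen to roughly match the example above, not official statistics; a real analysis should pull the appropriate index series from the Bureau of Labor Statistics.

```python
# Convert a dollar amount from one year's currency into another's using
# consumer price index (CPI) values. The CPI numbers below are assumed
# for illustration only.

def adjust_for_inflation(amount: float, cpi_from: float, cpi_to: float) -> float:
    """Rescale an amount by the ratio of the two index values."""
    return amount * (cpi_to / cpi_from)

CPI_2012 = 229.6  # assumed annual-average CPI-U for 2012
CPI_2022 = 284.6  # assumed CPI-U for early 2022

estimate_2012 = 10_000_000  # $10 million impact estimate in 2012 dollars
estimate_2022 = adjust_for_inflation(estimate_2012, CPI_2012, CPI_2022)
print(f"${estimate_2022:,.0f} in 2022 dollars")  # roughly $12.4 million
```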

 


 

March 3, 2022 Insight

How We Think About Systems: A word often used today when discussing resilience and vulnerabilities is systems. Usually no distinction is made between simple, complicated, and complex systems, yet each functions and behaves somewhat differently. While complex systems have greater exposure to disruptions, they also have more adaptive capacity to respond to shocks. The challenge with complex systems is that they fail exponentially; there is no pre-determinable linear path to failure.


 

January 24, 2022 Insight

Future Risk: “Tomorrow’s risk is being built today. We must therefore move away from risk assessments that show risk at a single point in the present—which can quickly become outdated—and move instead towards risk assessments that can guide decision makers towards a resilient future.” Excerpt from The making of a riskier future: How our decisions are shaping future disaster risk


 

July 5, 2021 Insight

Our Mental Models & Three Biases that Impact Disaster Preparation: “One is, simply put, that there is a tendency to under-appreciate the future or under-consider the future, or future consequences. A second thing is that people are too quick to forget the past, or too slow to remember the negative events that have happened in the past. The third one is that if in doubt, what often happens is that people will follow the advice of other people who are no less prone to those sorts of mistakes than they are. … for a lot of hazards, people have really bad mental models of how things are going to unfold. … people grossly underestimated a couple of things — one of which is how long the after-effects of the storm were going to be. … The basic problem is how do you get people to have a better mental model, or to be able to better mentally simulate what’s going to happen to them so that they can line up what’s going to happen to them with how they’re going to prepare for it.” Excerpt from The Faulty ‘Mental Models’ That Lead to Poor Disaster Preparation, Knowledge@Wharton, July 7, 2014


 

May 31, 2021 Insight

Goal of Intelligence: “The goal of intelligence is to inform and narrow the range of uncertainty within which a decision must be made.” – quote attributed to former National Security Advisor Brent Scowcroft in Errors in Intelligence Analysis by Rebeka Melber.


 

May 3, 2021 Insight

Sometimes Knowledge is Not the Issue: Afghanistan, currently in the news a great deal, also provides a good case study on preparedness messaging. Afghanistan’s Guardians of Peace program was launched back in 2010. Its goal was to get locals to call and report when they saw Taliban fighters. Thousands of handouts were distributed, messages were played over speakers, and $2 million was spent on billboards and banners, but few people called. The program leaders thought the problem was people not knowing the number. Yet when they eventually talked with the locals, they found that the locals knew the number and wanted to call. The issue was that the Taliban disabled the cell towers before coming. This short example is from Nicholas Epley’s book Mindwise. It illustrates that a lack of knowledge is not necessarily the barrier to action. In this example, the people of Afghanistan had the knowledge, the cell phones, and even cell towers, but when they needed it, the infrastructure was not there to let them complete the action. As we look at motivating preparedness, if there has been active messaging yet slow or little action, what are the real on-the-ground conditions for the individuals we are trying to engage?


 

April 19, 2021 Insight

Importance of Context: The Oxford dictionary defines context as “the circumstances that form the setting for an event, statement, or idea, and in terms of which it can be fully understood and assessed.” In his book Mindwise: How We Understand What Others Think, Believe, Feel, and Want, Nicholas Epley shares how “misunderstanding the power of context can lead us to design ineffective solutions to important problems.” Best practices are often sought after in our field, yet one of the challenges of ‘best practices’ is that they tend to remove the solution from its context. When implemented elsewhere, the change in context can mean they do not produce the desired effects and may even create undesirable unintended consequences. When seeking to borrow ‘solutions,’ ask yourself how the contexts may differ.


 

April 5, 2021 Insight

Ambiguity & The Under the Radar Risk of Hybrid Threats: “An inherent characteristic of Hybrid Threats entails blurring traditional dichotomies and creating ambiguity. Individuals have an inherent preference for thinking in dichotomies (true/false, friend/enemy, etc.) and decision-making is largely based on such a way of thinking. Ambiguity on the other hand hinders decision-making at an individual and a collective level by creating confusion and distrust. Being ‘under the radar’ as much as possible is one of the characteristics of hybrid threat activities.” – from The landscape of hybrid threats: A conceptual model

 


 

March 8, 2021 Insight

Low Probability Events in Complex Systems: The interdependencies in complex systems mean that we must look not just at the probability of a single event happening, but at the probability of any event with the potential to cause significant disruption occurring. For example, if there are 20 different events that may have a significant impact on a system, each with a low probability of 1 in 50 (2%) of occurring over a given timespan, the overall probability of any one of those events happening is not 1 in 50, but closer to 1 in 3 within that timespan. The math works like this: the probability of any single event not happening is 49 out of 50, but you need all 20 events not to happen, which is (49/50)^20 ≈ 67%, roughly a 2 in 3 chance that none occur. That leaves about a 1 in 3 (33%) probability that at least one of the 20 events will happen within that timespan and potentially disrupt your system.
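
The arithmetic is easy to check. Here is a minimal sketch in Python using the numbers from the example above; note that, like the example, it assumes the events are independent of one another.

```python
# Probability that at least one of several independent low-probability
# events occurs within a given timespan.

n_events = 20      # events that could significantly disrupt the system
p_event = 1 / 50   # probability each one occurs in the timespan (2%)

p_none_occur = (1 - p_event) ** n_events   # all 20 events fail to happen
p_at_least_one = 1 - p_none_occur

print(f"P(none occur)   = {p_none_occur:.3f}")    # ~0.668, about 2 in 3
print(f"P(at least one) = {p_at_least_one:.3f}")  # ~0.332, about 1 in 3
```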


 

February 22, 2021 Insight

Lessons “Learned” and Transmission Loss: The phrase “lessons learned” is often used in reference to takeaways from exercises and events. There is also often discussion that lessons learned is the wrong phrase, because so often we see the same lessons being “learned” event after event, sometimes even by the same people. In Farnam Street’s article Stop Crashing Planes, there is a concept that captures this gap well: “Transmission loss, the gap between learning and execution.” How can we start tuning our processes to minimize transmission loss?


 

February 8, 2021 Insight

Feedback Loops: “Without feedback loops experience does not make us into better decision makers. Without those feedback loops, we don’t develop the skills to tell us whether adding a tablespoon of salt will yield a bland soup or a salty mess.” – from the book Meltdown: Why Systems Fail

What are the gaps in the formal and informal feedback loops you, your team, and your organization are using that may result in lessons falling through the cracks and never being “learned”? What are the near misses that are going unnoticed?


 

October 26, 2020 Insight

Hazard & Damage Assessments – Survivorship Bias: What are the stories not being told because no one is around to tell them (or able to tell them)? Survivorship bias is when we end up with a false representation of reality because we base our understanding of something on the experiences of those who “lived to tell their story.” This may skew our perceptions of risk towards the risks more people talk about, or unintentionally drown out the risks and challenges faced by those unable to share their story, perhaps because of communications breakdowns.

A short article about Survivorship Bias – What Sharks Can Teach Us About Survivorship Bias (Farnam Street, October 2020)


 

October 8, 2020 Insight

Shifting Baseline Syndrome: “In ecology there is a concept called Shifting Baseline Syndrome (SBS). It is the gradual change in the accepted norms for the condition of the natural environment due to a lack of human experience, memory and/or knowledge of its past condition.” The consequences of SBS “include an increased tolerance for progressive environmental degradation, changes in people’s expectations as to what is a desirable (worth protecting) state of the natural environment, and the establishment and use of inappropriate baselines for nature conservation, restoration and management.” In Emergency Management / Business Continuity we face similar shifts in our baseline of normal, especially in regards to hazards, threats, risks, and impacts. As with the ecology concept, the shifts in our baseline of normal also have consequences – hidden vulnerabilities creeping into complex systems, changes in expectations and how we define acceptable risk, and defining event outcomes based on past events. This shifting baseline is not always negative, but not recognizing its influence on decision making can create unintended side effects.

Source for information about Shifting Baseline Syndrome in ecology – Shifting baseline syndrome: causes, consequences and implications


 

September 2, 2020 Insight

Communication: “People using ambiguous mediums think they are communicating clearly because they know what they mean to say; receivers are unable to get this meaning accurately, but are certain that they have interpreted the message accurately.” Nicholas Epley in his book Mindwise: How We Understand What Others Think, Believe, Feel, and Want


 

August 5, 2020 Insight

Identifying Failure Points in Complex Systems: “If a system is complex, our understanding of how it works and what’s happening in it is less likely to be correct, and our mistakes are more likely to be combined with other errors in perplexing ways. And tight coupling makes the resulting failures harder to contain.”
~ Chris Clearfield and András Tilcsik, Meltdown: Why Our Systems Fail and What We Can Do About It


 

July 22, 2020 Insight

Value of Information: The value of information is best determined by what is called value-in-use — “a benefit the user obtains from the use and the effect of the use”. Value-in-use is subjective and specific to a user — so the value of information could be defined simply as contingent upon its usefulness to an individual.
~ Information Disasters and Disaster Information: Where Information Science Meets Emergency Management, by Tisha Slagle Pipes, University of North Texas


 

July 8, 2020 Insight

“A complex system will malfunction if any of its essential components fails. Even when the likelihood of failure in each component is slight, the probability of an overall failure can be high if many components are involved.” ~ Daniel Kahneman and Amos Tversky in the article Judgment Under Uncertainty: Heuristics and Biases
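
A quick sketch in Python of the effect Kahneman and Tversky describe. The component count and failure rate here are illustrative assumptions, and the failures are assumed to be independent.

```python
# A system that needs every one of its essential components to work.
# Each component is individually reliable, yet overall failure is likely.

n_components = 100
p_component_fails = 0.01  # each essential component fails 1% of the time

p_system_works = (1 - p_component_fails) ** n_components
print(f"P(system works) = {p_system_works:.2f}")      # ~0.37
print(f"P(system fails) = {1 - p_system_works:.2f}")  # ~0.63
```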


 

June 24, 2020 Insight

“Crises can spread globally, and in our modern world they can easily also impact business sectors that at first glance do not seem exposed.”
– Quote from an article in the International Journal of Disaster Risk Reduction (2016): Resilience in a Complex World – Avoiding Cross-Sector Collapse


 

June 10, 2020 Insight

“It is possible to improve the performance of each part or aspect of a system taken separately and simultaneously reduce the performance of the whole.” ~ Russell Ackoff, Professor Emeritus of Management Science at the Wharton School, University of Pennsylvania


 

May 27, 2020 Insight

“The key questions when looking at the resilience of our current societies are (i) how much flexibility do we have left, and (ii) how can we carry on from today.”

~ Quote from Resilience in a Complex World – Avoiding Cross-Sector Collapse, an article by Stephan Lechner, Jack Jacometti, Gordon McBean, and Neil Mitchison in the International Journal of Disaster Risk Reduction (2016)


 

May 13, 2020 Insight

“Were a perfect model possible, one that completely and accurately represented the dynamics and complexity of its object, then its very specificity would defeat the purpose of modeling. So models always make sacrifices of some kind. The question, though, is whether our models sacrifice inconsequential aspects of the worlds we wish to understand and control, or vital aspects.” Quote from the book Drift into Failure: From Hunting Broken Components to Understanding Complex Systems by Sidney Dekker


 

March 4, 2020 Insight

“In all human systems and most complex systems, the second layer of effects often dwarfs the first layer, yet often goes unconsidered. In other words, we must consider that effects have effects.” – Mental Models: The Best Way to Make Intelligent Decisions, Farnam Street


 

February 19, 2020 Insight

“There is a world of difference between simple and simplistic. The distinction lies in understanding what is essential and meaningful as opposed to what is not, then ruthlessly eliminating the latter, while putting emphasis and focus on the former.” Quote from the book Simple: Conquering the Crisis of Complexity by Alan Siegel and Irene Etzkorn


 

February 5, 2020 Insight

“News is about things that happen, not things that don’t happen. Since the human mind estimates probability by the ease with which it can recall examples, news readers will always perceive that they live in dangerous times.” A quote by Steven Pinker and Andrew Mack in the book Meltdown: Why Our Systems Fail and What We Can Do About It (by Chris Clearfield and Andras Tilcsik)

With the images and stories they share, the news and social media influence our perceptions of the risk from the 2019 Novel Coronavirus.


 

January 22, 2020 Insight

“Globalization and the digital revolution have led to more interdependencies, higher complexity and rapid acceleration of change in most sectors of our societies and economies. For this reason, the long-term resilience of a nation, a region or an industry cannot be considered any more as a confined matter that has little to do with the global environment.” Source: Resilience in a complex world – Avoiding cross-sector collapse, International Journal of Disaster Risk Reduction, 2016


 

January 8, 2020 Insight

“An economy’s ability to recover from a catastrophe is demonstrated by the speed and extent to which it reconstructs factories and homes, repairs damaged infrastructure, regains consumer and market confidence, and re-engages in business activities after an event.” Quote from the Cambridge Global Risk Index 2019 Executive Summary.


 

December 18, 2019 Insight

“We don’t look enough at the relationships of components. Day to day we focus too narrowly and short-term, so your problem-solving approach doesn’t consider the whole system.” Quote from Lessons We Don’t Learn: A Study of the Lessons of Disasters, Why We Repeat Them, and How We Can Learn Them, published in Homeland Security Affairs.


 

December 4, 2019 Insight

At a Grocery Supply Chain Resilience Project meeting, which the CRA attended, some of the conversation involved the classic debate between opening CPODs (Commodity Points of Distribution) and getting grocery stores open immediately post-disaster. During the discussion the concept was raised that in the end it is not about CPODs or grocery stores, but about commodity provisioning – enabling the flow and distribution of goods in communities via the most effective channels given the post-disaster conditions and resource constraints.