Resiliency & Interdependencies Insights
California Resiliency Alliance briefs contain a Resiliency & Interdependencies Insights section, which provides a short piece to prompt further thinking around a topic or concept. Below is a collection of the insights shared as part of the briefs.
February 22, 2021 Insight
The phrase “Lessons Learned” is often used in reference to takeaways from exercises and events. There is also often discussion that lessons learned is the wrong phrase, because so often we see the same lessons being “learned” event after event, sometimes even by the same people. In Farnam Street’s article Stop Crashing Planes, there is a concept that captures this gap well: “Transmission loss, the gap between learning and execution.” How can we start tuning our processes to minimize transmission loss?
February 8, 2021 Insight
“Without feedback loops experience does not make us into better decision makers. Without those feedback loops, we don’t develop the skills to tell us whether adding a tablespoon of salt will yield a bland soup or a salty mess.” – from the book Meltdown: Why Systems Fail
What are the gaps in the formal and informal feedback loops you, your team, and your organization are using that may result in lessons falling through the cracks and never being “learned”? What are the near misses that are going unnoticed?
October 26, 2020 Insight
Hazard & Damage Assessments – Survivorship Bias: What are the stories not being told because no one is around to tell them (or able to tell them)? Survivorship bias is when we end up with a false representation of reality because we base our understanding of something on the experiences of those who “lived to tell their story.” This may result in our perceptions of risk being skewed towards the risks more people talk about, or may unintentionally drown out the risks and challenges faced by those unable to share their story, perhaps because of communications breakdowns.
A short article about Survivorship Bias – What Sharks Can Teach Us About Survivorship Bias (Farnam Street, October 2020)
October 8, 2020 Insight
“In ecology there is a concept called Shifting Baseline Syndrome (SBS). It is the gradual change in the accepted norms for the condition of the natural environment due to a lack of human experience, memory and/or knowledge of its past condition. The consequences of SBS “include an increased tolerance for progressive environmental degradation, changes in people’s expectations as to what is a desirable (worth protecting) state of the natural environment, and the establishment and use of inappropriate baselines for nature conservation, restoration and management.” In Emergency Management / Business Continuity we face similar shifts in our baseline of normal, especially with regard to hazards, threats, risks, and impacts. As with the ecology concept, the shifts of our baseline of normal also have consequences – hidden vulnerabilities creeping into complex systems, changes in expectations and how we define acceptable risk, and defining event outcomes based on past events. This shifting baseline is not always negative, but not recognizing its influence on decision making can create unintended side effects.
Source for information about Shifting Baseline Syndrome in ecology – Shifting baseline syndrome: causes, consequences and implications
September 2, 2020 Insight
Communication – “People using ambiguous mediums think they are communicating clearly because they know what they mean to say; receivers are unable to get this meaning accurately, but are certain that they have interpreted the message accurately.” – Nicholas Epley in his book Mindwise: How We Understand What Others Think, Believe, Feel, and Want
August 5, 2020 Insight
Identifying failure points in complex systems – “If a system is complex, our understanding of how it works and what’s happening in it is less likely to be correct, and our mistakes are more likely to be combined with other errors in perplexing ways. And tight coupling makes the resulting failures harder to contain.”
~ Chris Clearfield and András Tilcsik; Meltdown: Why Our Systems Fail and What We Can Do About It
July 22, 2020 Insight
The value of information is best determined by what is called value-in-use — “a benefit the user obtains from the use and the effect of the use”. Value-in-use is subjective and specific to a user — so the value of information could be defined simply as contingent upon its usefulness to an individual.
~ Information Disasters and Disaster Information: Where Information Science Meets Emergency Management, by Tisha Slagle Pipes, University of North Texas
July 8, 2020 Insight
“A complex system will malfunction if any of its essential components fails. Even when the likelihood of failure in each component is slight, the probability of an overall failure can be high if many components are involved.” ~ Daniel Kahneman and Amos Tversky in the article Judgment Under Uncertainty: Heuristics and Biases
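The compounding effect Kahneman and Tversky describe can be made concrete with a small illustrative calculation (not from the article itself): if a system depends on n independent components, each failing with probability p, the chance that at least one fails is 1 − (1 − p)^n.

```python
def overall_failure_probability(p: float, n: int) -> float:
    """Probability that at least one of n independent components fails,
    given each component fails with probability p."""
    return 1 - (1 - p) ** n

# Even a 1% per-component failure rate compounds quickly as
# the number of essential components grows:
for n in (1, 10, 100):
    print(n, round(overall_failure_probability(0.01, n), 3))
```

With p = 1%, a single component fails 1% of the time, but a system of 100 such components fails roughly 63% of the time, which is the intuition behind the quote. Real components are rarely fully independent, so this sketch is a simplification.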
June 24, 2020 Insight
“Crises can spread globally, and in our modern world they can easily also impact business sectors that at first glance do not seem exposed.”
– Quote from article in the International Journal of Disaster Risk Reduction (2016): Resilience in a Complex World – Avoiding Cross-Sector Collapse
June 10, 2020 Insight
“It is possible to improve the performance of each part or aspect of a system taken separately and simultaneously reduce the performance of the whole.” ~ Russell Ackoff, Professor Emeritus of Management Science at the Wharton School, University of Pennsylvania
May 27, 2020 Insight
“The key questions when looking at the resilience of our current societies are (i) how much flexibility do we have left, and (ii) how can we carry on from today.”
~ Quote from Resilience in a Complex World – Avoiding Cross-Sector Collapse, an article by Stephan Lechner, Jack Jacometti, Gordon McBean, and Neil Mitchison in the International Journal of Disaster Risk Reduction (2016)
May 13, 2020 Insight
“Were a perfect model possible, one that completely and accurately represented the dynamics and complexity of its object, then its very specificity would defeat the purpose of modeling. So models always make sacrifices of some kind. The question, though, is whether our models sacrifice inconsequential aspects of the worlds we wish to understand and control, or vital aspects.” Quote from the book – Drift into Failure: From Hunting Broken Components to Understanding Complex Systems by Sidney Dekker
March 4, 2020 Insight
“In all human systems and most complex systems, the second layer of effects often dwarfs the first layer, yet often goes unconsidered. In other words, we must consider that effects have effects.” – Mental Models: The Best Way to Make Intelligent Decisions, Farnam Street
February 19, 2020 Insight
“There is a world of difference between simple and simplistic. The distinction lies in understanding what is essential and meaningful as opposed to what is not, then ruthlessly eliminating the latter, while putting emphasis and focus on the former.” Quote from the book Simple: Conquering the Crisis of Complexity by Alan Siegel and Irene Etzkorn
February 5, 2020 Insight
“News is about things that happen, not things that don’t happen. Since the human mind estimates probability by the ease with which it can recall examples, news readers will always perceive that they live in dangerous times.” A quote by Steven Pinker and Andrew Mack, as quoted in the book Meltdown: Why Our Systems Fail and What We Can Do About It (by Chris Clearfield and András Tilcsik)
The news and social media are influencing our perceptions of the risk of the 2019 Novel Coronavirus through the images and stories being shared.
January 22, 2020 Insight
“Globalization and the digital revolution have led to more interdependencies, higher complexity and rapid acceleration of change in most sectors of our societies and economies. For this reason, the long-term resilience of a nation, a region or an industry cannot be considered any more as a confined matter that has little to do with the global environment.” Source: Resilience in a complex world – Avoiding cross-sector collapse, International Journal of Disaster Risk Reduction, 2016
January 8, 2020 Insight
“An economy’s ability to recover from a catastrophe is demonstrated by the speed and extent to which it reconstructs factories and homes, repairs damaged infrastructure, regains consumer and market confidence, and re-engages in business activities after an event.” Quote from the Cambridge Global Risk Index 2019 Executive Summary.
December 18, 2019 Insight
“We don’t look enough at the relationships of components. Day to day we focus too narrowly and short-term, so your problem-solving approach doesn’t consider the whole system.” Quote from Lessons We Don’t Learn: A Study of the Lessons of Disasters, Why We Repeat Them, and How We Can Learn Them, published in Homeland Security Affairs.
December 4, 2019 Insight
At a Grocery Supply Chain Resilience Project meeting, which the CRA attended, some of the conversation involved the classic debate between opening CPODs (Commodity Points of Distribution) and getting grocery stores open immediately post-disaster. During the discussion the point was raised that in the end it is not about CPODs or grocery stores, but about commodity provisioning – enabling the flow and distribution of goods in communities via the most effective channels given the post-disaster conditions and resource constraints.