Build Resilience

Why more catastrophic failures will occur

Catastrophic failures are infrequent events with huge severity. Despite all our efforts to strengthen Robustness, we can expect more disasters to occur as the level of complexity in the world increases every day. This is natural science at play.

This graph is from the 2008 paper "Managing in a Pareto World Calls for New Thinking." The authors state that complexity science focuses on power laws, long tails, extreme events, fractals, and other Pareto-related effects. These reflect nonlinear interdependencies among agents, such as feedback loops, pattern formation, and the absence of central control. Power laws are ubiquitous in the social and organizational worlds.

The Order system is home to bell-shaped Gaussian curves (high probability, low impact) and the application of classical probability statistics. With a large enough sample of human heights, we can fit a normal distribution curve. This allows experts in the Complicated domain to predict how tall the next person is likely to be.
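As a minimal sketch of that kind of prediction (the height sample below is made up for illustration), fitting a normal distribution to measured heights lets an expert bound how tall the next person is likely to be:

```python
# Toy sketch: fit a normal distribution to a (hypothetical) height sample and
# state a ~95% prediction interval for the next observation.
import statistics

heights_cm = [162.0, 175.5, 168.2, 181.0, 170.4, 158.8, 177.1, 165.9]

mu = statistics.mean(heights_cm)
sigma = statistics.stdev(heights_cm)

# About 95% of a normal distribution lies within 1.96 standard deviations
# of the mean, which is what lets the expert bound the next observation.
low, high = mu - 1.96 * sigma, mu + 1.96 * sigma
print(f"mean = {mu:.1f} cm; next person likely between {low:.1f} and {high:.1f} cm")
```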

In the Complex domain, events follow a Pareto distribution (low probability, high impact). The curve has a long or fat tail. In terms of risk, such events are synonymous with Taleb's Black Swans.

[Figure: Gaussian and Pareto distribution curves. Copyright © 2016 Bangor University or Cognitive Edge Pte Ltd.]

Overlaying the two distribution curves (Pareto in red, Gaussian in blue) highlights what is different at the extreme end: the Pareto curve sits higher in the tail than the Gaussian. Extreme Gaussian events are called outliers.
Common statistical practice focuses on events within a confidence interval around the mean, so we are conditioned to ignore outliers. This is why many people have a difficult time believing that more catastrophic failures will occur. It is no longer a probable world of deductive, cause-and-effect reasoning, nor a possible world of predictive, inductive reasoning; it is a plausible world that requires abductive reasoning.
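To make the tail difference concrete, here is a minimal sketch (with invented parameters, not the paper's data) that samples from a Gaussian and a Pareto distribution and compares how often each exceeds a far-out threshold:

```python
# Toy comparison of tail behaviour; all distribution parameters are illustrative.
import random

random.seed(42)
N = 100_000
gaussian = [random.gauss(1.0, 1.0) for _ in range(N)]
pareto = [random.paretovariate(1.5) for _ in range(N)]  # heavy-tailed

threshold = 10.0
g_tail = sum(x > threshold for x in gaussian) / N
p_tail = sum(x > threshold for x in pareto) / N
print(f"P(X > {threshold}): Gaussian ~ {g_tail:.5f}, Pareto ~ {p_tail:.5f}")
# The Gaussian essentially never exceeds the threshold in this sample;
# the Pareto does roughly 3% of the time.
```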

When dealing with complex systems, not only do we recognize that failures will indeed occur, we accept failure as a means to learn. Eric Hollnagel, David Woods, and Nancy Leveson offer a different view:

“Failure is the result of the adaptations necessary to cope with the complexity of the real world, rather than a breakdown or malfunction.”

Emergency preparedness compounds the problem of dealing with disasters. How does an organization ready itself for an earthquake when the Richter-scale magnitude is unknowable in advance? How many different plans should be prepared?

Three ways to build Resilience

Resilience is taking a hit and bouncing back to where you started. It can also mean bouncing to a better spot than the original. Best of all is anticipating that a hit may be coming and acting to prevent it from occurring.

Path A (red) is the one most often followed: a quick restoration back to the Obvious domain. However, path B (green) is also a viable option; it represents a serendipitous opportunity that has unexpectedly emerged.

Path B means moving to the Complex domain to exploit the opportunity rapidly. We probe the system with safe-to-fail experiments and observe behaviour patterns: what attracts people and what they dislike. Once a new pattern-based solution is found, we move into the Complicated domain, where we produce fail-safe documentation of the process, system, and structure changes.
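A hedged, toy rendering of that probe-sense-respond loop (the probe names and feedback signals below are invented) shows the basic mechanic: run several cheap experiments in parallel, sense the emerging pattern, then amplify what works and dampen what does not.

```python
# Toy sketch of safe-to-fail probing; probe names and signals are invented.
import random

random.seed(7)
probes = {"new-layout": 0.0, "self-service": 0.0, "buddy-system": 0.0}

for name in probes:
    # "Sense": collect a small, low-cost feedback signal per probe (simulated).
    feedback = [random.uniform(-1, 1) for _ in range(20)]
    probes[name] = sum(feedback) / len(feedback)

for name, signal in probes.items():
    # "Respond": amplify probes showing a positive pattern, dampen the rest.
    action = "amplify" if signal > 0 else "dampen"
    print(f"{name}: signal={signal:+.2f} -> {action}")
```

Because each probe is safe to fail, a dampened experiment costs little; the point is the pattern that emerges across probes, not any single result.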

Path C adds one more capability, called Anticipatory Awareness. This is early detection, but very different from Robustness barriers. Robustness can deal only with known hazards and identified risks in the Order system; Resilience deals with unknown unknowns, unknowables, and unimaginables in the Chaotic and Complex domains.

In the Age of Cognitive Complexity, the brain is no longer viewed as a logical processor, a computer designed for storing and retrieving information; that view is a myth. Cognitive science research has revealed that the brain is built for pattern recognition and first-fit matching. These are our survival instincts. The brain knows how to quickly recognize the emergence of danger. To find an appropriate action (e.g., fight or flight), it runs through a "rolodex" of experiences and stops at the first match between the recognized danger and "what I did in the past." That experience could be the most recent one, or a heuristic, a habit formed from knowledge and practice over the years.
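A minimal sketch of first-fit matching as an analogy (the cues, actions, and function below are illustrative, not a cognitive model): scan the experiences newest-first and act on the first one that matches, rather than weighing every option.

```python
# Toy sketch of first-fit matching; the "rolodex" data and matching rule are
# invented for illustration, not a model of actual cognition.
experiences = [
    ("smoke", "evacuate"),        # most recent experience first
    ("loud bang", "take cover"),
    ("smoke", "open windows"),    # older match; never reached for "smoke"
]

def first_fit(danger):
    """Return the action from the FIRST matching experience, newest first."""
    for cue, action in experiences:
        if cue == danger:
            return action         # stop at the first match
    return None                   # no match: fall back to deliberate analysis

print(first_fit("smoke"))  # -> evacuate (the most recent match wins)
```

Note that the older "open windows" response is never considered once "evacuate" matches, which is exactly why first-fit is fast but not guaranteed to be best-fit.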

The upside is that humans are much faster at anticipating impending danger than machines, so a complete resilience strategy needs to engage human intelligence as sensors to prevent failures from happening. The downside is that humans are not perfect: we are fallible and make mistakes, we get physically and mentally tired, and we are easily distracted, which degrades our alertness. The strategy for building resilience is therefore not to automate humans out of the sensing loop, but to augment human intelligence with technology that compensates for human frailties.
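As a hedged sketch of that augmentation idea (all names, thresholds, and readings are invented), an alerting scheme can combine human reports with a simple automated anomaly check, so that neither human fatigue nor machine blind spots silence the warning:

```python
# Toy sketch: raise an alert when EITHER a human report OR an automated
# anomaly check fires. Thresholds and readings are invented for illustration.
def machine_flag(readings: list[float], threshold: float = 3.0) -> bool:
    """Crude anomaly check: flag when the latest reading jumps well above the mean."""
    baseline = sum(readings[:-1]) / len(readings[:-1])
    return readings[-1] > baseline + threshold

def should_alert(human_report: bool, readings: list[float]) -> bool:
    # Humans stay in the sensing loop; the machine backstops their frailties.
    return human_report or machine_flag(readings)

print(should_alert(human_report=False, readings=[1.0, 1.2, 0.9, 5.6]))  # True: machine catches it
print(should_alert(human_report=True,  readings=[1.0, 1.1, 1.0, 1.0]))  # True: human catches it
```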

Those wishing for more information can watch this YouTube video produced by Cognitive Edge co-founder Dave Snowden.