(P5): Some Strategic Rules of Thumb

A sign on a pole that says, "No parking; monkeys poop on windshield."
Words to Live By

These are not procedures or any form of “7 steps to success”. They are guiding notions that can open up possibilities for the situation that you and your personal network face.

What to do when you can’t reduce the uncertainty effectively:

  • The Indirect Approach: Liddell Hart developed the idea of the “Indirect Approach” in the West, though the idea has always been reflected in effective strategies. Sun Tzu is the best-known Eastern example of the same thinking, and Hart read Sun Tzu as he developed his own framework. The idea is that as you implement your plan, you hide its actual target. Geographically, you might choose an approach path that runs between a number of potentially useful targets, moving toward your actual choice only at the last minute. Conceptually, you might advocate for small pieces of your advocacy strategy without revealing how you would use a victory in those small pieces. The purpose of this approach is to force your opponent to commit scarce resources to defending targets that won’t actually turn out to be targets, and to increase the uncertainty in your opponent’s planning.
  • Avoid Irreversible Commitments: In the Cynefin model, this is done through “safe-to-fail experiments”. You try out ideas on a small scale to learn more about how they work in the current reality. Then you ask yourself what you can do to make the successful ones more common.
  • Build Reserves: Reserves are a kind of redundancy you build to make it easier to turn on a dime when your view of the future turns out to be inaccurate. Reserves are not just cash. They include trust, cross-skills training, mutual support, and a host of other things described in various posts. Reserves are a collection of different resources that partially overlap, not a big pile of the same stuff.
  • Weak and Strong Links in Your Network: In network theory, it is useful to distinguish between strong and weak links when thinking about change. The strong links drive whatever it is that the network is doing. The weak links buffer the volatility and unpredictability of the interactions between those strong links so that they don’t cause the network to run away uncontrollably. People who are deeply focused on their own personal power see weak links as obstacles to increasing that power (however they define it) and seek to eliminate them. Interestingly, the two communities that deliberately and successfully eliminate weak links, thus exposing themselves to volatility, are homeless and destitute people, and the 1%.


(P5): Cynefin: A Better Way to Think about Change

Updated Cynefin Model diagram: four quadrants of kinds of systems. OBVIOUS: tightly constrained; no degrees of freedom; sense-categorize-respond; best practice. COMPLICATED: governing constraints; tightly coupled; sense-analyze-respond; good practice. COMPLEX: enabling constraints; loosely coupled; probe-sense-respond; emergent practice. CHAOTIC: lacking constraint; decoupled; act-sense-respond; novel practice.
Updated Cynefin Model

The graphic above is a draft of the updated formulation of the Cynefin framework, a prelude to Dave Snowden’s forthcoming book about the entire framework.

Cynefin is the best multi-purpose framework for thinking about systems change I have found. It was created by, and continues to evolve under, Dave Snowden, a Welsh organizational development (OD) thinker who has never let the necessities of making a living interfere with seeing the truth of what he and his community are creating. If you take nothing else away from this post, immerse yourself in Dave’s thinking as an antidote to all the useless ways of thinking I have listed over my posts.

Cynefin is a framework for reimagining the interaction between us and the systems around us. The categories of Cynefin are ways of thinking about those interactions. This is not an abstract notion of interacting with a system; it is a way of dealing with the demands that the interaction makes on you. Our interactions with these systems can be framed as:

1. Obvious: These system interactions are simple enough that you can use an automatic practice. Think of an autogenerated monitoring report that must be submitted every quarter and only requires a current signature from the person filling the appropriate role.

2. Complicated: A 777 airplane is complicated because it has a lot of parts and interactions, but the parts don’t change very much because of the interactions. The parts might wear a little or gradually become obsolete, but they don’t change fast just because they interact with one another.

3. Complex: These system interactions are like the ones referred to in many other parts of my posts on our change work in complex adaptive systems. In these systems, the parts do change all the time because of their interactions with one another. This means that the future of these systems can’t be clearly predicted, and our interactions with them must be viewed as experimental (called safe-to-fail experiments in Cynefin). We need to learn lessons from these experiments in order to manage our interactions with the system. Over time, it is possible, if difficult, to move some complexity into the complicated arena.

4. Chaotic: When unpredictable events occur outside the system, the system with which we are interacting can enter a phase of chaos, when its behavior is entirely unpredictable. We have to try interactions and immediately learn from them to manage chaos. Fortunately, real chaos doesn’t last long.
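The four domains above amount to a small lookup table: each domain pairs a kind of constraint with a response sequence and a kind of practice. A minimal sketch of that table in Python (the dictionary keys and field names are my own illustration, not part of the Cynefin framework itself):

```python
# Illustrative sketch only: the four Cynefin domains and their response
# patterns, as summarized in the diagram above. Names are my own choices.

CYNEFIN = {
    "obvious":     {"constraints": "tightly constrained",
                    "response": ["sense", "categorize", "respond"],
                    "practice": "best practice"},
    "complicated": {"constraints": "governing constraints",
                    "response": ["sense", "analyze", "respond"],
                    "practice": "good practice"},
    "complex":     {"constraints": "enabling constraints",
                    "response": ["probe", "sense", "respond"],
                    "practice": "emergent practice"},
    "chaotic":     {"constraints": "lacking constraint",
                    "response": ["act", "sense", "respond"],
                    "practice": "novel practice"},
}

def response_pattern(domain: str) -> str:
    """Return the response sequence for a domain, e.g. 'probe-sense-respond'."""
    return "-".join(CYNEFIN[domain]["response"])

print(response_pattern("complex"))  # probe-sense-respond
```

The point of laying it out this way is that the response sequence differs by domain: in the complex domain you probe before you sense, because analysis alone cannot tell you what the system will do.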

The deepest lesson of Cynefin for managing change is that those complex systems are very dependent on the evolving context within which we interact with them. If we don’t respect that reality, we will always get unanticipated and generally unpleasant consequences.

(P5): Within the Shell of the Old

A multi-colored (blue, brown, and white) spiral sea shell.
An Old Shell

One of the oldest ways of thinking about large scale change in a complex adaptive system is, “Build the new within the shell of the old”. It recognizes the need for change without requiring us to embrace either of the macro revolutionary strategies (change through total governance or breaking down governance and starting from scratch). It also requires us to genuinely struggle with how we use the current system as we move forward.

But building the new within the shell of the old is tough if we are to avoid the horrors of building from scratch or of trying to force the existing governing levers to embrace radically new values. We have to balance what we are trying to create, with all its fits and starts, against the necessity of our continuing, if evolving, dependence on the current system for our everyday existence. Nowhere is this necessity clearer than it is for the disability community.

The current reality will undermine our ability to change largely through forcing us to repeat habits acquired over long periods of time.

To avoid these barriers, we need to have social networks committed to change, and we need to embrace a new set of assumptions about how change is possible.

(P4): Preparing the Counterstroke

A picture of the Trinity Atomic Explosion, very early after initiation of the explosion when it was still a complex bubble.

Success in preparing a counterstroke requires a great deal:

  • Giving up the notion that you can restore what was.
  • Carving out some hidden space in which to create the counterstroke. Why it is “hidden” doesn’t matter. It can be (and often is) that the insurgent simply doesn’t believe that you are capable of preparing a counterstroke.
  • Creating a genuinely new and disruptive complex system for your counterstroke. As an ongoing real-world example, women are in the ascendance everywhere in the world, regardless of political disagreements across communities of women, because they are coherently creating something new together. Men, on the other hand, are in decline, because they are trying to defend what they have. All else being equal, creating something new will outlast and eventually replace the current reality, no matter how much effort, even successful effort, is put into the defense.
  • Never allowing fear to trigger a premature counterstroke.
  • Waiting for the maximum feasible disintegration of the insurgent before launching the counterstroke.

Note how foundational patience is in the success of all this.

(P4): Problems with Assessing Future Risk and Uncertainty

A swampy marsh with a fog making it hard to see any distance.

  • There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know. -Donald Rumsfeld

Human beings are notoriously poor at estimating risk in the real world. We are bad at it even when we are not affected by bias. And we are all affected by bias:

  • Confusing Uncertainty and Risk: We often assume that the risk of uncertain events can be calculated or intuited. That isn’t true in most of the real world. And, to the extent that we use risk calculation to make decisions when dealing with uncertainty, we will make bad decisions. Think Fukushima.
  • Eliminating versus Mitigating Risk: Especially in nonprofit and public organizations, there is a belief that by eliminating the possibility of risk through an HR policy or some threshold limit, we have actually protected ourselves or the organization. For-profit organizations tend to look for ways to mitigate rather than eliminate risk, since they have a better appreciation of how difficult a challenge any uncertainty actually is.
  • Bias in Driven Behavior: Assessing risk and uncertainty while the person or organization is engaged in driven behavior (sex, drugs, and rock and roll for people; hyper-focus in organizations or cults as a way to deny uncertainty; fear of liability or some other unseen threat) is guaranteed to give you a false sense of the actual uncertainty.
  • Prospect Theory: This names the bias that makes us risk-seeking when facing losses, increasing commitment to an already losing strategy. The examples are endless.
  • Treating the Nonlinear World as Linear: Examples are the belief in single causes, the belief that effort is proportional to outcome, and the belief that starting points close to one another should lead to closely linked outcomes. There are many more.
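The last of these beliefs, that nearby starting points must lead to nearby outcomes, fails in even the simplest nonlinear systems. A toy demonstration using the logistic map (my choice of example, not from the post): two starting points that differ by one part in a million end up bearing no resemblance to each other.

```python
# Sensitive dependence on initial conditions in a tiny nonlinear system:
# the logistic map x -> r*x*(1 - x) with r = 4 (a standard chaotic example).

def logistic(x: float, r: float = 4.0) -> float:
    return r * x * (1.0 - x)

a, b = 0.300000, 0.300001   # almost identical starting points
max_sep = 0.0
for step in range(50):
    a, b = logistic(a), logistic(b)
    max_sep = max(max_sep, abs(a - b))

# The tiny initial difference is amplified until the two trajectories
# are completely decoupled from each other.
print(f"maximum separation over 50 steps: {max_sep:.3f}")
```

In a linear system, that one-in-a-million difference would stay one-in-a-million forever; here it grows until the two futures have nothing to do with one another, which is exactly why prediction fails in complex systems.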

Knowing that risk and uncertainty are not the same and that we tend to bias our estimates of them is not enough to prevent the problems mentioned above. We have to actually build our ability to overcome the bias and reflect on our inability to estimate uncertainty in our strategy and our planning.

(P3): Compounding Error

A fully detailed fossil of a small crocodile-like dinosaur in a dark rock.
Death Of The Dinosaurs

There is a deep similarity between the way we have used fossil fuels and debt to drive our political and financial economies, respectively. And the results of this use are also very similar:

  • There are unavoidable limits to both. These limits are not just an amount (quantity in fuels and bubble size in debt), but that both become increasingly difficult to extract or expand as their use increases.
  • The habit of their use also makes it increasingly difficult to change or reduce the level of use. This is a species of addiction. Just as an addict community organizes itself around the procurement and use of some drug, our habits organize us around the use of fossil fuels and debt.
  • Their use always buys short-term success while long-term consequences are ignored.
  • When the consequences become too great to ignore, very significant costs are required to alleviate these accumulated consequences.
  • At the same time, the costs of dealing with the consequences of short-term, non-strategic use further undermine the original advantage of their use.
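The first two bullets can be put in toy arithmetic: if each additional unit of a resource costs more to extract (or each additional dollar of debt costs more to service) than the last, the net benefit of the next unit eventually goes negative. A hypothetical sketch, with all numbers invented purely for illustration:

```python
# Toy model of a resource whose marginal cost compounds with cumulative use.
# Every number here is invented for illustration; the point is the shape
# of the curve, not the values.

def net_benefit(units_already_used: int, benefit_per_unit: float = 10.0,
                base_cost: float = 1.0, cost_growth: float = 1.15) -> float:
    """Benefit minus cost for the next unit; cost compounds with prior use."""
    return benefit_per_unit - base_cost * (cost_growth ** units_already_used)

# Early units are hugely profitable; later units cost more than they return.
for used in (0, 5, 10, 15, 20):
    print(used, round(net_benefit(used), 2))
```

Because the cost term compounds while the benefit stays flat, the crossover is not gradual in feel: the habit keeps working right up until it abruptly doesn’t, which is the pattern the bullets above describe.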

This cycle of short-term planning in the use of resources (fossil fuels and debt) and the lack of attention to consequences is fractal. That is, the errors occur systemically at every level. It is a characteristic of our complex adaptive social system, and it has as much to do with the momentum of our ever-increasing loss of control over our future as anything else that we believe to be wrong in our society.

We can’t use the way we created and maintain the degradation of our society to change that degradation in anything other than small ways (that is, in ways that don’t change the strategic degradation but may make it worse).  These small improvements will wash out in the same way that ripples from the splash of a small stone wash out in a river.

There is no logic model to resolve the forces driving our complex adaptive disintegrating social system. We need a strategy, not more short-term operational planning.

(P1): Driven behavior always biases risk assessment

A movie stunt of an actor leaping from one car to another in midair. There is a tank crushing the car he is leaping from.
What if this wasn’t a stunt?


Some adolescent males reliably do very dangerous and stupid things that violate common sense, and a meaningful number die as a result. They do so because they are driven to show off. They don’t think about getting hurt or dying; they don’t assess the cost of their death to those who care about them; they don’t pay any attention to who else will be hurt. They are driven to take risks and pay no attention to the actual uncertainty of injury. In the decision-making moment, they have no common sense.

Lest you think this is only an issue for teenage boys, let me remind you that all driven behavior, whether toward or away from something (drugs, sex, rock and roll, mountain climbing, extreme sports, casual investment, gambling, and all the subsidiary behaviors that go along with driven behavior, as well as all fear-driven choices), reliably produces either an ignoring of risk and uncertainty or a severe under- or overestimation of it.

This lack of respect for, and acceptance of, the real uncertainty in life becomes especially damaging when the decision makers have no “skin in the game”, which is to say when someone else pays the price for their decisions. There are so many examples of this, and so many new ones surface every day, that I won’t bother to list them.