(P2): Surprise and Weak Signals

A goldfish with a look of surprise on its face.

  • Learning By Surprise
  • What is the Adjacent Possible?
  • The Hindsight Bias
  • “I wanted a perfect ending. Now I’ve learned, the hard way, that some poems don’t rhyme, and some stories don’t have a clear beginning, middle, and end. Life is about not knowing, having to change, taking the moment and making the best of it, without knowing what’s going to happen next. Delicious Ambiguity.”
    ― Gilda Radner

Instead of glossing over surprises as failures of understanding, we should focus on them until we have grasped their novelty and how that novelty needs to change our view of reality. We need to avoid abstracting from surprise to make it only another example of what we already know to be true.

Novel occurrences are novel for us, but they are also typically some “next step” from that with which we are already familiar. They are often called the “adjacent possible” because once they have occurred, it is fairly easy to see how they came about. This is true even if no one anticipates them. It is important to remember that in a Complex Adaptive System, there are always many adjacent possibilities for the future.

There is another common problem that results from rationalizing surprises. We look back on the surprise and try to figure out who accurately anticipated it. We think this will improve our prediction capability in the future.

Looking back does improve our understanding of the current situation. It doesn’t improve our ability to predict a genuinely novel future. If we examine what people thought about the future before the novel occurrence, we will find a very large number of ideas about what might happen. The particular idea that turned out to be accurate carried no more information about its own likelihood than many of the others. The novelty tells us something useful about the current state of the CAS we are in and where it might evolve in the short term. It doesn’t improve our ability to foresee the genuinely new.


Part Two: Detecting and Using Weak Signals (Cynefin)

A Specimen Cynefin Diagram (not the newest, not the oldest). Simple / Obvious: The simple/obvious domain represents the ‘known knowns’. This means that there are rules in place (or best practice), the situation is stable, and the relationship between cause and effect is clear. Complicated: The complicated domain consists of the ‘known unknowns’. The relationship between cause and effect requires analysis or expertise; there is a range of right answers. The framework recommends ‘sense–analyze–respond’: assess the facts, analyze, and apply the appropriate good operating practice. Complex: The complex domain represents the ‘unknown unknowns’. Cause and effect can only be deduced in retrospect, and there are no right answers. ‘Instructive patterns ... can emerge,’ write Snowden and Boone, ‘if the leader conducts experiments that are safe to fail.’ Cynefin calls this process ‘probe–sense–respond’. Chaotic: In the chaotic domain, cause and effect are unclear. Events in this domain are ‘too confusing to wait for a knowledge-based response’. Managers ‘act–sense–respond’: act to establish order, sense where stability lies, and respond to turn the chaotic into the complex. Disorder / Confusion: The dark disorder domain in the centre represents situations where there is no clarity about which of the other domains apply.
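
As a purely illustrative aside (my sketch, not part of Cynefin or the original post), the domain-to-response pairings described above can be written down as a simple lookup table. The ‘sense–categorize–respond’ entry for the obvious domain comes from Snowden and Boone’s standard presentation of the framework, since the diagram text above mentions only best practice for that domain.

```python
# A minimal sketch of the Cynefin domain-to-response mapping summarized above.
# The domain names and recommended action sequences follow the framework;
# the code structure itself is just an illustration.
CYNEFIN_RESPONSES = {
    "obvious":     ("sense", "categorize", "respond"),  # apply best practice
    "complicated": ("sense", "analyze", "respond"),     # bring in expertise, apply good practice
    "complex":     ("probe", "sense", "respond"),       # run safe-to-fail experiments
    "chaotic":     ("act", "sense", "respond"),         # act first to establish some order
    "disorder":    (),  # first work out which of the other domains actually applies
}

def recommended_response(domain: str) -> tuple:
    """Return the action sequence Cynefin recommends for a given domain."""
    return CYNEFIN_RESPONSES[domain.lower()]

print(recommended_response("complex"))  # ('probe', 'sense', 'respond')
```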

Cynefin is a body of knowledge and tools to assist in changing CAS, among other things. As an enterprise intervention, Cynefin has also developed a “narrative access and analysis tool” called SenseMaker™. SenseMaker allows intervenors to access participants’ raw views as short narratives, without groupthink or homogenization. It is this ability that allows for the detection of weak signals.

Because SenseMaker is available as an app, its users can engage huge numbers of people in a very short time. The example that had the most impact on my understanding of its capacities was an effort to work around the unwillingness of local citizens in SE Asia to say what they actually thought to US civil and military personnel.

The system was used to ask children to relate a story from their grandparents about the most important lesson the grandparents had learned in their lives. The children then submitted the stories using the SenseMaker app. The project gathered 50,000 stories in four days. There is simply nothing else that supports authentic narrative from real participants at the speed of SenseMaker.

Unfortunately for our community, SenseMaker is an enterprise tool and is priced that way. I have been exploring ways we might be able to use this system in our community, but I am some distance from a genuine solution.

That doesn’t mean that we can’t make use of the idea, if we can come up with ways to stay faithful to what SenseMaker does well: easy access to real, raw narratives from participants.

I’ll discuss some ideas for using this general framework to get meaningful narratives in our community in later posts. For now, I hope you can see the importance of weak signals in the development and use of our FutureStrategy.


(P1): Why Are Weak Signals Ignored?

A slide: Weak Signals Detection with Social Media: No Surprise at All? Theory: In contemporary future studies, the term weak signals refers to an observed anomaly in the known path or transformation that surprises us somehow (Kuosa, 2014, p. 22). Our Experiences: Are We Alone? Possible Explanations: #1 Noisy social media and other limits; #2 Filters; #3 Customers are experts; #4 Epistemological limits.

Most of the ways we have of finding signals in CAS make us ignore the weak signals.

Surveys, focus groups, social media scans, and almost all the paraphernalia of social science research homogenize signals to allow the “provable” detection of the Big Signals, the ones that represent larger trends in the CAS. And statistics, as it is usually used in these studies, is designed to relegate weak signals (at best) to a distant periphery where they can be ignored. Think about what you were taught about the bell-shaped curve, and what you believe is meaningful about the data.
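
To make the point concrete, here is a small, purely illustrative Python sketch (my addition, not from the original post; the numbers and variable names are invented). A handful of genuinely different responses barely moves the mean and standard deviation that a typical survey report would publish; the weak signal only shows up if you go back to the raw responses.

```python
import random
import statistics

random.seed(1)

# 990 "mainstream" responses clustered around the big trend...
big_signal = [random.gauss(50, 5) for _ in range(990)]
# ...and 10 responses describing something genuinely new at the edge.
weak_signal = [random.gauss(90, 2) for _ in range(10)]

responses = big_signal + weak_signal

# The usual summary: a bell-curve-style mean and standard deviation.
mean = statistics.mean(responses)
stdev = statistics.stdev(responses)
print(f"reported summary: mean = {mean:.1f}, stdev = {stdev:.1f}")

# The weak signal is still in the data, but you only see it by
# looking at the raw responses instead of the aggregate summary.
far_out = [r for r in responses if abs(r - mean) > 3 * stdev]
print(f"responses far outside the summary: {len(far_out)}")
```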

This approach to detecting signals is a framework that our social and profit-driven CAS imposes on us as the meaning of “worthwhile pursuit”.  Weak signals are seen as useless in this framework and, thus, meaningless.

To find weak signals, we have to access the raw narrative that the signal creates once it comes into existence. We have to deliberately prevent the homogenization and loss of the weak signal through our usual methods of assigning meaning to the information. We have to learn to pay attention to the small, weak, and powerless.


(P1): Why the Obvious Problems are the Hardest to Change

A political cartoon from a Massachusetts newspaper in 1812 showing a gerrymandered district just like the ones we have today.

We usually approach change by focusing on the most apparent problem on our environmental horizon (what is called a pain point in customer service). Note that the slide image is a gerrymandering cartoon from 1812, which, to my mind, gives pause to the idea that we can deal with current gerrymandering through normal problem-solving (voting, passing laws, constitutional amendments, getting the right people into office, and so on).

The most obvious problems for us are usually the ones best supported by the operation of the current CAS. More forces, and more diverse forces, support our obvious problems than we recognize, and mechanically organized problem solving will miss most of those supporting forces in its quest to change the obvious. So our problem solving will fail, often in the short run, but eventually in any case. This can be true even when there are powerful forces supporting change.

Often, our most obvious problems in a complex adaptive system are the core of its strength as a system and support its resilience to meaningful change efforts.

At the same time, the CAS is constantly generating new trends, and sometimes reviving old trends that have been gone for a while. These variations in process are small in scale, and we almost never pay any attention to them. No one ever says, “Let’s stop ignoring the flea in the room”.

But the potential for long-term change in a CAS lies precisely in these small variations, or, in systems-theory terms, “weak signals”. The weak signals are the indicators (not guarantors) of where to look for levers of change.


(P1): Basic Ways of Thinking about CAS

A hand-drawn diagram of the Cynefin Framework, which is ironically very complex. Text version available through the link below.

By Edward Stoop at Sketching Maniacs
Text Version of Hand-drawn Cynefin Diagram

Because changing a CAS requires an entirely different way of engaging, we must develop new skills and new ways of perceiving in order to manage the losses we will not be able to avoid and to frame our future actions more strategically. These new skills are not mechanical procedures or recipes. They require ongoing engagement with the CAS and flexibility of response. These two dimensions of our CAS change strategy are the very things we have spent millennia trying to eliminate from our change plans, and our work to increase engagement and flexibility results from rejecting the “system as machine” mentality.

Every time we create a short-term advantage for ourselves, we create an unintended and largely unperceived consequence somewhere down the tunnel of the future, elsewhere in the CAS. This is not in any way a moralistic judgment: unintended consequences don’t occur because the universe is passing some personal moral sanction on your actions as bad.

Humans evolved to favor devising and using short-term tactics to secure some immediate good. Before states were a reality (say, 7,000 years ago), this generally worked well for us. There was enough room in the world for our waste or mistakes to be recycled as we migrated elsewhere. The world would be “fixed” before we came back to the place we started, as it were. Now, over time, someone will eventually pay for our short-term thinking. Unintended consequences are triggered by all our efforts to stay ahead of the results of our current decisions. And everyone else is doing the same thing. So we, or our descendants, all eventually get burned by the distant actions of someone else. Our tweaks just make things worse over time.

The following posts will focus on one aspect of engaging CAS or another. The image in this slide is itself an engaging way to think about CAS.


(P1): Approaching the Wild CAS

A large, powerful waterfall at Eagle River in Michigan's Upper Peninsula, as an example of a wild CAS
Eagle River

One of the ways of thinking about modern society is that our lives are becoming more like membership in a wild ecosystem. Our common CAS is becoming more like the ecosystems that existed before humans had such a profound impact on nature.

For many centuries, societies have reflected some set of values and outcomes derived from the effort by elites to make society gratify elite needs. But the shift toward a more ecosystem-like CAS is gradually undermining this hierarchical control, and, as in an ecosystem, it is becoming more difficult for any individual to organize their own future. Hence the willingness of tech tycoons to consider going to another planet in order to preserve their privilege (see the linked article above).

Although we think of power as something that an individual has, power is gifted to a person or group by a larger community (human, financial, religious, etc.). It can be and is taken away when the community no longer sees that the person or group contributes to its purpose. While an “apex predator” makes a convenient political metaphor for power and control, in a real ecosystem, the predators die off if the actual ecological basis of their supposed “power” disintegrates.

Our society is becoming more like other evolutionary systems, and there is no guarantee that such a process shift will favor humans (or our disability community), or for that matter anything that now exists. Evolutionary systems favor continuing evolution, not any of the “parts” of the CAS. The continuation of evolutionary change depends on the generation of variation as evolution’s hedge against the uncertainty of the future. That future uncertainty clouds all efforts to control the future and spawns a dodgy business opportunity for anyone willing to claim they can predict the future.

We humans tried to work around that reality by isolating and organizing our exploitation of nature to buffer our goals against the relentlessly increasing complexity of unintended consequences. We are losing that long-standing effort for the same reason that all short-term advantage gives way to the “revenge” of long term biological processes.

My point is, as it is elsewhere, that traditional control behavior is becoming less and less effective and more and more expensive every second of every day.


Part 1: Changing Complex Adaptive Systems (CAS): How Exactly?

A slide entitled Wrangling Complexity, with stick figures working on computers to manage a network shown on a large screen.

There are two problems with the Title Question: How and Exactly.

We automatically use machine model beliefs and operational planning to build and implement change. Such an approach is largely inadequate for a CAS.

Any CAS has mechanical aspects, and we can delude ourselves into thinking that because we can change those mechanical aspects with operational planning, we should be able to change the whole kit and caboodle that way (as in, for example, “you can only change what you can measure”, and similar sophistry).

It ain’t so.

As the slide image suggests, changing a CAS is more like wrangling (cats, say) than standard, measurable-outcomes-based operational planning would have us believe.

If our old standard tools and operating assumptions don’t give us the control we seek, what will?

Well, the bad news is that nothing will give us that amount or scope of control over a CAS. The good news, as unlikely as it might seem, is that we can change the way we approach CAS and be effective.

A CAS is a “dispositional system”, not a mechanical one. A CAS tends to move in directions, but because of the interaction of its parts, there are no guarantees about where it will end up.

Changing a CAS is like approaching a cobra to take close up photos. You can get those photos, but you need to be circumspect to avoid getting bit.


FutureStrategy Recap

The original Lorenz attractor that demonstrated the variation and complexity of weather. Its two lobes look like a pair of eyes, with no path ever repeating.
Why The Weather is Hard to Predict
  • Twice and thrice over, as they say, good is it to repeat and review what is good.
    – Plato
  • Even if we remember the past, odds are good we’ll still repeat it.
    – Guy Gavriel Kay
  • Reminiscence and self-parody are part of remaining true to oneself.
    – John Updike

After a half-century of progress in our community’s advocacy for the supports we need to live lives of choice and freedom, our work has stalled. Many forces in the larger society (together called the #reaction) are contributing to this stall. Changing any one of them will not alter the momentum of this steadily expanding global backlash. Recently, the effort to stop and reverse the progress the disability community has achieved has become increasingly public and has spread to every part of the political and support systems upon which we depend for the quality of our lives and the freedom to make our own choices about how we will live those lives.

The disability community has gradually become a part of the general response of many communities to the current reality, popularly known as the #resistance.

The resistance seeks to restore the momentum of change in the direction it previously had. However, the trouble with resistance is that the past can’t be restored to what it was, so that progress could simply continue as it would have if the forces of reaction hadn’t fought against it. As necessary as resistance is, it is not enough. Resistance can’t “correct” the complex adaptive system that is our society. Resistance can undermine the momentum of the reaction, but it can’t create a new basis for progress by itself.

While resistance can slow and disrupt the reaction, it is meaningless unless we have a strategy we can use for a counterstroke once the reaction is sufficiently weakened. Without that strategy, the result of resistance will be a less capable (if less damaging) society, at least as far as our freedom and choice are concerned.

The counterstroke of the disability community must have two phases:

  • We must support the resistance to weaken the reaction because doing so is a tactical necessity for keeping us living our lives.
  • We must begin to build our counterstroke, a response that will reduce our dependence on the social system for those supports we need to survive.

This double-sided approach is FutureStrategy. If we embrace it, we will need to simultaneously extract as much support as we can from society through the resistance and begin to build our own supports, separate from the system and emerging from our local communities, based on the system realities we face in each moment of this rapidly and complexly evolving society.

The image in the slide is the original attractor model that Lorenz used in his discovery of the volatility of weather. I suggest you view it as a metaphor for the two-pronged strategy described here.
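
For readers who want to see the model behind the image, here is a minimal Python sketch (my addition, not from the original post) that integrates the classic Lorenz equations with a simple Euler step and shows how two nearly identical starting points drift apart, which is the volatility the attractor illustrates.

```python
# Minimal Euler integration of the Lorenz system with the classic
# parameters (sigma = 10, rho = 28, beta = 8/3).
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)        # one starting point
b = (1.0, 1.0, 1.000001)   # an almost identical one

for step in range(3001):
    if step % 1000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"step {step:4d}: distance between trajectories = {gap:.6f}")
    a = lorenz_step(*a)
    b = lorenz_step(*b)
```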


Part 6: Onward!!

A large lego structure built to travel and containing ongoing construction on its various levels
Our Future: Mortal Engines
  • “In the midst of chaos, there is also opportunity”
    – Sun Tzu
  • How lovely to think that no one need wait a moment, we can start now, start slowly changing the world
    – Anne Frank
  • Why always, “not yet”? Do flowers in spring say, “not yet”?
    – Norman Douglas
  • The hour is ripe, and yonder lies the way.
    – Virgil

Though the content of these posts has not been joyfully optimistic, I hope you have found a sense in your heart that we can make a world for ourselves that works hard to maintain and expand our possibilities and choices in our common future.

So we need to get on with it, starting today and continuing with a valued strategy of building community and supports.

Even in the relentless face of our declining value as a community in our larger society.

(P5): Some Strategic Rules of Thumb

A sign on a pole that says, No parking; Monkeys poop on windshield
Words to Live By

These are not procedures or any form of “7 steps to success”. They are guiding notions that can open up possibilities for the situation that you and your personal network face.

What to do when you can’t reduce the uncertainty effectively:

  • The Indirect Approach: Liddell Hart developed the idea of the “Indirect Approach” in the West, though the idea has always been reflected in effective strategies. Sun Tzu is the best-known example of Eastern thought on the same issue, and Hart read Sun Tzu as he developed his own framework. The idea is that as you implement your plan, you hide its actual target. Geographically, you might choose an approach path that runs between a number of potentially useful targets, moving toward your choice only at the last minute. Conceptually, you might advocate for small pieces of your advocacy strategy without revealing how you would use a victory in those small pieces. The purpose of this approach is to force your opponent to commit scarce resources to the defense of targets that won’t actually turn out to be targets, and to increase the uncertainty in your opponent’s planning.
  • Avoid Irreversible Commitments: In the Cynefin model, this is called “safe-to-fail experiments”. You try out ideas on a small scale to learn more about how they work in the current reality. Then you ask yourself what you can do to make the successful ones more common.
  • Build Reserves: Reserves are a kind of redundancy that you build to make it easier to turn on a dime when your view of the future turns out to be inaccurate. Reserves are not just cash. They include trust, cross-skills training, mutual support, and a host of other things described in various posts. Building reserves means accumulating a set of different resources that partially overlap, not a big pile of the same stuff.
  • Weak and Strong Links in Your Network: In network theory, it is useful to distinguish between strong and weak links in a network when thinking about change. The strong links are the ones that drive whatever it is that the network is doing. The weak links buffer the volatility and unpredictability of the interactions between those strong links so that they don’t cause the network to run away uncontrollably. People who are deeply focused on their own personal power tend to think that weak links are preventing them from increasing that power (however they define it) and seek to eliminate them. Interestingly, the two communities that deliberately and successfully eliminate weak links, thus exposing themselves to volatility, are homeless and destitute people, and the 1%. (A toy sketch of the strong/weak link distinction appears after this list.)
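
As a toy illustration of the strong/weak link distinction (my sketch, not from the original post; the network and names are invented), the fragment below builds a small network in which strong links form two tight clusters and a single weak link bridges them. Dropping the weak link splits the network, which is one standard network-theory picture of why the sparse, seemingly unimportant ties matter; it is only an analogy to the buffering role described above, not a model of it.

```python
from collections import defaultdict

# Strong links: dense ties inside two tight clusters.
strong_links = [
    ("ann", "bo"), ("bo", "cal"), ("cal", "ann"),   # cluster 1
    ("dee", "ed"), ("ed", "fay"), ("fay", "dee"),   # cluster 2
]
# Weak link: the sparse tie that connects the clusters to each other.
weak_links = [("cal", "dee")]

def component_count(links):
    """Count connected components of an undirected graph given as an edge list."""
    graph = defaultdict(set)
    for a, b in links:
        graph[a].add(b)
        graph[b].add(a)
    seen, count = set(), 0
    for node in graph:
        if node in seen:
            continue
        count += 1
        stack = [node]
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(graph[n] - seen)
    return count

print(component_count(strong_links + weak_links))  # 1: one connected community
print(component_count(strong_links))               # 2: remove the weak link and it splits
```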