(P1): Why Are Weak Signals Ignored?

A slide: "Weak Signals Detection with Social Media: No Surprise at All?" Theory: in contemporary futures studies, the term weak signals refers to an observed anomaly in the known path or transformation that surprises us somehow (Kuosa, 2014, p. 22). Our experiences: are we alone? Possible explanations:

  • #1 Noisy social media and other limits
  • #2 Filters
  • #3 Customers are experts
  • #4 Epistemological limits

Most of the ways we have of finding signals in a complex adaptive system (CAS) lead us to ignore the weak signals.

Surveys, focus groups, social media scans, and almost all the paraphernalia of social studies research homogenize signals to allow the “provable” detection of the Big Signals, the ones that represent larger trends in the CAS. And statistics, as it is usually used in these studies, is designed to relegate weak signals (at best) to a distant periphery where they can be ignored. Think about what you were taught about the bell-shaped curve, and what you believe is meaningful about the data.
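To see that reflex in miniature, here is a toy sketch in Python (the mention counts and the two-standard-deviation cutoff are invented for illustration, not taken from any study): the routine move of trimming “outliers” so the big trend is provable discards exactly the rare observations where a weak signal might live.

```python
import statistics

# Hypothetical daily mention counts for a topic on social media.
# The 40 is a rare spike: a candidate weak signal.
mentions = [3, 5, 4, 6, 5, 4, 3, 5, 40, 4, 6, 5]

mean = statistics.mean(mentions)
sd = statistics.pstdev(mentions)

# The conventional reflex: keep only "normal" data within 2 standard deviations.
kept = [x for x in mentions if abs(x - mean) <= 2 * sd]
discarded = [x for x in mentions if abs(x - mean) > 2 * sd]

print(f"mean = {mean:.1f}, sd = {sd:.1f}")
print("kept (the provable Big Signal view):", kept)
print("discarded (where weak signals live):", discarded)
```

The kept list is tidy and statistically well behaved; the spike of 40, the only genuinely surprising observation, is precisely what gets thrown out.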

This approach to detecting signals is a framework that our social and profit-driven CAS imposes on us as the meaning of “worthwhile pursuit”.  Weak signals are seen as useless in this framework and, thus, meaningless.

To find weak signals, we have to access the raw narrative that the signal creates once it comes into existence. We have to deliberately prevent the homogenization and loss of the weak signal through our usual methods of assigning meaning to the information. We have to learn to pay attention to the small, weak, and powerless.


(P1): Weak Signals

A network diagram with a large hub (strong links) and many smaller hubs (weak links)

One way to think about weak signals is through network modeling.

It is intuitive to view the strong links in a network as the important ones and the weak links as unnecessary details or random defects that don’t contribute to the purpose or function of the network. But in complex adaptive systems, strong links generate volatile, unpredictable behavior. Weak links buffer the volatility of the activity of those strong links and are largely responsible for the stability of the network, even as the CAS goes about its merry way.
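A toy network model can make that buffering visible. The sketch below is an invented example (it uses the networkx library; the topology and node names are mine, not a model of any real system): three smaller hubs hang off a big hub by strong links, and a sparse set of weak links ties their peripheries together. When one strong link fails, the version with weak links stays connected; the strong-links-only version fragments.

```python
import networkx as nx

# Toy network: a big hub joined to three smaller hubs by strong links,
# with peripheral nodes on each hub and a sparse web of weak links
# tying the peripheries of *different* hubs together.
G = nx.Graph()
hubs = ["hub_a", "hub_b", "hub_c"]
G.add_edges_from(("big_hub", h) for h in hubs)        # strong links
for i in range(9):
    G.add_edge(hubs[i % 3], f"node_{i}")              # strong links
weak_links = [("node_0", "node_4"), ("node_1", "node_8"),
              ("node_2", "node_3"), ("node_5", "node_7")]
G.add_edges_from(weak_links)

# Remove one strong link and compare: does the network hold together?
for label, drop_weak in (("with weak links", False),
                         ("strong links only", True)):
    g = G.copy()
    if drop_weak:
        g.remove_edges_from(weak_links)
    g.remove_edge("big_hub", "hub_a")                 # a strong link fails
    print(f"{label}: still connected? {nx.is_connected(g)}")
```

The weak links do almost nothing day to day, which is why they look like unnecessary detail, yet they are what holds the network together when a strong link breaks.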

Interestingly, there are two communities that deliberately eliminate weak links from their social lives:

  • People who are homeless and desperate, I suppose because they believe that people they don’t know well can’t be trusted.
  • Very rich people, I suppose because they believe that those who aren’t their peers can’t be trusted and are after their money.

Both of these communities are largely right in their loss of trust for weak links, which says something about their location in the current CAS and about their personal futures within it.

Note that authoritarian regimes and cults both eliminate weak links in the belief that their survival depends only on the strong links that produce (in their view) their power and wealth. These kinds of “strong” CAS are notoriously volatile and readily collapse if any insurgency can disturb that control.

This framework maps to the basic CAS change concept that powerful system trends are very difficult to control for positive change (even if they might support our change). The best opportunity for change lies in the weak links, because they are small now, but can grow to have much greater influence.

But identifying the weak links that might be the best support for CAS change efforts remains difficult because those links aren’t poking us in the face.


(P1): Why the Obvious Problems are the Hardest to Change

A political cartoon from a Massachusetts newspaper in 1812 showing a gerrymandered district just like the ones we have today

We usually approach change by focusing on the most apparent problem on our environmental horizon (what customer service calls a pain point). Note that the slide image is a gerrymandering cartoon from 1812, which, to my mind, gives pause to the idea that we can deal with current gerrymandering through normal problem-solving (voting, passing laws, constitutional amendments, getting the right people into office, and so on).

The most obvious problems for us are usually the ones best supported by the operation of the current CAS. Many diverse forces sustain our obvious problems, and mechanically organized problem-solving will miss most of those supporting forces in its quest to change the obvious. So our problem-solving will fail, often in the short run, but eventually in any case. This can be true even when there are powerful forces supporting change.

Often, our most obvious problems in a complex adaptive system are the core of its strength as a system and support its resilience to meaningful change efforts.

At the same time, the CAS is constantly generating new trends, and sometimes reviving old ones that have been gone for a while. These variations of process are small in scale, and we almost never pay any attention to them. No one ever says, “Let’s stop ignoring the flea in the room”.

But the potential for long-term change in a CAS lies precisely in these small variations, or, in systems-theory terms, “weak signals”. The weak signals are indicators (not guarantors) of where to look for levers of change.


(P1): Basic Ways of Thinking about CAS

A hand-drawn diagram of the Cynefin Framework, which is, ironically, very complex. A text version is available through the link below.

By Edward Stoop at Sketching Maniacs
Text Version of Hand-drawn Cynefin Diagram

Because changing a CAS requires an entirely different way of engaging, we must develop new skills and new ways of perceiving in order to manage the losses we will not be able to avoid and to frame our future actions more strategically. These new skills are not mechanical procedures or recipes. They require ongoing engagement with the CAS and flexibility of response. These two dimensions of our CAS change strategy are the very things we have spent millennia trying to eliminate from our change plans, and our work to increase engagement and flexibility results from rejecting the “system as machine” mentality.

This is not in any way a moralistic judgment. Unintended consequences don’t occur because the universe is passing some personal moral sanction on your actions. Every time we create a short-term advantage for ourselves, we create an unintended and largely unperceived consequence somewhere down the tunnel of the future, elsewhere in the CAS.

Humans are evolutionarily favored in devising and using short-term tactics to secure some immediate good. Before states were a reality (say, 7,000 years ago), this generally worked well for us. There was enough room in the world for our waste and mistakes to be recycled as we migrated elsewhere. The world would be “fixed” before we came back to the place we started, as it were. Now someone, somewhere, will eventually pay for our short-term thinking. Unintended consequences are triggered by all our efforts to stay ahead of the results of our current decisions. And everyone else is doing the same thing. So we or our descendants all eventually get burned by the distant actions of someone else. Our tweaks just make things worse over time.

The following posts will each focus on one aspect or another of engaging CAS. The image in this slide is itself an engaging way to think about CAS.

Creative Commons Attribution 4.0 International License