S01E07 Seeing the whole: systems thinking
Sense-making in complex systems, mental models, perils of reductionism, bullwhip effect in supply chain management, law of unintended consequences
So far, I have argued that we shouldn’t take our beliefs about organization for granted. These beliefs are rooted in a worldview that is itself grounded in our understanding of nature and physics. In last week’s episode, it became clear that information technology is making the world more interconnected and volatile.
So we live in an increasingly VUCA world: volatile, uncertain, complex, and ambiguous. That doesn’t mean we have to abandon all hope of controlling the systems around us. It does mean we have to switch strategies. The next episodes lay the groundwork by exploring the hidden order of complexity.
Summarizing complexity theory is a daunting undertaking. Still, we’ll need some background to understand how complexity and systems theory can be applied to companies. The field of systems thinking sits at this crossroads and is a good place to start.
“Wicked problems result from a mismatch between how things work and how we think or perceive they work.”
At the end of the day, we want to understand an organization’s ability to deal with uncertainty and change. Why are some organizations better able to adapt than others? Arie de Geus, a pioneer of systems thinking and a global executive at Shell in the 1980s, argued that growth in a changing environment depends on institutional learning: the process whereby executive teams update the shared mental models of their organization, their markets, and their competitors.
The discipline of systems thinking attempts to help people construct these mental models. Lesser leaders reduce complex systems to individual components and events. They do not fully grasp how the different components interact. Systems thinkers, on the other hand, look for what lies beneath the surface:
Events: what happened
Patterns: what has been happening (finding trends)
Structures: why it has been happening (finding relationships between trends and events)
Mental models: how we think about what is happening (finding blind spots, assumptions, and beliefs)
The key to systems thinking is to appreciate any complex system as an interconnected whole. The components of the whole - the companies in the supply chain, the organs in the body, or the employees in a company - are interconnected and influence the behavior and emergent properties of the system.
These components are not just connected but also interdependent: each part of the system depends on some other part for its effect on the whole. In the human body, for example, your brain needs your heart and lungs to function, and vice versa.
In commerce, every finished product is the result of countless people and companies collaborating in a complex supply chain system. To appreciate just how complex, it is worth reading “I, Pencil”, Leonard Read’s delightful story about the coordination required to manufacture the humble pencil.
For those of us with shorter attention spans, Milton Friedman makes the point in a few minutes:
When there is interconnectedness and emergence in play, it is dangerous to study parts in isolation. What looks like a heart problem may actually be caused by another organ, or it may cause symptoms elsewhere in the system. The genius factor of fictional characters such as Sherlock Holmes or Dr. Gregory House is little more than a knack for systems thinking.
Reductionism - one of the tenets of the clockwork worldview and of Taylorism - is the opposite of systems thinking. If you take a complex system apart, it loses its emergent properties. Draper Kauffman stated it colorfully:
“…dividing the cow in half does not give you two smaller cows. You may end up with a lot of hamburger, but the essential nature of “cow” — a living system capable, among other things, of turning grass into milk — then would be lost. This is what we mean when we say a system functions as a “whole”. Its behavior depends on its entire structure and not just on adding up the behavior of its different pieces.”
— Draper Kauffman
In some cases, it can make sense to examine components of a system in isolation, but not if your goal is to understand the system. There is an ancient Indian tale that illustrates the importance of perspective, context, and seeing the whole. Here is a summary of the parable:
Six blind men were asked to describe an elephant. The first man, feeling the elephant’s side, said the elephant was like a wall. The second man, holding the tusk, said the elephant was like a spear. The third man, feeling the elephant’s trunk, said the elephant was like a snake. The fourth man, feeling the elephant’s leg, said the elephant was like a tree. The fifth man, feeling the elephant’s ear, said the elephant was like a fan. The sixth man, feeling the elephant’s tail, said the elephant was like a rope. All six men were correct, but each one only knew part of the truth.
— Indian parable summarized by GPT-3
In more recent history, the MIT Sloan School of Management developed the beer game: a role-playing simulation that illustrates some principles of systems thinking. On the surface, the game lets students live through the typical coordination problems of supply chains. Dig deeper, and you will find that the game does more than that: players experience firsthand that their actions can have unintended consequences and that their behavior is predicated on how they perceive the system.
The game emulates a simple production and distribution system for a line of beer, where players take on the role of a retailer, a wholesaler, or a brewery. During the game, players discover how small variations in consumer demand cause greater and greater fluctuations as they travel up the supply chain, away from the customer.
This is because people in the retailer role are likely to interpret an increase in current demand as a sign of higher future demand, leading them to place larger orders in anticipation of the expected rise. The signal is amplified when wholesalers jump to a similar conclusion, ultimately causing the brewery to ramp up production. When the anticipated additional sales fail to materialize, the result is excess inventory across the board. This is known as the bullwhip effect.
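To make the amplification concrete, here is a minimal simulation sketch in Python. It is not the actual beer game (which adds shipping delays, backlogs, and costs); it merely assumes three stages that each forecast demand with exponential smoothing and order enough to cover that forecast while refilling a target inventory. All names and parameter values are illustrative.

```python
# A minimal sketch of bullwhip dynamics, not the real beer game: three stages
# (retailer -> wholesaler -> brewery), instant replenishment, and a common
# ordering heuristic: an exponential-smoothing demand forecast plus a
# correction toward a target inventory level. All values are illustrative.

def simulate(weeks=30, target=12, alpha=0.5):
    names = ["retailer", "wholesaler", "brewery"]
    demand = [4] * 5 + [8] * (weeks - 5)        # one modest, permanent step in end demand
    forecast = [4.0] * len(names)
    inventory = [target + f for f in forecast]  # start every stage in steady state
    peak_order = [0.0] * len(names)

    for week in range(weeks):
        incoming = demand[week]                 # orders arriving at the retailer
        for s in range(len(names)):
            inventory[s] -= incoming            # ship what the downstream stage ordered
            forecast[s] += alpha * (incoming - forecast[s])  # update the demand forecast
            # Order enough to cover the forecast and refill back to target stock.
            order = max(0.0, forecast[s] + (target - inventory[s]))
            inventory[s] += order               # simplification: replenished instantly
            peak_order[s] = max(peak_order[s], order)
            incoming = order                    # this stage's order is the next stage's demand

    for name, peak in zip(names, peak_order):
        print(f"{name:10} peak weekly order: {peak:5.1f}")

simulate()
```

With these illustrative settings, a single step in end demand (from 4 to 8 cases per week) produces peak weekly orders of about 10 at the retailer, 13 at the wholesaler, and 17.5 at the brewery: the same modest change in customer behavior, amplified at every step up the chain.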
The game never fails to frustrate players, who reliably shift blame to other parts of the chain or to “erratic” customer demand. A key lesson is that we are part of these systems: our limited understanding of the whole chain causes the overreaction that throws the system out of balance.
Various systemic solutions have been conceived (shorter supply chains, shorter lead times, more collaboration, inventory visibility across the supply chain...), but the bullwhip effect has not been eradicated. We need only look at the supply chain disruptions following the Covid pandemic to see it in action today.
People learn quickly when cause and effect are closely related in time and space. A child who gets burnt has no problem identifying the link between the stove and the pain in their hand. In complex systems, the distance between cause and effect is greater, making it harder to connect the dots between events.
When the connection between cause and effect is not linear - and it rarely is in complex systems - leaders should be modest about the policies they put in place. Unfortunately, the managerial class often selects for overly confident people, and many policies fall victim to the law of unintended consequences.
Once you start examining the world through a systems thinking lens, it becomes clear that an overwhelming majority of policy is little more than an attempt to fix the unintended consequences of earlier policies. Here is an example of a government wrestling with unintended consequences (in Dutch).
If our understanding of the system is too narrow or flawed, we cannot grasp the implications of our actions. When we think about the effects of our actions, we tend to divorce the intended or foreseen effects from the unintended consequences, the so-called side effects. But what we call a side effect is not a feature of reality; the label merely demonstrates that our understanding of the system is incomplete!
This brings us to a new question: how can we get a better understanding of the interactions in a complex system?
Further reading
Peter Senge, The Fifth Discipline