Untangled with Charley Johnson

⚒️ System Guides

How to define your system

A guide for identifying clear boundaries, and avoiding 'unintended consequences.'

Charley Johnson
May 22, 2025

Elise Racine & The Bigger Picture / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/

This issue is part of a series of practical guides. They offer you the lenses to see your system clearly, and the levers to collectively change it. The series includes:

  • Guide 1: How to see data as socially constructed.

  • Guide 2: How to see technology as socially constructed.

  • Guide 3: How to analyze the frames and metaphors that hide power in technology.

  • Guide 4: How to anticipate technological shifts in power.

  • Guide 5: How to anticipate technology’s impact on different communities.

  • Guide 6: How to see your system clearly.

  • Guide 7: How to anticipate different community uses of technology.

  • Guide 8: How to take interdependent (not independent) action.

  • Guide 9: How to leverage feedback loops in your system.

  • Guide 10: How to think & act ecologically.

  • Guide 11: How to map backward from the future.

Today, I’m launching Guide #12: How to define your system. Drawing boundaries around a system — defining it! — is really hard.

Sometimes we’re too broad — everything is connected to everything! But the bigger risk is being too narrow. A team repeatedly misses its goals, so we focus on the team, when in fact the underlying reason it keeps missing those goals is tied to organizational structures: how decision rights are allocated, what is rewarded and given status within the organization, and so on. Or maybe the system extends beyond the organization, because the team dynamic is actually influenced by a partner relationship or a funder requirement.

When we define the system too narrowly and then intervene, the system surprises us — and we call these surprises ‘unintended consequences’! But the deeper truth is that we inserted artificial boundaries and didn’t see the complete picture.

The ‘boundaries’ we insert often reflect our position in the organization, our training, and our life experience. For example, say an algorithmic decision-making system repeatedly harms the same communities. In response, product teams and engineers might work together in an attempt to design the technology in a way that is more ‘responsible’ or ‘fair.’ This narrow focus on the technology itself is understandable given the role and training of engineers and product managers.

Moreover, it’s not totally insane to consider fairness a reasonable goal if you’re a white guy who believes in meritocracy. If unfairness is simply the result of “fallible human biases on the one hand, and imperfect statistical procedures, on the other” as I wrote in “Beyond Minimizing Harms” (Untangled Deep Dive), then it’s plausible to make meaningful change by focusing on the algorithmic system. Together, they’ll draw a nice li’l boundary around the technology and the process of creating it, and then try to make it fairer.
