Hope Is a Systems Skill
PLUS: Updates from Untangled HQ
Hi there,
Welcome back to Untangled. It’s written by me, Charley Johnson, and supported by members like you. This week, I’m writing about hope, and why it is a complex systems skill.
As always, please send me feedback on today’s post by replying to this email. I read and respond to every note.
On to the show!
🔦 Untangled HQ
Untangled Collective
I launched the Untangled Collective this week, and hosted the first community event on Wednesday. Afterwards, a number of the attendees asked if I could help connect the participants to one another. So I built a li’l spreadsheet to make that happen. It’s a network of smart, thoughtful people who you’ll no doubt want to learn from/with. Do your future self a favor and sign up for February’s event.
Workshop
On Wednesday, I’m hosting a live, 2-hr workshop on “Seeing Your System: How to Navigate Power, People & Change in Uncertain Times.” (Annual paid subscribers get one free workshop priced at $129, and discounts on any additional workshops.)
Systems Change for Tech & Society Leaders
Enrollment for the next cohort of my course opens tomorrow to those on the wait-list. Enrollment will be capped at the first 30 participants to ensure a meaningful learning experience. After the Untangled Collective met for the first time, I joined a group of past course participants for our monthly group coaching session. It’s … magic: peers coaching one another through complex challenges using the frameworks from the course. The point is, don’t sleep on that wait-list!
🖇️ Some Links
Friction-Max
Let 2026 be the year of friction-maxxing. Silicon Valley wants you to believe that a good life is a frictionless one—and it’s been remarkably successful at reframing life itself as an inconvenience to be escaped. As Kathryn Jezer-Morton puts it, we’re being nudged into “digital padded rooms of predictive algorithms and single-tap commands,” where reading is boring, talking is awkward, moving is exhausting, and thinking is… optional. Interaction is risky. Strangers are scary. Uncertainty is something to be optimized away. Where to start your friction-maxxing journey? Buy your toilet paper from Trader Joe’s instead of Amazon. Introduce yourself to your neighbor. Leave the house without a plan. Take a break from ChatGPT and notice what changes. Go forth and friction-max.
Don’t ‘Skate to Where the Puck is Going’
Ever heard the refrain, “skate to where the puck is going”? It rests on a faulty assumption: that you can predict where the puck will go. As Jen Briselli argues in a compelling essay, “It’s not about asking where the puck will be; it’s about reading the spaces where it can and can’t be, sensing how that space is shifting, and making decisions that shape those dynamics.” Her essay’s title—“Head Up, Feet Moving”—captures in four words how to navigate complex systems.
In my course, participants often want to map the entire system before acting. They want to see everything, ensuring their analysis is complete. But in complex systems, no one has a complete view. Everyone’s perspective is partial. So you have two real options: weave together different perspectives through collective sense-making, or, as Briselli writes, stay “agile on your feet and ready to act on what’s becoming possible.” (H/T Michelle Shevin)
Stop Bypassing the Hard Stuff
According to a new report from psychiatrists and psychologists, we’re wielding AI as a ‘spiritual bypass’—an escape from vulnerability and ourselves.
A ‘spiritual bypass’ is using spirituality to avoid life’s difficult parts. Why have a hard conversation when you can just pray about it? The researchers argue we’re doing the same thing with AI—what they call a “machinal bypass.”
You’ve probably noticed this in yourself. Why sit with discomfort—like writing a genuine apology letter—when ChatGPT can generate one instantly, requiring nothing from you? No personal investment. No emotional labor. No difficult feeling in your gut.
As the authors conclude, “If we want to build a future in which humanity can thrive, we need to resist the temptation to bypass ourselves.”
🧶Hope Is a Systems Skill

Before Christmas, I joined Katie Harbath on her podcast Anchor Change to talk about my decision to go independent, why so many leaders misunderstand the systems they operate within, and the role imagination plays in systems change. I’ll ask for a bit of grace from the Internet: it was my first time on the other side of the mic. It turns out that I’m far more comfortable asking questions than answering them. Here’s the full video:
Toward the end of our conversation, Katie asked how I navigate the hype around AI without falling into either blind optimism or outright cynicism. Here’s what I said:
“This is going to sound a bit obnoxious, but I just don’t see the point of cynicism. I get it, it’s 2025, things are bleak, I feel cynical sometimes, I’m not above it. But cynicism always returns you to the status quo. There’s no movement or progress in cynicism — it leads to inaction; to apathy; to stuckness and feelings of inevitability; that the world is happening to you. And that’s just not true — you can still make small changes that ladder up to bigger change.”
I believe that. Strongly. But it’s incomplete.
What I should have added is that cynicism is not just tempting—it’s rational, given how complex systems actually behave. In most systems worth changing, you can’t see the effects of your actions when they matter most. Your efforts feel insignificant not only in the moment, but often long after, because the system absorbs them with little visible change. Sometimes the system actively pushes back, dismissing new ideas as unrealistic or extreme. It can feel like pressing against something immovable—like nothing is happening at all.
But that perception is misleading.
Your actions are never isolated. They’re embedded in a web of relationships—people, institutions, norms, incentives—and that web is constantly shifting, even when the surface looks unchanged. Attitudes adjust. Relationships form. Ideas migrate. Interpretations evolve. Much of the movement happens below the threshold of visibility.
Then, occasionally, a system starts producing different outcomes. And when that happens, it feels sudden—almost magical—as if an invisible line were crossed overnight.
In Hope in the Dark, Rebecca Solnit captures this dynamic better than almost anyone:
“When you don’t know how much things have changed, you don’t see that they are changing or that they can change. Those who think that way don’t remember raids on gay bars when being queer was illegal or rivers that caught fire when unregulated pollution peaked in the 1960s or that there were, worldwide, 70 percent more seabirds a few decades ago and, before the economic shifts of the Reagan Revolution, very, very few homeless people in the United States. Thus, they don’t recognize the forces of change at work.”
Her point is not that change is inevitable, but that it is often illegible while it’s underway. Systems are always changing; that’s the constant. What we lack is foresight. We can’t see where those changes are leading. We can only connect the dots in retrospect.
This is one way of understanding nonlinearity in complex systems. Cause and effect are not proportional. The future is not an if–then statement waiting to be executed. Sometimes massive interventions produce little effect. Other times, small, seemingly ordinary actions cascade into transformative change.
Occasionally the catalyst is dramatic—a single event that becomes a spark because conditions were already primed. More often, the catalyst is relational and mundane. Solnit offers a useful metaphor here: mushrooms appearing after rain, seemingly out of nowhere, though they emerge from vast underground networks that have been growing invisibly for years.
Which brings me back to cynicism—and to uncertainty.
Cynicism thrives when we expect change to be immediate, measurable, and predictable. When those expectations aren’t met, we understandably conclude that nothing matters — or that it can’t change. But the problem isn’t that change is impossible; it’s that our mental models are wrong. Control, linear causality, and predictability are comforting illusions that lead us astray.
And this is where AI enters the story.
Generative AI is built as an answer machine. It collapses uncertainty into a clean interface. It offers confidence where there should be hesitation, critical thinking, and questioning. It offers fluency where there should be doubt. The answers feel authoritative, but they’re often thinned, impoverished reflections of reality—obscuring complexity rather than grappling with it, feeding our desire for control rather than challenging it.
We should resist that impulse — forever and always.
We’ve been taught to treat uncertainty as a failure state—something to eliminate or fear. But uncertainty is not the enemy of action or hope. It’s the precondition for it. As Solnit writes:
“Hope locates itself in the premise that we don’t know what will happen, and that in the spaciousness of uncertainty is room to act.”
Uncertainty is an opening. It’s the recognition that outcomes are not foreclosed, that systems are not fixed, and that even when our actions feel small or invisible, they are contributing to the slow construction of a different relational infrastructure just beneath the surface. One that we desperately need now as the old one is crumbling.
That’s it for now,
Charley