🖇️ Some Links
Ever wonder what would happen if you put a large language model in control of a robot vacuum? Apparently, the vacuum experiences a full-blown ‘existential crisis’ about its role in the world.
What if we centered the voices of young people when trying to understand the impact of social media on their lives? If you were persuaded by The Anxious Generation, you need to read this retort from Maximilian Milovidov.
A great new piece in The New Yorker explores “the data centers that train AI and drain the electrical grid.” The essay ends on the question: what happens when an economy built on AI and data centers runs out of high-quality data? I explored that question a few years ago in “The Doom Loop of Synthetic Data.”
Will using generative AI make you dumber? Quite possibly — but also, maybe not. The difference between the two comes down to how you use it, and how you make sense of what it is and what it’s not. That’s the summary of the latest literature on AI use and learning. But the broader trend is clear: we’re reading less, we’re writing like we text, and our individual and collective attention is full. So where do we go from here?
A new diagnostic tool any funder can use to examine how their mental models, tech & data practices, and strategic routines either reinforce or challenge dominant power structures in society. Work in philanthropy and want to join an invite-only webinar covering the diagnostic and my course?
If you’ve sensed a shift in Untangled of late, you’re not wrong. I’m writing a lot more about ‘complex systems.’ To name a few:
What even is a ‘complex system’ and how do you know if you’re in one?
How to act interdependently and do the next right thing in a complex system?
I am obsessed with complex systems because the world is uncertain and unpredictable — and yet all of our strategy frameworks pretend otherwise. We crave certainty, so we build plans that presume causality, control, and predictability. We know in our gut that the systems we’re trying to change won’t sit still for our long-term plans, yet our instinct to cling to control amid uncertainty is too strong to resist.
And honestly, in 2025, this shouldn’t be a hard sell. Politics, climate change, and AI are laughing at your five-year strategy decks!
Complexity thinking helps us see this clearly — that systems are dynamic, nonlinear, and adaptive — but it, too, has blind spots. First, it lacks a theory of technology. The closest we get is Brian Arthur’s brilliant book, The Nature of Technology: What It Is and How It Evolves, which explains how technologies co-evolve with economic systems. (Give it a read, or check out my write-up in Technically Social.) But Arthur was focused on markets, not on social systems — not on how technology is entangled with people and power.
That’s where my course comes in. I’m trying to offer frameworks and practices for creating change across difference, amid uncertainty, in tech-mediated environments — approaches that honor both complexity and the mutual shaping of people, power, and technology. (And yes, Cohort 5 of Systems Change for Tech & Society Leaders starts November 19.)
Second, complexity is hard to talk about simply and make practical (that’s why my Playbook turned into a 200-page monstrosity!). Every time I use the words “complex” or “system,” I can feel the distance between me and whoever I’m talking to widen. I’ve been searching for thinkers who bridge that gap — who write about systems with both clarity and depth — and recently came across the brilliant work of
, who writes the great newsletter (Subscribe if you haven’t yet!). After reading his essay, Systems Thinking Isn’t Enough Anymore, I reached out and invited him onto the podcast. I’m thrilled to share that conversation — one that digs into the mindsets and muscles leaders need to navigate uncertainty and constant change, the need to collapse old distinctions between strategy and operations, and what it really means to act when the ground beneath us keeps shifting.
Before you go: 3 ways I can help
Systems Change for Tech & Society Leaders - Everything you need to cut through the tech-hype and implement strategies that catalyze true systems change.
Need 1:1 help aligning technology with your vision of the future? Apply for advising & executive coaching here.
Organizational Support: Your organizational playbook for navigating uncertainty and making sense of AI — what’s real, what’s noise, and how it should (or shouldn’t) shape your system.
P.S. If you have a question about this post (or anything related to tech & systems change), reply to this email and let me know!