Intelligence is not the same thing as power
The TikTokification of social media, a very online Pope, and the decline of Google search.

Intelligence is not the same thing as power.
Every week, it seems, OpenAI, Google, Meta, etc. release a new generative AI model with new capabilities.
This trip-wires a flood of media coverage about when ‘artificial general intelligence’ or AGI will occur, when these systems will reach ‘super intelligence,’ etc.
Then there is the backlash to the flood. Scholars and researchers critique the hype — arguing that these systems aren’t intelligent in any meaningful sense; that they don’t actually understand or reason; that ‘AGI’ is a marketing term, etc. I get sucked into this myself.
But hiding beneath this back-and-forth is a false assumption: that an increase in capabilities leads to an increase in power. As Arvind Narayanan and Sayash Kapoor write in their great new essay “AI as Normal Technology,” power is “the ability to modify one’s environment.” We’re mistaking capabilities for control! Or as Narayanan and Kapoor explain, “We are powerful not because of our intelligence, but because of the technology we use to increase our capabilities.” This false assumption is what allows marketing terms like AGI to take hold. It’s what enables existential risk narratives to spread. It’s what enables profit-maximizing companies to convince us that only they can prevent these outcomes; that only they can make ‘AI safe.’
If intelligence equaled control and power, then sure, very capable machines would be scary. But it doesn’t. To put a fine point on it: I might be smarter than President Trump, but he has access to the nuclear codes, while I try real hard to get people to click a li’l button.
The point is, when algorithmic systems gain new capabilities, they don’t automatically gain more power, and we don’t automatically lose control over our ability to shape the future. But when we fail to uphold this distinction, we give up our own agency.
We hand over our power to a computational system.
We hand over our power to a company and the short-term financial incentives that guide its behavior.
We let their marketing become our future.
Don’t believe them.
Don’t let them take your power.
Free Workshop
I’m hosting a free, interactive, one-hour workshop on the social construction of data on June 5 at 3pm PT. You will learn:
How to see data as socially constructed.
How to apply it in your working context and map its implications to your strategy.
What it means for how we should think about AI.
🔗 What I’m reading this weekend
TikTok is altering the fundamental structure of social media. The dominant organizing principle is no longer interpersonal connections. Instead, we’re being clustered by shared interests. This structure is both less personal and more opaque. (More) Everything you want to know about the power and limits of social media transparency. (Untangled Podcast)
Google searches are falling for the first time ever (More). The reason? We’re increasingly turning to chatbots to answer our queries. There are profound social implications to this shift from a world of search to one defined by direct answers (Untangled Deep Dive).
Meta thinks the total value of 7 million books is roughly $0. (More)
Large language models don’t simply have social and cultural implications. They are themselves social and cultural technologies. As Henry Farrell et al. write, “Large Models should not be viewed primarily as intelligent agents, but as a new kind of cultural and social technology, allowing humans to take advantage of information other humans have accumulated.” (More) Accumulated and produced! Right, these models are trained on our data. Everything you need to know about what it means to ‘train AI.’ (Untangled Deep Dive).
The first very online Pope. (More)
The U.S. government’s antitrust case against Google search is winding down. (More) Read what the judge should do to break the cycle that enabled Google’s dominance. (Untangled)
“Remember to imagine and craft the worlds you cannot live without, just as you dismantle the ones you cannot live within.” Ruha Benjamin, Professor, Princeton University
Why Untangled? Because there is no such thing as a ‘tech problem.’ All ‘tech problems’ are entangled in systems structured by power and inequality. If we don’t untangle the two, we perpetuate the status quo in the name of innovation and progress. My job is to help you untangle your system, and teach you the strategies, skills, and tools to change it.