Goodbye Surveys, Hello Nudges: The New Politics of Being Seen
PLUS: The disastrous Google ruling, Meta's rift with Scale AI, and why your futures exercise misses the mark.
Weekly Reads
Judge Amit P. Mehta's ruling in the Google search case will entrench its dominance. When I wrote about this case in April, I argued that the only way to create a level playing field in search would be to end the privileged relationships that enable Google's flywheel. The remedy issued by Judge Mehta requires Google to share search results data with certain competitors, but it does nothing to prevent Google from paying distributors like Apple or Samsung to make Google their default search engine or to preinstall Gemini as the AI assistant on their devices.
What Meta's rift with Scale AI says about the value of data: In June, Meta invested $14.3 billion in Scale AI, a data-labeling vendor. Scale AI pays people around the world low wages to annotate training data, and then it sells that data to AI companies. The investment was rumored to be an "acquihire" at the time: Meta wanted Scale AI's talent, not its assets, the story went. Now a number of key hires have left the company, and researchers in Meta's Superintelligence Lab are frustrated by the quality of Scale AI's data. This tracks: as tech companies enter highly skilled fields like medicine and the law, they need domain experts to serve as a quality check on the data, and that requires an entirely different model. Untangled Deep Dive on what it means to train AI from a sociotechnical lens.
The "Californian Ideology" is back, baby! Scholar Nathan Schneider argues that the famed ideology, which amounts to "let tech and the market take the wheel," is alive and well thirty years after gaining popularity. In a stirring essay, Schneider writes, "Democracy has no air left to breathe. Individuals can choose products, and companies can choose to produce them. But collectively, as a society, we are at the mercy of whatever innovations the market sees fit to deliver; that is the future, full stop." Schneider's point is that technology has become a stand-in for political discourse and policymaking, and I couldn't agree more. We've entered peak techno-determinism. Untangled Deep Dive on the techno-determinist mindset.
Strategies must integrate the past, present, future, and mission of the organization. Picking one horizon misunderstands how the past and future can impact the organization's present, as I wrote in "What's keeping you stuck?" As Johar writes in a great new essay, strategy "must be designed not as a sequence but as a simultaneity, where every action is judged across past, present, future, and mission at once." Johar's essay also highlights the limitations of traditional futures exercises. As I've observed in my own facilitation practice, if they don't address sticky past dynamics or account for present realities, they become untethered from reality: a fun, escapist exercise that creates the sense of movement and progress but does nothing to change the underlying behaviors or dynamics keeping the organization in place.
AI is accelerating. But the human response is still catching up.
The Artificiality Summit (Oct 23–25) is where we do that catching up, together.
With only 150 participants, this isn't a passive experience. It's a shared act of imagination. And we're nearly full.
If you've been meaning to join, now's the time.
Let's design a more human future for AI.
Join the conversation with people like:
Benjamin Bratton, Antikythera
John Pasmore, Latimer.ai
Eric Schwitzgebel, University of California, Riverside
Jamer Hunt, Parsons School of Design
Use promo code "UNTANGLED" for 10% off the ticket price and subscribe to the Artificiality Institute's writing and podcasts about the human experience of AI.
How do platforms see?
James Scott famously explained why the state needed to contort the complexity of the modern world into standardized boxes: it needed to see us to shape us. The state had a blind spot, and to make citizens legible and manageable, it created modern statistics, demographics, and, of course, the survey. Scott's Seeing Like a State showed that how we're counted, measured, and represented creates an epistemology of power. In a great new book, Petter Tornberg and Justus Uitermark argue that platforms see quite differently than states do, with profound implications for how power functions in society and what we can do about it.
Statistics and demographics, the state's preferred ways of seeing, rely on fixed categories and individual data points stripped from their context, relationships, and the interactions that produced them. As Tornberg and Uitermark explain, "Scott's state looks down on its population from above, imposing grids and straight lines as seen from the map-maker's view." This approach was perfectly suited to the modern industrial era: Ford factories organized their operations through accounting, measurement, and statistics, collaborating with technology companies like IBM to optimize productivity. This methodology turned companies into machine-like entities, which led to viewing and modeling systems as if they were complicated. In short, the state's need to see, and the data it produced, led us to treat systems (companies, organizations, schools, hospitals, and even family life) as ordered, predictable, and controllable.
Platforms see differently, and in turn wield power differently, because they manufacture different data. As Tornberg and Uitermark explain:
"While demographics and statistics erased relations and sought to categorize and classify individuals, the data collected by digital platforms are relational, interactive, heterogeneous, interactional, and emergent. While statistical data was collected periodically, giving a snapshot of a defined population, digital data constitute continuous flows, feeding algorithms that redefine clusters and patterns and seek to modulate their behavior."
As scholars danah boyd and Kate Crawford have argued, this shift has created "a profound change at the level of epistemology." Power no longer stems from the state imposing its top-down view of the world; it emerges from the bottom up, from platforms fine-tuning their algorithms to engineer specific outcomes at mass scale through what Tornberg and Uitermark call "programmable social infrastructure." This creates not a complicated system but a complex one. You're not being forced into a li'l standardized box, measured, and optimized for productivity and control. Instead, you're being shaped by a continuous, ever-adapting flow of relational data, steered by platform nudges. How we organize ourselves doesn't result from a central control mechanism but emerges from the platform's programming. To make the contrast concrete: the state sees you as a census row, a fixed set of boxes (age, income, occupation) that determines which services and obligations apply to you. The platform sees no fixed box at all; the feed you scroll, the prices you're offered, and the options you're nudged toward are recalculated continuously, based on what you, and everyone who behaves like you, just did.
This might sound wonky or abstract, but how power functions determines what you can do about it. By seeing a complicated system that mirrors a machine, you abdicate your own agency; you are "reduced to components of a larger system," as Tornberg and Uitermark put it. But when you see a system as complex, and more than that, when you see how technology isn't deterministic but entangled in social systems, you can start to reclaim your agency by identifying what Donella Meadows calls "leverage points." These offer a helpful starting point, but ultimately the only way to escape the power of "programmable social infrastructure" is to build a new kind of relational infrastructure: to stitch together diverse networks and change the flow of information and resources in a way that reorganizes problems and cultivates a shared way of seeing.