How hot is your data?
'Hallucinations,' AI winters, and 'narcissistic-sociopathic' tech studies.
✨Weekly Reading
OpenAI now agrees (!) that ‘hallucinations’ are a feature of LLMs, not a bug. I made that argument in a 2023 essay titled “AI isn’t ‘hallucinating.’ We are.” I got a lot of blowback at the time — that I wasn’t ‘technical enough’ to get it — so I suppose this is my moment to be righteous on the internet. Kidding. The point is, the frames we use matter — they create shared meaning, shape how we talk about a problem and what we do about it, and hide power. Read the full essay to understand why ‘hallucination’ has always been a problematic frame.
Jeremy Khan explores past ‘AI winters’ to determine whether we’re headed toward one now. The upshot? Maybe! The key factors to watch: research highlighting the limitations of a particular method or technique, real-world applications failing to meet expectations, and frustration among funders at the pace of development. From my perspective, we’re seeing signs of the first two — but investors aren’t (yet) frustrated, because they’ve bought the scale-at-all-costs narrative of AI development.
STS scholar Lee Vinsel wrote a much-needed essay with the punchy title “Against Narcissistic-Sociopathic Technology Studies, or Why Do People USE Technologies?” Vinsel challenges scholars to be curious about why and how people use generative AI. This doesn’t mean researchers should stop being critical — it means they need to look beyond their own judgments and study, first and foremost, how and why people use these technologies. As Vinsel writes, “With narcissism, we can only see the world through our own perspective; with sociopathy, we come to believe that we know what is going on in other people's minds better than they do without asking or otherwise studying them.” I’m going to have Lee on the podcast to talk about it — stay tuned!
In a great essay, Aarn Wennekers argues that “systems thinking” and “complexity thinking” are too often untethered from strategy, operations, and decision-making. I couldn’t agree more! I find the frameworks incredibly powerful yet far too abstract to put into practice on their own. That’s why I created an interactive course and a 100-page workbook to help people apply the frameworks in their working context. Cohort 4 is launching soon. Join me?
How hot is your data?

Last week, I wrote about how demographics, statistics, and surveys strip data of context and relationships, often obscuring the interactions that produced them. So what might it look like to recenter data in context and systems? One answer comes from Nora Bateson’s idea of “warm data.”
Bateson defines warm data as “trans-contextual information about the interrelationships that integrate a complex system.” Put simply: instead of isolating variables, warm data asks us to see how things connect, and why. Basically, the scientific method says ‘Hey, let’s break down the world into isolated, measurable variables, study them, and assume that those parts explain the whole.’ But people who think in complex systems might retort ‘Hold up, you can’t separate data from its context, a lot of things that aren’t measurable really matter, and the whole is often different from the sum of its parts.’ Traditional data methods aren’t wrong — they’re essential in many fields — but by focusing on what is easily measured, they risk missing what really matters in complex systems: relationships, interdependencies, and feedbacks.
Take predictive policing. As I’ve written about in the past, algorithms trained on arrest data treat those records as if they were indicators of crime. But arrest data don’t tell us where crime happens — they tell us where police arrest people. An arrest could represent a crime, but it could also reflect a community that is being overpoliced. Warm data would center the context and dynamics that contributed to the interaction that created the data. For example, we know that in Ferguson, Missouri, policing was shaped by institutional incentives: as the Department of Justice revealed, city officials explicitly pushed police to boost municipal revenue through fines and fees. From 2011 to 2012, revenue from those fines increased by 33%. Arrest data, then, reflected the feedback loop between finance and policing that disproportionately harmed citizens.
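To make that feedback loop concrete, here is a deliberately toy sketch. The district names, crime rates, and counts are invented for illustration, and this is not how any real predictive-policing system is built; it just shows what happens when patrols follow past arrests and new arrests follow patrols.

```python
# Toy sketch of the arrest-data feedback loop (all numbers are illustrative assumptions).
# Patrols are allocated in proportion to past arrests; new arrests scale with patrol
# presence. Both districts have the same underlying crime rate, yet the arrest record
# keeps pointing at District A, because it logs where police went, not where crime is.

TRUE_CRIME_RATE = {"District A": 0.10, "District B": 0.10}  # identical by assumption
arrests = {"District A": 60, "District B": 40}              # unequal historical records
TOTAL_PATROLS = 100

for year in range(1, 6):
    last_total = sum(arrests.values())
    for district in arrests:
        # "Predictive" allocation: patrols follow last period's arrest counts
        patrols = TOTAL_PATROLS * arrests[district] / last_total
        # New arrests depend on patrol presence as much as on underlying crime
        arrests[district] += patrols * TRUE_CRIME_RATE[district]
    share_a = arrests["District A"] / sum(arrests.values())
    print(f"Year {year}: District A's share of recorded arrests = {share_a:.0%}")
```

In this sketch, District A’s share of recorded arrests stays locked at 60% year after year, even though both districts commit crime at exactly the same rate. The data never corrects itself, which is precisely why arrest counts are a record of policing rather than of crime.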
Warm data would push us to look beyond the numbers — to ask how incentives, institutional structures, and social dynamics shape the interactions that produce data in the first place. That’s not just an analytical flourish; it’s essential to making better policy and designing better programs. If we miss those dynamics, interventions risk reinforcing the very problems they’re meant to solve.
Bateson outlines a few practical ways into this work:
“Multiple description”: look from multiple perspectives to understand context and interdependency.
“Pattern recognition”: compare patterns across systems to highlight hidden roles of context.
“Zooming in and out”: toggle between reductionist analysis and holistic synthesis.
Warm data isn’t about rejecting quantitative analysis or abandoning comparability — institutions still need those. It’s about holding them alongside contextual, relational insight so we can design smarter interventions. Without that, we risk making decisions based on abstractions that ignore the systems producing the data in the first place.
“The real voyage of discovery consists not in seeing new sights, but in looking with new eyes” - Marcel Proust