This is a moment for collective action and proactively building toward alternative futures. Tech power continues to concentrate. Unbridled technologies continue to shape our social and political lives, and exacerbate existing inequities. All the while, narratives of ‘AI innovation’ and ‘AI acceleration’ are crowding out stories of real harm, happening right now. Those with power offer only a narrow set of potential AI futures, foreclosing alternative paths.
The path from the present moment to radically different futures isn’t clear. When everything feels inevitable and the next step isn’t obvious, we need guides. Guides that help us identify hidden power dynamics in a system. Guides that help us anticipate technology’s impact on disparate communities. Guides that help us critically analyze data and technology. Guides that help us develop strategies that map backward from the future we want to create.
Today, I’m launching Untangled Guides — a new ‘how to’ offering for paid subscribers that will help you interrogate the system you’re trying to change, and align your actions, decisions, and strategies to the future you want to create. They will be short, sweet, and chock-full of actionable insights and practical approaches.
Here are a few I have in the hopper:
How to critically analyze technology
How to identify the frames and metaphors that hide power
How to anticipate technological shifts in power
How to anticipate technology’s impact on different communities
How to take interdependent (not independent!) action
How to think and act ecologically
How to anticipate different community uses of technology
How to anticipate technology’s impact on culture
How to avoid the ‘scale’ trap
How to avoid the ‘efficiency’ trap
How to avoid the ‘transparency’ trap
How to envision alternative futures
How to map and measure backwards, not forwards
How to map complex systems and anticipate emergent dynamics
How to set directional goals, not targets and outcomes
The first guide launches today: How to critically analyze data. Let’s get started, shall we?
Overview
Data are made by you and me. We interact and transact with one another. We do things out in the world. We engage with institutions. We click and scroll online. Right, data are never raw, nor are they neutral or objective. They’re descriptive and diagnostic of the practices and behaviors of people interacting with one another and with institutions at a moment in time. To put a fine point on it: data aren’t predictive; they are socially constructed and historically contingent.
This matters because when we accept the premise that historical data can somehow predict the future — and all AI systems accept such a premise — we encode historical social biases in the present, and bring them with us into the future. We become trapped in a feedback loop of our own making, exacerbating existing inequities along the way.
I started with this Guide because step one of imagining an alternative future is reconsidering what data can tell us, and what they can’t.
Example
Take policing in the U.S. So-called ‘predictive-policing’ tools rely on arrest data and other ‘risk scores’ to train the underlying algorithm. But arrest data do not denote crime. Arrest data might instead suggest that a community is being over-policed. Right, the data aren’t ‘predictive’; they’re descriptive and diagnostic of the practices and behaviors of police departments, and of how they interact with communities.
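To see how this loop compounds, here’s a minimal toy sketch, not any real department’s system: the neighborhood names, rates, and patrol-allocation rule are all invented for illustration. Two neighborhoods have the exact same underlying incident rate, but one starts with a larger arrest history, and patrols go wherever past arrests are highest.

```python
# Toy model of a predictive-policing feedback loop (illustrative only:
# invented numbers, not any real department's system). Neighborhoods A
# and B have the SAME true incident rate, but A starts with a larger
# arrest history. Patrols go wherever past arrests are highest; more
# patrols mean more recorded arrests, which makes A look even 'riskier'.

TRUE_INCIDENT_RATE = 0.05   # identical in both neighborhoods
PATROLS_PER_DAY = 100

arrest_history = {"A": 60, "B": 40}  # A's head start is the only difference

for day in range(1, 11):
    # The 'prediction' is just past arrests: patrol the top-scoring area.
    target = max(arrest_history, key=arrest_history.get)
    # Recorded arrests scale with patrol presence, not with true crime,
    # which never changes in either neighborhood.
    arrest_history[target] += PATROLS_PER_DAY * TRUE_INCIDENT_RATE
    total = sum(arrest_history.values())
    shares = {hood: f"{count / total:.0%}" for hood, count in arrest_history.items()}
    print(f"day {day:2d}: predicted-risk shares {shares}")
```

In this sketch, A’s apparent share of ‘crime’ climbs day after day while the underlying reality never changes: the data confirm the deployment, and the deployment manufactures the data.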
Moreover, people can be considered ‘risky’ even if they have never been arrested before. For example, the Chicago Police Department developed something called the ‘Strategic Subject List’ to algorithmically predict the likelihood that an individual is at risk of becoming a victim or an offender in a shooting or homicide. But an evaluation of the tool found that more than one-third of the individuals on the list had never been arrested or been the victim of a crime, and almost 70% of that cohort received a high-risk score. In other words, in an attempt to ‘predict risk,’ the tool actually manufactured it by encouraging an encounter with the police.
At the same time, some interactions don’t become data at all. One thing that’s weird about these tools is that they don’t include data on predominantly white-collar crimes. Historically, while social scientists conflated blackness with criminality, white criminality was explained away as the product of structural inequities and poverty. As Khalil Gibran Muhammad documents in The Condemnation of Blackness, in the Progressive era, researchers described whites and immigrants as “a great army of unfortunates” driven “to madness, crime, or suicide” by an inequitable economic system and class oppression. As a result, researchers and reformers advocated for economic solutions (e.g. higher-paying jobs, brand-new interstate highways leading to white suburbs, etc.) to the problem of white crime. No need to predict it if it’s not considered to be a ‘real crime’!
One can see this legacy today. For example, tax evasion and wage theft data aren’t often included in ‘crime statistics’ because we don’t consider these white-collar crimes part of ‘the problem’ of crime itself. The point is, data don’t speak for themselves: what gets recorded as ‘data’ isn’t self-evident or objective; it’s the output of a contestation over beliefs, norms, and power.
Approaches
Untangle data - Interrogate how data are made, and how they’re situated in a specific social and cultural context. Ask questions like the following (a sketch of turning these questions into a reusable audit template follows the list):
What transactions (interactions, behaviors, actions, etc.) do the data map back to?
How might the recorded data represent the resolution of a conflict or contestation over what should and shouldn’t count?
What, in the relevant transaction, didn’t become data? What wasn’t counted?
How might those transactions be shaped by societal systems like race, gender, and power?
How might those transactions be shaped by policies, institutional practices, and economic incentives?
How might those transactions be shaped by an individual’s context?
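If it helps to make these questions concrete, here’s one minimal sketch of a reusable audit template. The field names and the example entries are mine, invented for illustration; this isn’t a standard schema, just one way to force yourself to answer each question for every field in a dataset.

```python
from dataclasses import dataclass, field

# A minimal data-audit template (field names invented for illustration;
# not a standard schema) that turns the questions above into a record
# you fill out for each field in a dataset you're analyzing.

@dataclass
class DataFieldAudit:
    field_name: str
    underlying_transaction: str      # what interaction or behavior does it map to?
    contested_definitions: str       # what conflict over counting does it resolve?
    what_was_not_counted: str        # what never became data?
    shaping_systems: list[str] = field(default_factory=list)   # race, gender, power...
    shaping_policies: list[str] = field(default_factory=list)  # policies, incentives
    individual_context: str = ""     # how might a person's situation shape it?

# Example: auditing an 'arrests' column, drawing on the policing example above.
arrests_audit = DataFieldAudit(
    field_name="arrests",
    underlying_transaction="an encounter between police and a resident",
    contested_definitions="what counts as 'crime' (wage theft, tax evasion largely excluded)",
    what_was_not_counted="crimes occurring where police don't patrol; white-collar offenses",
    shaping_systems=["race", "class"],
    shaping_policies=["patrol-allocation decisions", "arrest incentives"],
    individual_context="prior contact with police raises the odds of being stopped again",
)

print(arrests_audit)
```

Filling out a record like this for every field won’t make the data neutral, but it does surface, in one place, the choices and power dynamics baked into each column.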
Enjoy the first Guide? Send me a quick note and let me know.