How to critically analyze technology
A guide to anticipating the effects of AI and other emerging technologies
Guides are a new ‘how to’ offering for paid subscribers that will help you interrogate the system you’re trying to change, and align your actions, decisions, and strategies to the future you want to create. Each Guide includes an overview, an example to bring the Guide to life, resources to dive deeper, and approaches for applying the Guide in your working context.
Guide #1: How to critically analyze data. If you want to understand AI — what it is, what it can say, and what it can’t — and anticipate its impacts, then you need to understand data, and how they are made.
On to Guide #2: How to critically analyze technology.
Overview
Technologies are shaped by social systems like race and gender, and those technologies in turn shape social systems. This is what’s meant by ‘sociotechnical’ — that we can’t view social and technical systems separately. Critically analyzing technology, then, means analyzing the dynamics at the center of this entanglement, and the stakeholders implicated in creating and changing them.
The way technologies shape people is through ‘affordances.’ You can think of an affordance as the way a technology alters our experience of it -- how it shapes the way our bodies are represented, where we focus (or don’t!) our attention, the actions we can (or can’t!) take, the decisions we make, and so on. Spend a minute listing all of the ways ChatGPT or an iPhone alters your experience of it (e.g. app colors are known to be addictive) and then map backward to consider why each affordance might exist. The quick (and often correct) answer to this exercise is financial incentives, but that’s not always the case.
Technologies are designed by someone, somewhere, whose beliefs, normative commitments, and operating contexts subtly shape their development. This matters because how technologies are designed, and their effect on the world, isn’t inevitable or pre-determined. For example, the internet of today is nothing like the internet of the 1990s, which emphasized affordances like anonymity and ephemerality. That shift wasn’t a technological inevitability but an accomplishment of companies like Google, Facebook, and Amazon.
Building alternative futures starts by rejecting the notion that technology is neutral or inevitable, and instead recognizing that technologies are shaped by social systems, values, incentives, and organizational contexts, and can be made anew.
E.G.
In “Do Artifacts Have Politics?,” scholar Langdon Winner tells the story of Robert Moses, an infamous urban planner in New York from the 1920s to the 1970s. Moses built the parkway overpasses on Long Island extraordinarily low to prevent public buses, which were more likely to be used by racial minorities and low-income groups, from reaching Jones Beach State Park. The details of this story have since been challenged, but Winner’s overall point remains: Moses built his beliefs into the overpasses. This isn’t just a phenomenon in the physical world -- scholars like Ruha Benjamin and Safiya Noble have shown how engineers and designers often encode racist values in technical systems.
Resources
Approaches
Untangle technology - Interrogate how technology is made and situated in a specific social, cultural, and organizational context.
What are the affordances of the technology?
How might the affordances reflect dominant cultural values?
How might the affordances reflect the business model of the company or its organizational structure?
What do we know about how the technological affordances might impact people, behavior, and norms -- and what remains uncertain?
How might the affordances impact different people -- with different social positions and power -- differently?
Enjoy this Guide? Send me a quick note and let me know.