How to anticipate technology’s impact on different communities
PLUS: Untangled topped 9K subscribers & I co-authored an essay for Tech Policy Press.

Overview
How a person experiences a new technology depends on their position in society. Indeed, technologies tend to disproportionately harm those at the margins. This is exacerbated by the propensity to design for scale and, in turn, for the median user, rather than leaving space for contextually sensitive technologies that account for a range of societal positions. Or, heaven forbid, designing products and governing systems from the perspectives of those on the margins!
This guide will help you anticipate technology’s impact on different communities and includes:
An example that illustrates the idea.
Additional resources to dive deeper.
Practical approaches to apply the Guide in your working context.
E.G.
Blockchains record your transactions on a public ledger. But those transactions aren’t linked to your identity; they are linked to a pseudonymous alphanumeric address. Anyone can see your transactions, but they don’t know that it is you buying those medications. That’s the idea of pseudonymity.
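The mechanics can be seen in a minimal Python sketch. This is an illustration, not how any particular blockchain actually works: the names (`alice-key`, `pharmacy-key`) are made up, and real chains layer more steps (e.g. additional hashes and checksums) onto address derivation.

```python
import hashlib

def derive_address(public_key: bytes) -> str:
    # Hash the public key to get a pseudonymous address
    # (simplified; real chains add further hashing and a checksum).
    return hashlib.sha256(public_key).hexdigest()[:40]

# The public ledger: everyone can see addresses and amounts...
ledger = [
    {"from": derive_address(b"alice-key"),
     "to": derive_address(b"pharmacy-key"),
     "amount": 25},
]

# ...but nothing on the ledger says "Alice". Linking an address back
# to a person takes outside information (an exchange record, a reused
# address, a data leak) — which is why this is pseudonymity, not anonymity.
for tx in ledger:
    print(tx["from"], "->", tx["to"], tx["amount"])
```

The point of the sketch: the ledger entries are fully visible, yet the identity behind each address lives outside the system, and whoever can connect the two controls when pseudonymity ends.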
In ‘Is pseudonymity the answer?’ I drew upon the work of Michèle Lamont and Alice Marwick to argue that whether pseudonymity protects your safety or your power depends on your position in society. When pseudonymity is taken away, it’s worth asking whether the added visibility reveals your power or makes you vulnerable. For a white guy who writes a newsletter, visibility is a good thing. But that fact reveals my societal privilege. For many, visibility means vulnerability or precarity.
Resources
Approaches
Center the margins. Since technologies are designed for scale and the median user, those on the margins are often disproportionately harmed, yet they also offer unique perspectives. Ask yourself questions like:
What groups have historically been disproportionately harmed or excluded by the sociotechnical system?
How might you center the perspective of marginalized groups and those disproportionately harmed in the design, deployment, and governance of the sociotechnical intervention?
In such an approach, members of marginalized communities would lead the design and governance process. They would make decisions about how their data is governed, which technologies are used and banned, how abuse is handled, and how their identity is captured.
Okay, that’s it for now,
Charley