How to anticipate technological shifts in power
A guide to interrogating the technological systems that shape power
Overview
Technologies can shift who has power. Lewis Mumford, a historian and philosopher of technology, divides technologies into “authoritarian technics” and “democratic technics.”
Democratic technologies are “human-centered, relatively weak, but resourceful and durable.” These are tools; they are used by people in service of their objectives. The objectives might still be really bad — a hammer can be used as a weapon — so it’s the user’s objective that matters.
Authoritarian technologies are “system-centered, immensely powerful, but inherently unstable.” These are technologies that become a system, one that acts on people rather than the other way around: power shifts from the people to the system.
Technologies can empower people, putting the objectives of their users first. Or they can shift power from the individual user to the broader technical system. Take Robert Moses’ overpasses again, from “How to critically analyze technology.” The story isn’t simply that Moses embedded his politics in the overpasses; together, the overpasses became a system that regulated behavior. As I referenced in “Technologies encode politics,” Langdon Winner referred to this as “specific ways of organizing power and authority.” Anticipating technological shifts in power, then, starts by analyzing how technologies become systems that shape power.
E.G.
Over time, many ‘tools’ have become ‘systems.’ Why’s that? Well, code is its own kind of regulatory system, or as Lawrence Lessig put it, “code is law.” Code regulates our behavior: it constrains the decisions we make, shapes the decisions made about us, and determines who gets to make them. This is clear when we examine the tools that have historically been used to classify people. Before the internet, we had surveys. Now, the stuff you click on determines ‘who you are’ for an abstract group of entities. But both advanced the same broader system: an architecture of classification. In “Who are you?” I wrote:
“Google uses our data to classify our gender, our needs, and our values. But they aren’t actually ‘ours’ in any meaningful sense — Google owns the data and they don’t care how we self-identify. These classifications are used to computationally calculate a profile with a bunch of different categories that they sell to advertisers.”
So, essentially, the classification system that Google perpetuates makes self-identification irrelevant; instead, it groups people into marketing categories without their knowledge. Furthermore, both surveys and predictive models are systems that codify judgments made in the past: a survey check-box or an audience-segmentation tool does not ‘imagine’ how people might want to be classified; it simply creates categories from past and present information.
Users are especially beholden to algorithmic systems of classification: with a state-run survey, there is at least an opportunity to question why certain check-box classifications were included or excluded. Online, there is no deliberative process for deciding how people should or shouldn’t be classified. Our algorithmically determined advertising profiles are tweaked and updated daily by systems designed by engineers who are unaccountable to the public. This move from state-run surveys to algorithmic classification shifted power from the public to engineers working for private companies.
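To make that shift concrete, here is a minimal, hypothetical sketch (in Python) of what an algorithmic classification system looks like from the inside. None of this is Google’s actual code; the segment names, keywords, and thresholds are invented for illustration. But the structure shows why self-identification never enters the picture: the only categories that can exist are the ones engineers wrote into the code, and membership is computed from past behavior alone.

```python
# A deliberately toy sketch of "code is law" in an ad-classification system.
# Every segment name, keyword, and threshold here is hypothetical, invented
# for illustration; it is not taken from any real platform.

from collections import Counter

# The categories are fixed in code by engineers. People never see this list,
# and there is no field for how a person would describe themselves.
SEGMENT_RULES = {
    "new_parent":      {"stroller", "crib", "baby_monitor"},
    "fitness_buyer":   {"running_shoes", "protein_powder", "gym_membership"},
    "budget_traveler": {"flight_deals", "hostel", "travel_insurance"},
}

def classify(click_history, min_hits=2):
    """Assign marketing segments purely from past clicks.

    The function can only ever reproduce the categories written into
    SEGMENT_RULES; it cannot 'imagine' a category a person might choose
    for themselves.
    """
    clicks = Counter(click_history)
    segments = []
    for segment, keywords in SEGMENT_RULES.items():
        hits = sum(clicks[k] for k in keywords)
        if hits >= min_hits:
            segments.append(segment)
    return segments or ["general_audience"]

# No survey, no consent, no self-identification: the profile is whatever
# the code decides, based entirely on past behavior.
profile = classify(["stroller", "crib", "flight_deals", "stroller"])
print(profile)  # ['new_parent']
```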
Resources
Approaches
Think change, not addition
Technologies aren’t additive; they alter existing systems. They change patterns of behavior and relationships within a system and, in turn, shift power. Before you can anticipate how a technology will shift power within a system, you first have to map that system. If you haven’t yet, review the “Stakeholder & Power Mapping” approaches in “How to analyze the frames and metaphors that hide power in AI.”
Once you’ve mapped the stakeholders and power dynamics in your system, you can anticipate how new technologies might change them; a sketch of these questions as a reusable worksheet follows the list below:
Behavior: How might the technology alter patterns of behavior among system stakeholders?
Relationships: How might it alter patterns of relationships among system stakeholders?
Culture: How might it alter normative patterns among system stakeholders?
System Dynamics: How might it alter what is dynamically changing within my system?
Power: How might it alter patterns of power between system stakeholders?
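If it helps to make this exercise repeatable, here is one way to capture those five questions as a simple worksheet. It is only a sketch; the structure, function name, and example inputs are my own rather than anything prescribed.

```python
# A minimal sketch of the five questions above as a reusable worksheet.
# The structure, function name, and example inputs are my own; adapt them
# to your own stakeholder and power map.

LENSES = {
    "Behavior":        "How might the technology alter patterns of behavior among system stakeholders?",
    "Relationships":   "How might it alter patterns of relationships among system stakeholders?",
    "Culture":         "How might it alter normative patterns among system stakeholders?",
    "System Dynamics": "How might it alter what is dynamically changing within my system?",
    "Power":           "How might it alter patterns of power between system stakeholders?",
}

def worksheet(technology, system, stakeholders):
    """Print a blank worksheet for one technology within one mapped system."""
    print(f"Technology: {technology}")
    print(f"System: {system}")
    print(f"Stakeholders: {', '.join(stakeholders)}")
    for lens, question in LENSES.items():
        print(f"\n[{lens}] {question}")
        print("  Notes: ...")

# Hypothetical example, reusing the advertising case from earlier:
worksheet(
    technology="algorithmic ad classification",
    system="online advertising",
    stakeholders=["users", "advertisers", "platform engineers", "regulators"],
)
```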
Okay, that’s it for now,
Charley