🤯 Technically Social
A special issue! What actually IS technology? How does it shape social systems?
Hi there, and welcome back to Untangled, a newsletter and podcast about technology, people, and power. April was busy busy:
I analyzed a practical proposal designed to address platform governance at scale: citizen assemblies! Then I created an audio version of the essay.
I shared the first major update to the Primer. As a reminder, the Primer is a special issue that consists of the frameworks and concepts from past essays, organized nicely into themes. Within each theme, I offer further resources if you want to dive deeper, and I recommend a daily action for you to make these ideas more practical.
In a critique of tech-criticism, I applied the concept of “criti-hype” to the public discourse on AI, and then offered a list of my favorite newsletters that transcend this trap.
This month I have something different for you — it’s a new, special issue, called “Technically Social,” which will live under the newly created ‘Special Issues’ tab. Let me know what you think. If you find it valuable, consider subscribing to the paid edition of Untangled.
Now, on to the show.
Every time I write the word ‘technology,’ there’s a li’l voice in my head that laments the oversimplification: “what are you even talking about when you say ‘technology’?” Welcome to my brain 🙃
We take what we mean by ‘technology’ for granted. In this special issue of Untangled, I want to ask the uncomfortable question: uh, what even is ‘technology’? What follows isn’t close to comprehensive but it’s a good start for anyone who wants to understand what technology is and the mechanisms by which it is entangled in social systems.
This special issue is somewhat similar to the Primer I wrote in December 2022, in that it’s structured as a list of themes or ideas. The Primer focuses on how social systems (e.g. race, gender, power, culture, narratives, etc.) shape technology, whereas in this issue — Technically Social — I focus on how technology shapes social systems: the materiality of the technology, how it is designed and developed, what it affords, whether it is a tool or arranged in a system, and how its effects are ecological.
As with the Primer, I’ll keep adding to this list over time. Anyone will be able to access it for the month of May, after which this document — and subsequent iterations — will only be for paid subscribers. Let’s dig in.
1. Technology evolves in combination with other technologies
One answer to the question, ‘uh, what even is technology’ comes from Brian Arthur in his book, ‘The Nature of Technology: What It Is and How It Evolves.’ Basically, Arthur argues that technologies are created out of existing technologies. This might sound a li’l recursive, but as Arthur explains:
“Any solution to a human need — any novel means to a purpose — can only be made manifest in the physical world using methods and components that already exist in the world. Novel technologies are therefore brought into being — made possible — from some set of existing ones. The jet engine could not have existed without compressors and gas turbines, and without machine tools to manufacture these with the precision required.”
My essay, Are blockchains immutable?, gets closest to this idea. Blockchain technology wasn’t created out of thin air — it evolved in combination with myriad other technologies, like cryptography, hashing algorithms, and distributed ledgers. It’s in part the intrinsic material qualities of the technology — alongside affordances and values (more on that in a minute) — that shape this combination. But we often abstract away from the materiality of the technology, getting lost instead in the narratives and hype around what technology might be able to do one day. So interrogating the material qualities of the technology — what are they, where do they come from, what resources do they require, what other technologies do they depend upon — can actually make us less susceptible to unhinged narratives and “criti-hype.”
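If you want to see the ‘built out of existing technologies’ point in miniature, here’s a small, purely illustrative sketch (not any real blockchain’s code) of how blocks can be chained together using an off-the-shelf cryptographic hash function, SHA-256 from Python’s standard hashlib:

```python
import hashlib
import json

def hash_block(block: dict) -> str:
    """Hash a block's contents with SHA-256, an existing, off-the-shelf technology."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """Each new block records the hash of the block before it; that is the 'chain'."""
    return {"data": data, "prev_hash": prev_hash}

# Build a tiny three-block chain.
genesis = make_block("genesis", prev_hash="0" * 64)
block_1 = make_block("second block", prev_hash=hash_block(genesis))
block_2 = make_block("third block", prev_hash=hash_block(block_1))

# Tampering with an earlier block breaks the link recorded in every later block,
# which is where the (qualified!) claim of 'immutability' comes from.
genesis["data"] = "tampered"
print(hash_block(genesis) == block_1["prev_hash"])  # False
```

The point isn’t the code; it’s that even this toy chain only exists because hashing, data serialization, and general-purpose computing already existed and could be combined.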
Anyway, Arthur’s observations provide some explanation of how technologies evolve out of other ones — which gives us a sense of ‘supply’. But what about demand? Well, ‘demand’ isn’t quite the right term, since some technologies (e.g. penicillin) come into being not because they met a specific demand but because they represented “niches they could usefully occupy.” To capture both ideas, Arthur uses the term “opportunity niches,” which reflects both human needs and the needs of technologies themselves; it’s not just that technologies are created to meet a specific need, but that those technologies call forth additional technologies. As Arthur writes, “every technology requires supporting technologies: to manufacture it, organize for its production and distribution, maintain it, and enhance its performance. And these in turn require their own supporting technologies.” And there it is — technologies evolving out of other ones.
Moreover, some technologies are extra special; they become building blocks upon which other technologies are created. Neural networks, for example, are used in 3D printing, robotics, cloud services, battery technology, programmable biology, and therapies. This starts to get at what Arthur calls “combinatorial evolution,” or the idea that technologies evolve by combining with other technologies.
2. Technology shapes societal values. But technology itself is shaped by the values and beliefs of its designers. Um, what’s going on?
How technologies combine and evolve is informed by the values and beliefs of their creators, and by how those interact with the values of a given society. Stepping back, one theme of Untangled is that the beliefs and values of engineers and designers become encoded in the technologies themselves. Right, technologies aren’t neutral — they don’t originate out of thin air. They are designed by someone, somewhere, whose beliefs and normative commitments subtly shape their development. For example, scholars like Ruha Benjamin and Safiya Noble have shown again and again how engineers and designers encode racism in technical systems.
You might be wondering, “uh, but this special issue is about how technologies shape social systems, not the other way around.” True! But these value-laden technologies are then let loose on the world, where they proceed to shape social values, impact people, and call forth aligned technologies:
Technologies create self-propagating values: Since those who engineer and design technical systems tend to reflect dominant societal views, we often get a self-propagating feedback loop wherein the values or beliefs encoded in the technology are reaffirmed by values and beliefs out in the world. And around and around we go! This isn’t always the case: conflicting societal values can dampen the effect of the values encoded in a technology, and vice versa. But breaking the loop ultimately depends on seeing the technology as anything but neutral, evolving our values, and ensuring more people from minoritized social positions are doing the engineering.
The impacts of technologies are the most important part: While the intent behind these values matters, impact matters more. Even if a designer wasn’t malicious, or was only making decisions subconsciously rooted in discrimination, the discriminatory impact of their technology was still very real and should not be minimized. Even if Robert Moses wasn’t a total racist, the low overpasses he built over New York’s parkways still kept buses, and with them racial minorities and low-income groups, from reaching Jones Beach State Park.
While I think Arthur is correct that technologies evolve in combination with other technologies, we can go further. Technologies tend to evolve in combination with technologies that are values-aligned. For example, it’s not just that permissionless blockchains evolved in combination with cryptography. It’s that blockchain technology encoded a cyberlibertarian ethos that sought to resist censorship, remove centralized authority, and put the individual front and center — and the values and affordances of cryptography fit with it like a puzzle piece.
3. Technology shapes and constrains human agency
Speaking of affordances, technologies have them in spades. Technologies shape and constrain people — the way our bodies are read and represented, where we focus (or don’t!) our attention, the actions we can (or can’t!) take, the decisions we make, etc. You can understand the comparison I offered between Google Search and ChatGPT-based search as a comparison of affordances — they nurture different kinds of behaviors:
“How we understand search is intertwined — some might say ‘entangled’ — with the technologies of the day and the social behaviors they encourage. One big issue with chatbots like ChatGPT is that it encourages the acceptance of a direct answer. Whereas Google might offer a mix of high-quality and clickbaity information sources from which to choose, ChatGPT generates a single plausible response with an air of authority.”
If you start thinking about how different technologies alter your behavior, you’ll start to see affordances everywhere, hiding in plain sight.
The author of one of my favorite newsletters has actually outlined a whole bunch of them in a post; each question can be read as revealing a kind of technological affordance.
But basically, we can collapse most affordances into two categories: technologies that extend our capabilities and those that restrict them. Here is Ivan Illich, Catholic priest, philosopher, and social critic, making a variation on this point in 1973:
“There are two ranges in the growth of tools: the range within which machines are used to extend human capability and the range in which they are used to contract, eliminate, or replace human functions. In the first, man as an individual can exercise authority on his own behalf and therefore assume responsibility. In the second, the machine takes over—first reducing the range of choice and motivation in both the operator and the client, and second imposing its own logic and demand on both.”
Affordances matter, but to say that ‘technology shapes and constrains’ is too deterministic — it makes it sound like the effects of the technology are baked into the technology itself. As I wrote in Are blockchains immutable?, the internet of today is nothing at all like the internet of the 1990s, which emphasized affordances like anonymity and ephemerality. That shift wasn’t driven by a change in the technology; it was an accomplishment of companies.
Nevertheless, it’s important to understand the subtle ways in which technologies nudge and nurture certain behaviors — especially because these days, it feels a li’l less subtle and nudge-y, and more controlling. That leads us to the distinction between technologies as tools and technologies as systems.