🤯 Technically Social
A special issue! What actually IS technology? How does it shape social systems?
Hi there, and welcome back to Untangled, a newsletter and podcast about technology, people, and power. April was busy busy:
I analyzed a practical proposal designed to address platform governance at scale: citizen assemblies! I then created an audio version of the essay.
I shared the first major update to the Primer. As a reminder, the Primer is a special issue that consists of the frameworks and concepts from past essays, organized nicely into themes. Within each theme, I offer further resources if you want to dive deeper, and I recommend a daily action for you to make these ideas more practical.
In a critique of tech-criticism, I applied the concept of “criti-hype” to the public discourse on AI, and then offered a list of my favorite newsletters that transcend this trap.
This month I have something different for you — a new special issue called “Technically Social,” which will live under the newly created ‘Special Issues’ tab. Let me know what you think. If you find it valuable, consider subscribing to the paid edition of Untangled.
Now, on to the show.
Every time I write the word ‘technology’ there’s a li’l voice in my head that laments the oversimplification: “what are you even talking about when you say ‘technology’?” Welcome to my brain 🙃
We take what we mean by ‘technology’ for granted. In this special issue of Untangled, I want to ask the uncomfortable question: uh, what even is ‘technology’? What follows isn’t close to comprehensive, but it’s a good start for anyone who wants to understand what technology is and the mechanisms by which it is entangled in social systems.
This special issue is somewhat similar to the Primer I wrote in December 2022, in that it is structured as a list of themes and ideas. That primer focuses on how social systems (e.g. race, gender, power, culture, narratives, etc.) shape technology, whereas in this issue — Technically Social — I will focus on how technology shapes social systems: the materiality of the technology, how it is designed and developed, what it affords, whether it is a tool or arranged in a system, and how its effects are ecological.
Like the Primer, I’ll continue to add to this list over time. Anyone will be able to access it for the month of May, after which this document — and subsequent iterations — will only be for paid subscribers. Let’s dig in.
1. Technology evolves in combination with other technologies
One answer to the question, ‘uh, what even is technology’ comes from Brian Arthur in his book, ‘The Nature of Technology: What It Is and How It Evolves.’ Basically, Arthur argues that technologies are created out of existing technologies. This might sound a li’l recursive, but as Arthur explains:
“Any solution to a human need — any novel means to a purpose — can only be made manifest in the physical world using methods and components that already exist in the world. Novel technologies are therefore brought into being — made possible — from some set of existing ones. The jet engine could not have existed without compressors and gas turbines, and without machine tools to manufacture these with the precision required.”
My essay ‘Are blockchains immutable?’ gets closest to this idea. Blockchain technology wasn’t created out of thin air — it evolved in combination with myriad other technologies, like cryptography, hashing algorithms, and distributed ledgers. It’s in part the intrinsic material qualities of the technology — alongside affordances and values (more on that in a minute) — that shape this combination. But we often abstract away from the materiality of the technology, getting lost instead in the narratives and hype around what technology might be able to do one day. So interrogating the material qualities of the technology — what are they, where do they come from, what resources do they require, what other technologies do they depend upon — can actually make us less susceptible to unhinged narratives and “criti-hype.”
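If it helps to see ‘combination’ concretely, here’s a minimal sketch, in Python and purely illustrative (this is not any real chain’s implementation), of how a blockchain’s core data structure is just an arrangement of pre-existing components: a standard-library hash function, plain dictionaries, timestamps, and a list.

```python
# A toy hash chain: every ingredient (SHA-256 hashing, serialization,
# timestamps, lists) predates blockchains; the novelty is the combination.
import hashlib
import json
import time

def make_block(data: str, prev_hash: str) -> dict:
    """Combine existing components into a 'block' that commits to its predecessor."""
    block = {"data": data, "prev_hash": prev_hash, "timestamp": time.time()}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

chain = [make_block("genesis", prev_hash="0" * 64)]
chain.append(make_block("second block", prev_hash=chain[-1]["hash"]))

# Tampering with an earlier block breaks every hash that follows it,
# which is where the chain's much-hyped 'immutability' comes from.
print(chain[1]["prev_hash"] == chain[0]["hash"])  # prints True
```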
Anyway, Arthur’s observations provide some explanation of how technologies evolve out of other ones — which gives us a sense of ‘supply.’ But what about demand? Well, ‘demand’ isn’t quite the right term, since some technologies (e.g. penicillin) come into being not because they met a specific demand but because they represented “niches they could usefully occupy.” To capture both ideas, Arthur uses the term “opportunity niches,” which reflects both human needs and the needs of technologies themselves; it’s not just that technologies are created to meet a specific need, but that those technologies call forth additional technologies. As Arthur writes, “every technology requires supporting technologies: to manufacture it, organize for its production and distribution, maintain it, and enhance its performance. And these in turn require their own supporting technologies.” And there it is — technologies evolving out of other ones.
Moreover, some technologies are extra special; they become building blocks upon which other technologies are created. Neural networks, for example, are used in 3D printing, robotics, cloud services, battery technology, programmable biology, and therapies. This starts to get at what Arthur calls “combinatorial evolution,” or the idea that technologies evolve by combining with other technologies.
2. Technology shapes societal values. But technology itself is shaped by the values and beliefs of its designers. Um, what’s going on?
How technologies combine and evolve is informed by the values and beliefs of their creators and how those interact with the values in a given society. Stepping back, one theme of Untangled is that the beliefs and values of engineers and designers become encoded in the technologies themselves. Right, technologies aren’t neutral — they don’t originate out of thin air. They are designed by someone, somewhere, whose beliefs and normative commitments subtly shape their development. For example, scholars like Ruha Benjamin and Safiya Noble have shown again and again how engineers and designers encode racism in technical systems.
You might be wondering, “uh, but this special issue is about how technologies shape social systems, not the other way around.” True! But these value-laden technologies are then let loose on the world, where they proceed to shape social values, impact people, and call forth aligned technologies:
Technologies create self-propagating values: Since those engineering and designing the technical systems tend to reflect dominant societal views, we often create a self-propagating feedback loop wherein the values or beliefs encoded in the technology are then reaffirmed by values and beliefs out in the world. And around and around we go! This isn’t always the case — conflicting societal values can dampen the effect of the values encoded in the technology, and vice versa — but breaking the loop ultimately depends on seeing the technology as anything but neutral, evolving our values, and ensuring that more people from minoritized social positions are doing the engineering.
The impacts of technologies are the most important part: While the intent behind these values matters, impact matters more. Even if a designer wasn’t malicious, and was merely making decisions subconsciously rooted in discrimination, the discriminatory impact of their technology is still very real and should not be minimized. Even if Robert Moses wasn’t a total racist, the low overpasses he built on New York’s parkways still prevented racial minorities and low-income groups — who depended on buses that couldn’t pass beneath them — from visiting Jones Beach State Park.
While I think Arthur is correct in saying that technologies evolve in combination with other technologies, I think we can go further: technologies tend to evolve in combination with technologies that are values-aligned. For example, it’s not just that permissionless blockchains evolved in combination with cryptography. It’s that blockchain technology encoded a cyber-libertarian ethos that wanted to resist censorship, remove centralized authority, and put the individual front and center — and the values and affordances of cryptography fit with it like a puzzle piece.
3. Technology shapes and constrains human agency
Speaking of affordances, technologies have them in spades. Technologies shape and constrain people — the way our bodies are read and represented, where we focus (or don’t!) our attention, the actions we can (or can’t!) take, the decisions we make, etc. You can understand the comparison I offered between Google Search and ChatGPT-based search as a comparison of affordances — they nurture different kinds of behaviors:
“How we understand search is intertwined — some might say ‘entangled’ — with the technologies of the day and the social behaviors they encourage. One big issue with chatbots like ChatGPT is that it encourages the acceptance of a direct answer. Whereas Google might offer a mix of high-quality and clickbaity information sources from which to choose, ChatGPT generates a single plausible response with an air of authority.”
If you start thinking about how different technologies alter your behavior, you’ll start to see affordances everywhere, hiding in plain sight.
One of my favorite newsletter writers has actually outlined a whole bunch of them in a recent post; each question can be read as revealing a kind of technological affordance. But basically, we can collapse most affordances into two categories: technologies that extend our capabilities and those that restrict them. Here is Ivan Illich, Catholic priest, philosopher, and social critic, making a variation on this point in 1973:
“There are two ranges in the growth of tools: the range within which machines are used to extend human capability and the range in which they are used to contract, eliminate, or replace human functions. In the first, man as an individual can exercise authority on his own behalf and therefore assume responsibility. In the second, the machine takes over—first reducing the range of choice and motivation in both the operator and the client, and second imposing its own logic and demand on both.”
Affordances matter, but to say that ‘technology shapes and constrains’ is too deterministic — it makes it sound like the effects of the technology are baked into the technology itself. As I wrote in ‘Are blockchains immutable?’, the internet of today is nothing at all like the internet of the 1990s, which emphasized affordances like anonymity and ephemerality. That shift wasn’t a technological inevitability; it was an accomplishment of companies.
Nevertheless, it’s important to understand the subtle ways in which technologies nudge and nurture certain behaviors — especially because these days, it feels a li’l less subtle and nudge-y, and more controlling. That leads us to the distinction between technologies as tools and technologies as systems.
4. Technology is a tool or a system
The extent to which technologies shape and constrain our agency depends on whether they are a tool or a system. Lewis Mumford, a historian and philosopher of technology, divides technologies into “authoritarian technics” and “democratic technics.” Here’s a brief breakdown:
Democratic technologies are “man-centered, relatively weak, but resourceful and durable.” These are tools; they are used by people in service of their objectives. The objectives might still be real bad — a hammer can be used as a weapon — so it’s the user’s objective that matters.
Authoritarian technologies are “system-centered, immensely powerful, but inherently unstable.” These are technologies that become a system — which then acts on people, rather than the other way around: power shifts from the people, to the system.
Take Moses’ overpasses again. The story isn’t simply that Moses embedded his politics into the overpasses, but that together, the overpasses became a system that regulated behavior. As I referenced in ‘Technologies encode politics,’ Langdon Winner referred to this as “specific ways of organizing power and authority.”
Over time, many ‘tools’ have become ‘systems’. Why’s that? Well, code is its own kind of regulatory system, or as Lawrence Lessig put it, “code is law.” Lessig expands on this foundational point in his book, Code and Other Laws of Cyberspace, writing,
“In real space, we must recognize how laws regulate — through constitutions, statutes, and other legal codes. In cyberspace, we must understand how a different ‘code’ regulates — how the software and hardware (i.e. the ‘code’ of cyberspace) that make cyberspace what it is, this code is cyberspace’s ‘law’.”
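To make that concrete, here is a minimal sketch — with entirely hypothetical rules and names, not Lessig’s example or any real platform’s code — of how a few constants in software can regulate behavior as firmly as any statute:

```python
# A toy illustration of "code is law": these constants aren't statutes,
# but they regulate behavior as effectively as any rule on the books.
MAX_POST_LENGTH = 280        # hypothetical limit: longer speech simply can't exist
MIN_ACCOUNT_AGE_DAYS = 7     # hypothetical rule: newcomers can't post at all

def can_publish(post: str, account_age_days: int) -> bool:
    """The 'legal code' of this imaginary platform, enforced automatically,
    with no judge, no appeal, and no deliberative process behind it."""
    if account_age_days < MIN_ACCOUNT_AGE_DAYS:
        return False
    return len(post) <= MAX_POST_LENGTH

# There is no court to petition: a 281-character thought cannot be published here.
print(can_publish("a" * 281, account_age_days=30))  # prints False
```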
Code regulates our behavior, constraining the decisions we make, shaping the decisions made about us, and determining who gets to make them. This is clear when we examine the tools that have historically been used to classify people. Before the internet, we had surveys. Now, the stuff you click on determines ‘who you are’ for an abstract group of entities. But both are iterations of the same broader system, or architecture, of classification. In ‘Who are you?’ I wrote:
“Google uses our data to classify our gender, our needs, and our values. But they aren’t actually ‘ours’ in any meaningful sense — Google owns the data and they don’t care how we self-identify. These classifications are used to computationally calculate a profile with a bunch of different categories that they sell to advertisers.”
So essentially, the classification system that Google perpetuates makes self-identification irrelevant, and instead groups people into marketing categories without their knowledge. Furthermore, both surveys and predictive models are systems that codify judgments made in the past: a survey check-box or audience segmentation tool does not ‘imagine’ how people might want to be classified; it simply creates categories based on past and present information.
Users are especially beholden to algorithmic systems of classification: at least with a state-run survey, there is an opportunity to question why certain check-box classifications were included or excluded. But there is no deliberative process to decide how people should or shouldn’t be classified online. Our algorithmically determined advertising profiles are tweaked and updated daily, facilitated by systems designed by engineers who are unaccountable to the public.
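To see what that looks like mechanically, here’s a minimal sketch; the categories, thresholds, and names are all hypothetical (this is not Google’s actual pipeline). The point is structural: the segments are hard-coded judgments about past behavior, and the person being classified never gets a say.

```python
# A hypothetical audience-segmentation sketch: the categories are fixed,
# backward-looking judgments, and self-identification never enters the picture.
from collections import Counter

SEGMENTS = {  # invented labels standing in for an ad platform's taxonomy
    "sports": "Sports Enthusiast",
    "finance": "High-Intent Investor",
    "parenting": "New Parent",
}

def classify(click_topics: list[str]) -> list[str]:
    """Assign marketing segments from click history alone. Note what's
    missing: any way for the person to say who they actually are."""
    counts = Counter(click_topics)
    return [label for topic, label in SEGMENTS.items() if counts[topic] >= 3]

clicks = ["sports", "sports", "finance", "sports", "parenting"]
print(classify(clicks))  # prints ['Sports Enthusiast'], a profile no one chose
```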
We hardly ever reflect on the ways in which we are classified. That’s because technological systems fade into the background, often becoming invisible to us — but they’re never not acting on us. Indeed, technology is ecological, rearranging what came before it.
5. Technology is ecological
Technology alters ecosystems. I’m not the first to make this point. Neil Postman argued long ago that “Technological change is not additive; it is ecological,” explaining that “A new medium does not add something; it changes everything.” As technologies combine and recombine with one another — per Arthur’s point above — and an opportunity niche is met, societies evolve too. Systems don’t simply remain the same with ‘more technology’ added; they become altogether different ecosystems. Postman looks to history and writes,
“In the year 1500, after the printing press was invented, you did not have old Europe plus the printing press. You had a different Europe. After television, America was not America plus television. Television gave a new coloration to every political campaign, to every home, to every school, to every church, to every industry, and so on.”
A more recent, Untangled-approved example: the U.S. with image- and video-based algorithmic recommendation systems — like those underpinning TikTok — isn’t just the U.S. plus those technologies. As I wrote in “TikTok & the production of ignorance,” the technology itself compounds the existing sense of societal speculation and uncertainty, and therefore “encourages a kind of volatile immersion. It feels a li’l chaotic but you’re nevertheless engrossed and stimulated, left to wonder hours later how a string of bizarre videos kept your attention for so long.” It’s a disorienting experience that Aris Komporozos-Athanasiou likens to “the legibility and logic of a narcoleptic dream.”
But this experience, and indeed our entire relationship to the technology, is one-way. We don’t know how the algorithmic system is determining what to show us next. As I wrote then, “It’s this gap — between what the technology helps us see and all that remains hidden — that creates an underlying sense of unease or uncertainty.” U.S. society isn’t ‘the same as it was’ plus ‘a new way to consume short-form videos’; its psycho-social ecology is different.
So too is TikTok altering the kinds of relationships we have with one another — disrupting the social graph ushered in by Facebook and earlier social media sites, and encouraging more one-sided relationships. What this new ecology looks like isn’t clear, nor is it fixed — it’s ever-evolving in combination with other technologies that nudge social dynamics and rearrange culture.
That’s it for this special issue of Untangled. Please let me know if you enjoyed this format. If you have an idea for a different format, click ‘reply,’ and send me an email. I’d love to hear from you. As always, thank you to Georgia Iacovou, my editor.