📻 What’s really happening when an algorithm makes a mistake.
An interview with Mike Ananny, associate professor of communications and journalism at the University of Southern California
Hi, and welcome back to the podcast edition of Untangled. This week, Microsoft announced that it has integrated ChatGPT into its Bing search engine. If you’re already an Untangled subscriber, you were prepared: just last week I wrote about why using AI chatbots as search engines would be a big ol’ problem. Don’t ever want to be caught off guard again? Subscribe to Untangled today!
One li’l benefit of being a paid subscriber is getting access to podcast episodes like this one before everyone else. It’s like going to Disneyland and skipping to the front of the line on every ride. The dream! Plus, you get access to every issue of Untangled, including special issues like the primer, as well as the full archives.
Now, on to the show.
On this episode, I talked with Mike Ananny, associate professor of communications and journalism at the University of Southern California. We talked about algorithmic errors: how we make them, what they say about our view of the world, and how we might think of them as public problems that require collective action.
The conversation is at times conceptual — e.g. we delve into why algorithms are sociotechnical phenomena, not technological things. At other times it gets weeds-y, as Ananny describes how certain algorithmic systems function in practice. But don’t let either deter you from listening to the end. The range of this conversation reflects Ananny’s unusual ability to apply a technical, social, economic, and cultural lens to any algorithmic error. It’s a feature, not a bug.
Analyzing errors in varying ways might sound like a small thing. But the shift from seeing things as technical to sociotechnical, and the corresponding shift to seeing errors, problems, or mistakes as made rather than found, is actually a very big thing. Once you rearrange your brain to see the world this way, you cannot help but see errors as entangled in systemic, structural problems. All of a sudden, the solutions can no longer be the purview of a single private company. As I wrote in “Beyond ‘minimizing harms’”:
Transforming systems starts by recognizing that these systems aren’t just technical, they’re ‘sociotechnical’. It starts by recognizing that ‘tech problems’ are never just problems of technology. It starts by recognizing that errors or mistakes can’t be solved by a private company or a single institution. As Ananny concludes, it starts by recasting these errors as public problems that demand collective action.
I couldn’t have said it better myself! 😉
With that hilarious joke behind us, don’t forget to like the podcast, subscribe to it on Apple or Spotify, review it, rate it, and share it. It really does make a difference.
🎧 Pod Archives
If you liked the conversation with Mike, here are a few wonky gems from the archives:
As always, if you have ideas about how to make the newsletter or podcast better, tell me. Or if there’s anything I could do to make your stay at Untangled more enjoyable, press ‘reply’ and let’s chat.
Until next time,
Charley