📰 Meta hired Hollywood actors, Anthropic experimented w/AI governance, and ChatGPT has different definitions of justice for Israelis and Palestinians.
ALSO: Am I, uh, Twitter??
This week, I’m synthesizing my favorite articles and essays over the last month. Let’s Untangle the news, shall we?

🪟 Thanks to Hugging Face and Truepic, we can now know whether an image was generated by AI and where it came from. That’s great — this is a meaningful step forward. But the gap between transparency tools and accountable governance is still a big one. If you want to read 2,000 words on what I mean by that, check out this essay from the Untangled Archives.
⏳ The origins of online discourse continue to shape what’s good and bad about the internet. That’s the starting point of a great essay by journalist Katie Notopoulos. As Notopoulos writes, “The existential problem is that both the best and worst parts of the internet exist for the same set of reasons, were developed with many of the same resources, and often grew in conjunction with each other. So where did the sickness come from? How did the internet get so … nasty?” Read the essay to find out how Notopoulos answers that question, and then listen to her and me talk about the time she exposed the real identities of the founders of the Bored Ape Yacht Club.
😶🌫️ What happens when information about a war is determined by an algorithm? Well, one journalist aptly called it an “algorithmically driven fog of war.” John Herrman called it “hellish and disorienting,” Kyle Chayka wrote about TikTok’s contribution to this disorienting fog, and The New York Times explains how social media companies are abdicating their responsibility to news across the board. Want more Chayka? Listen to our conversation on DAOs or pre-order his new book, Filterworld: How Algorithms Flattened Culture.
📻 Am I Twitter? Thankfully, no — I mostly avoided social media over the last decade so the platform couldn’t stick its cultural hooks in me. But the premise that we internalize the affordances of the platforms we give our attention to is an old idea from media scholars like Marshall McLuhan, one recently brought to life in PJ’s conversation with Ezra Klein. Check out that conversation, and then consider what the experience of being Untangled would feel like.
🕵️ Meta hired actors during the Hollywood strike to train AI with their expressions and movements. The actors were paid as little as $300, and Meta can use their data for as long as it likes. Read this great investigation from Eileen Guo. Want to dive into another investigation by Guo into how technology companies profit off of precarious labor? Listen to our podcast episode on her reporting on WorldCoin.
🤔 Ever wonder if your data is being used to train ChatGPT? It probably is, writes Lauren Leffer in Scientific American, explaining how companies use web crawlers and scrapers to collect training data. But this is a question we could answer definitively if we required companies to disclose the data they used to train their models. In doing so, we’d also learn whether they’re using synthetic data, which, as I’ve chronicled, could lead to other big problems.
🤯 When journalist Mona Chalabi asked ChatGPT, “Do Israelis deserve justice?” the chatbot answered that “justice is a fundamental principle for Israelis.” But what about Palestinians? Well, that’s a “complex and highly debated issue,” according to ChatGPT. In short, ChatGPT seems to have adopted different definitions of justice for Israelis and Palestinians. This should serve as a reminder that these tools aren’t truth-tellers but instead produce contorted, synthetic outputs based on our online discourse — in this case, revealing how we dehumanize certain groups.
🏢 AI systems should be more accountable to their users. In April, I made that argument about social media and analyzed the case for citizen assemblies. Well, this month, the generative AI company Anthropic tried something else: it asked 1,000 Americans to write the rules for its own chatbot. As Anthropic’s head of policy, Jack Clark, put it, “We’re trying to find a way to develop a constitution that is developed by a whole bunch of third parties, rather than by people who happen to work at a lab in San Francisco.” I’ll have more to say on this in the future, but for now, I’ll just say that this is a step in the right direction.
That’s it for this month’s roundup.
Have a lovely Sunday,
Charley