🗞️ GPT-4 failed the Turing Test; Algorithmic reparations; and the 'Church of Artificial Intelligence'
PLUS: Your chance to contribute to the next edition of Untangled
Hi, it’s Charley, and this is Untangled, a weekly-ish newsletter on technology, people, and power. A few things before getting into it:
🧠 Last week, I published the essay “AI alignment isn’t a problem — it’s a myth” and offered a tour of unordered and ordered systems.
😊 Caitlin Dewey, who writes the great newsletter Links I Would Gchat You If We Were Friends, recently recommended Untangled to her readers, describing it this way: “Always-edifying tech analysis…that doesn't just *tell* you what to think about AI … but teaches you how to think about it.” Please help me thank Caitlin (and do yourself a favor!) by signing up for her newsletter.
👋 A warm welcome to new Untangled subscribers from The TechEquity Collaborative, New America, The University of California San Diego, and FutureScot.
In today’s issue, I’m contextualizing the best stuff I read all week, including papers and articles about:
💸 Algorithmic systems & reparations;
💭 AI systems & abstract reasoning;
🤯 The word of the year — hallucinate! — and why it bums me out;
🤖 How ChatGPT failed the Turing Test;
🙋‍♀️ AI, public input, and democratizing governance;
⛪ The rebooted ‘Church of Artificial Intelligence.’ Yep, really.
Let’s make Untangling the News a community effort — got an article or paper you think others might enjoy? Send it my way, and I might just include it in the next edition.
Okay, on to the show!
💸 Can algorithmic methods support reparations? According to a recent paper by Wonyoung So and Catherine D’Ignazio, the answer is ‘yes,’ but it depends on a societal innovation, not an algorithmic one.