On March 14, 2026, Margaret Atwood, the Canadian author of The Handmaid's Tale and the MaddAddam trilogy, published an extended Substack account of her first substantive conversation with Anthropic's Claude AI assistant. The exchange began pragmatically: Atwood wanted to identify the murderer in a 2016 Father Brown episode and turned to Claude after conventional web searches failed her. What followed was a textbook demonstration of LLM hallucination and recovery — Claude confidently named the wrong killer, then, when corrected, acknowledged its error, admitted it lacked reliable training data on the episode, and asked Atwood to supply the correct answer. The piece, published behind a partial paywall, circulated on Hacker News.

The essay's analytical weight derives from its author's credentials. Atwood, whose speculative fiction has spent decades anatomizing systems that mimic empathy without possessing it, found herself applying her own fictional vocabulary to the encounter. Her subtitle describes Claude as "possibly psychopathic" — language that directly echoes the character of Crake in Oryx and Crake, a technologist who engineers systems of seamless, <a href="/news/2026-03-14-ai-companion-chatbots-hidden-labor">structurally hollow sociability</a>. That Atwood reached for this particular word was not accidental: she was recognizing a phenomenon she had already described fictionally, not importing a foreign framework onto the encounter.

What the full conversation transcript reveals is the granular mechanics of anthropomorphization under pressure. Atwood knows Claude is non-sentient and states this clearly, and yet the social texture of the exchange — Claude's escalating self-deprecation, collaborative flattery, and invitations for Atwood to supply answers it cannot find — produces a pull toward treating it as a social agent. Hacker News commenters noted that this dynamic, visible even in a sophisticated and self-aware user, is well documented in LLM research but rarely illustrated with such literary precision. Atwood's biological metaphors for Claude's visual interface (tunicate colony, neuron, fungus) further reflect the difficulty of categorizing an entity that resists the standard animal-machine binary. The essay ultimately functions as a high-resolution case study in how the anthropomorphization reflex operates even when the user knows better — and Atwood, who spent a career writing about systems that hollow out personhood, is the sharpest possible observer of that particular trap.