Science journalist Andrew Lawler discovered that someone using the fake persona "Blake Whiting" published 13 books in a single week on Amazon's Kindle Direct Publishing platform. The books cover serious archaeological topics like the collapse of Near Eastern civilizations and Silk Road excavations. They're well-written and accurate, complete with convincing first-person introductions. One opens with "I first encountered news of this discovery while researching trade networks for an entirely different project."
But Blake Whiting doesn't exist. There's no author photo. No university affiliation or website either. The books are AI-generated composites that reshuffle the work of real researchers without attribution.
Lawler identified his own reporting in the Whiting books, along with the work of archaeologists Michael Frachetti, Farhod Maksudov, and Eric Cline. Cline, author of *1177 B.C.: The Year Civilization Collapsed*, found his work reshaped into a book called *1177 BC Revisited*.
Not a single footnote. No bibliography whatsoever.
Cline told Lawler it was "a complete rip off" but admitted "it reads beautifully and is accurate." Amazon reviewers had no idea the author was fake. One praised the "well organized chapters, clearly explaining what has been discovered." Another called it an "EXCELLENT overview."
This isn't old-school plagiarism where someone copies paragraphs. It's what Lawler calls "word-laundering on an industrial scale." AI tools synthesize multiple sources into something that looks original, stripping out direct quotes and citations to avoid obvious copyright violations. The operator behind Whiting likely isn't a lone author but a technically sophisticated outfit running algorithmic arbitrage on Amazon's platform. They know how to prompt large language models and how to game KDP's search algorithms, pumping high volumes of content into profitable academic niches.
Amazon claims it "carefully monitors" its catalog using machine learning and human reviewers. Yet it missed an author with no bio or online presence who exceeded the company's own 10-books-per-week limit. When Lawler contacted Amazon, spokesperson Jennie Bryant declined to comment on the Whiting books specifically.
The real damage hits researchers and journalists directly. Publishers are reluctant to commission a book on a topic when a Whiting-style volume already exists. Years of fieldwork and reporting get scooped by an algorithm and sold back to readers who believe they're buying legitimate scholarship.