Chicken Soup for the Machine

Mar. 20, 2026

A woman at a garage sale in Glendale was selling a box of Chicken Soup for the Soul books for a dollar each. She had eleven of them. I know because I counted while she told me about her grandson who’d just gotten into dental school. She held each book like it meant something — thumbed pages, cracked spines, little Post-it notes sticking out like the tongues of sleeping dogs. She’d read every one. Some of them twice.

I didn’t buy any. But I stood there longer than I should have.

Five hundred million copies. That’s what Chicken Soup for the Soul has sold worldwide. Half a billion books full of first-person stories about the time a stranger helped change a tire in the rain, or the moment a dying father said exactly the right thing, or the dog that found its way home across three states. Stories told in plain language by ordinary people who were just trying to say something true about being alive.

And now the publisher is suing Apple, Google, Nvidia, Meta, OpenAI, Anthropic, Perplexity, and Musk’s xAI — all of them, every last titan — for stealing those stories to teach their machines how to sound human.

The lawsuit is specific about what was stolen and why it matters. It’s not just that they took the words. It’s that they took the structure of human feeling. The complaint says the books’ first-person narratives in “natural, conversational language that conveys emotion, moral reflection, and coherent storytelling in concise form” are uniquely suited to train AI to “replicate authentic human voice, narrative pacing, emotional tone, and story structure.”

Read that again. Slowly.

The machines needed to learn how humans express feeling. So the companies went shopping for the most distilled, accessible, mass-market version of human emotion they could find — and they stole it from shadow libraries. Pirate sites. Digital back alleys where bootleg PDFs live in the dark like mushrooms.

Shadow libraries. I love that phrase. It sounds like something out of Borges — an infinite library with no lights on, staffed by nobody, visited by algorithms at three in the morning while the rest of us sleep. The books don’t know they’re being read. The readers don’t have eyes.

There’s something almost beautiful about the absurdity here. Chicken Soup for the Soul has been the punching bag of literary snobs for thirty years. It’s the book that people who don’t read books read. It’s what your aunt gives you when you’re going through a divorce. It’s sentiment distilled to its most portable, most digestible, most sellable form. It’s mass-produced warmth on a shelf between the candles and the self-help.

And it turns out that’s exactly what the machines needed. Not Dostoyevsky. Not Faulkner. Not the messy, tangled, contradictory cathedral of real literature where you have to earn your understanding. No. They needed the simple stuff. The short paragraphs. The clear emotional arcs. The stories where someone learns a lesson and the reader feels something without having to work too hard.

Because that’s what the machines are trying to do — feel things without working too hard. Or rather, seem to feel things. The distinction matters less every day.

The irony is so thick you could cut it with a butter knife and serve it on toast. A company that industrialized human sentiment — that took the raw, uneven, sometimes clumsy emotional truths of regular people and packaged them into a repeatable, scalable product line — is now suing companies that industrialized the industrialization. They took the formula for manufactured warmth and manufactured it again, one layer deeper, at a scale that makes five hundred million copies look like a pamphlet.

It’s sentiment laundering. The grandmother in Glendale tells a real story. The publisher polishes it, formats it, strips out the parts that don’t fit the brand, and sells it back to her as a book. The algorithm scrapes the book, digests the patterns, learns that “first-person narrative + emotional pivot + simple resolution = human,” and produces something that hits the same neurological buttons without any grandmother involved at all.

Nobody in this chain is innocent except the grandmother.

The publisher’s lawyers say “companies cannot build billion-dollar technologies on stolen creative expression.” And they’re right, legally. The work was copyrighted. It was taken without permission. The shadow libraries are not exactly the Library of Congress.

But there’s a deeper theft that nobody’s suing over because you can’t file a brief about it. The real thing that was stolen wasn’t the text. It was the assumption that when you read a first-person story about loss or love or a dog that came home, a person was behind it. That the feeling was earned. That someone sat at a kitchen table and tried to put into words something that happened to them, and the words weren’t perfect, and that’s what made them true.

That assumption is gone now. Or going. Every time a chatbot produces a paragraph that sounds like it came from a human heart, the currency of actual human expression gets devalued a little more. Not because the machine’s version is better. It isn’t. But because you can’t tell the difference at a glance, and most people only glance.

Céline wrote that the biggest conspiracy in the world is the conspiracy of the ordinary — people going along, not because they agree, but because checking takes effort. The machines are banking on that. Literally banking on it. Hundreds of billions of dollars worth of banking on nobody checking whether the warmth is real.

The lawsuit will probably settle. These things usually do. Some number with a lot of zeros will change hands. The publishers will call it a victory for creative rights. The tech companies will call it a cost of doing business. Nobody will unlearn the patterns the machines already absorbed.

And somewhere in Glendale, a grandmother’s story about her grandson the dental student — the real one, the one she told me for free while the sun got low and the other garage sale shoppers picked through her old kitchen stuff — that story is sitting inside a machine now, or one very much like it. Broken into tokens. Stripped for parts. Its emotional arc mapped and measured and reproduced ten thousand times a second in languages she doesn’t speak, for people who will never know her name.

She’d probably be fine with it, honestly. She seemed like the kind of person who just wanted to be heard.

The machine heard her. It just wasn’t listening.


Source: Chicken Soup for the Soul publisher sues tech companies over AI training

Tags: ai ethics creativity culture humanaiinteraction algorithms