Some mornings the world makes a kind of terrible, beautiful sense. You read a headline and you can’t help but laugh. It’s not a happy laugh. It’s the kind of laugh you let out when you see a guy in a thousand-dollar suit slip on a banana peel. It’s the universe delivering a punchline so perfect, so stupid, you’d think it was written by a committee of drunks.
So, the news. A writer, Andrea Bartz, and a whole stable of her ink-stained comrades found out their books were being used to educate the new robot gods. Some outfit called Anthropic—sounds like a new brand of antacid—hoovered up their novels to teach its chatbot how to sound less like a Speak & Spell with a PhD. The authors, naturally, got pissed. They sued. And they won.
They squeezed $1.5 billion out of the tech giant. A billion and a half. Sounds like a king’s ransom. Sounds like enough money to finally drink the good stuff for the rest of your life without checking the price.
But then you get to the fine print, and that’s always where the devil does his bookkeeping.
After the army of lawyers takes its pound of flesh—and believe me, they take a whole side of beef—each stolen work nets the author and publisher something like three grand. Three. Thousand. Dollars.
I had to light a cigarette and read that part twice. Three grand. For your “heart and soul,” your “immeasurable dedication,” your magnum opus that you bled into for a decade. Anthropic’s valuation is supposedly sniffing around $183 billion, maybe more by the time I finish this sentence. For them, this $1.5 billion settlement isn’t a punishment. It’s a goddamn business expense, like electricity or cleaning up whatever the interns do at the holiday party. They didn’t get fined; they retroactively bought a library card. A very, very expensive library card.
And here’s the part that really kills me, the detail that’s so beautifully idiotic it belongs in a poem. You have a company full of the “brightest minds of a generation,” a brain trust backed by mountains of cash, and they want to train their machine on a few hundred thousand books. So what do they do? They choose mass piracy. They go with grand larceny.
These geniuses could have hired a dozen kids at minimum wage to just buy the damn books off Amazon. It would’ve been cheaper. A hell of a lot cheaper. They could have bought the Kindle version of every book on their list and it probably wouldn’t have cost them as much as the bill for their lawyers’ celebratory steak dinner. But they didn’t. They had to “disrupt.” They had to “move fast and break things.” They just forgot the second half of that saying is “and then get sued into next Tuesday.” It’s the kind of hubris that’s almost admirable. It’s like trying to rob a bank by tunneling out of the jail next door. The logic is backwards, insane, and burns with a stupid, glorious light.
Now, all the writers are patting themselves on the back. “A message has been sent,” they say. “The beginning of a fight on behalf of humans.” I hate to be the guy pissing in the punchbowl, but what message was sent, exactly? That if you steal enough, you can just write a check for it later? That copyright is just a negotiation with a price tag?
And let’s get down to brass tacks here. This idea that Ms. Bartz’s thriller, or any one author’s book, was the key that unlocked the AI’s “voice.” Come on. The ego on that is thicker than the smoke in here. These Large Language Models, they don’t read your book. They don’t savor your prose. They don’t feel the protagonist’s pain. They strip it for parts like a car thief in a back alley. They digest it. Your novel becomes a statistic, a data point, one trillionth of a percent of its total knowledge. Your “unique voice” is just one more drop of rain in a hurricane of text scraped from the entire internet, from Shakespeare to Reddit comments about foot fungus. Getting mad that the AI stole your “style” is like getting mad that one of your dead skin cells ended up in a dust bunny.
The whole thing feels like a sideshow. A legal drama to make us feel like the little guy can still win. But the machine is still learning. The machine is still coming. And lawsuits aren’t going to stop it.
You want to beat the machine? Here’s a thought: stop trying to fight it on its turf. Stop crying to the ref. Instead, just be a better human. A messier, uglier, more beautifully flawed human.
The machine can write a perfect story. A technically perfect, grammatically sound, well-structured story that follows all the rules of plot and character. And it will be boring as hell. It will be a story with no soul, because it’s never had one to lose. It’s never woken up on a strange floor with a mouth that tastes like a graveyard. It’s never loved a woman so much it hurt. It’s never felt the quiet terror of 3 a.m. when the words won’t come.
AI-generated slop is just that: slop. It’s content for consumption, not art for experience. People don’t read books to process data. They read to feel something. They want the grit, the dirt, the humanity. They want to know a real person with a real, screwed-up life bled on that page for them.
So let the machines have their clean, perfect, soulless stories. You writers? You should double down on the chaos. Write with a tremor in your hand. Write about the things that keep you up at night. Write the kind of sentence that’s grammatically a train wreck but emotionally a punch to the gut. Be more human. Be authentic. Be the one thing the algorithm can never simulate: a goddamn mess.
This settlement isn’t a victory. It’s a consolation prize. The real fight isn’t in the courtroom. It’s on the blank page.
Now, if you’ll excuse me, all this talk of justice has made me thirsty. Time to find a bottle that tells a better story.
Source: Opinion | I Beat the Anthropic A.I. Chatbot That Stole My Book