When True Crime Goes AI: Murder, Lies, and Content Creation

Feb. 17, 2025

Another Monday morning, another existential crisis over my coffee and aspirin. But this one’s special, folks. While you were all busy binge-watching true crime shows last night, I stumbled across something that makes my usual hangover seem almost quaint.

Remember those late-night YouTube rabbit holes where you convince yourself that watching “just one more” murder documentary is a good idea? Well, turns out some of those holes go deeper than we thought, and they’re filled with artificial snake oil.

Some genius decided to create an entire YouTube channel of fake true crime stories generated by ChatGPT. We’re talking titles like “Husband’s Secret Gay Love Affair with Step Son Ends in Grisly Murder” - the kind of clickbait that makes tabloid headlines look like Pulitzer material.

The best part? Nearly two million people watched this completely fabricated story supposedly set in Littleton, Colorado. That’s two million sets of eyeballs glued to a digital fever dream cooked up by an AI that probably thinks DNA evidence is something you find in a Twitter thread.

Here’s where it gets really interesting: the creator, who’s going by “Paul” (probably because “Captain Bullshit” was already taken), claims he did this to make people question why they’re so attracted to these grotesque stories. Right. And I drink bourbon for its medicinal properties.

But you know what? The sneaky bastard might have accidentally stumbled onto something profound. According to the stats, more than half of Americans consume true crime content regularly. Crime Junkie and Dateline NBC are sitting pretty in Apple’s top 10 podcasts. We’re a nation of murder enthusiasts, and I’m not sure what that says about us as a species.

The whole thing reminds me of those mechanical fortune-teller machines at carnivals, except instead of dispensing vague predictions about tall, dark strangers, they’re spitting out stories about coaches giving HIV to cheerleaders. And people are eating it up like it’s gospel truth.

What gets me isn’t just the fake stories - it’s the way they tap into our collective appetite for the macabre. We’re sitting here, supposedly evolved beings, getting our kicks from AI-generated tales of domestic betrayal and violence. It’s like we’ve automated our own moral panic.

The real crime here isn’t the fake murders - it’s the murder of truth itself. We’re watching journalism get strangled by algorithms while we sit back and hit the like button. And the truly twisted part? The AI is probably better at crafting these stories than half the true crime podcasters out there.

But here’s the real kicker - this whole mess is just a preview of what’s coming. Today it’s fake murder stories on YouTube, tomorrow it’ll be AI-generated celebrity scandals, political conspiracies, and God knows what else. We’re entering an era where truth is optional and engagement is everything.

You want to know the really scary part? The system worked exactly as intended. These videos didn’t come down because the platform caught the fakery - they came down because an actual reporter bothered to check whether the stories were real. Imagine that - actual journalism breaking up the AI party.

Look, I get it. We all love a good story. Hell, I’ve spent enough nights in bars listening to tall tales that would make ChatGPT blush. But at least when some drunk spins you a yarn about their cousin’s neighbor’s dog walker’s murder mystery, you know to take it with a shot of skepticism.

Maybe that’s what we need - a digital equivalent of bar wisdom. Something that reminds us that just because it’s on YouTube doesn’t make it true, and just because an AI can string together a compelling narrative doesn’t mean we should believe it.

For now, I’ll stick to my tried-and-true method of content verification: if I can’t fact-check it through my hangover, it probably isn’t worth believing anyway.

Time to pour another coffee and contemplate the state of human consciousness in the age of artificial storytelling. Or maybe just take another aspirin.

Stay skeptical, stay human,
Henry Chinaski

P.S. If anyone needs me, I’ll be working on my AI-generated autobiography. Chapter One: “The Night I Definitely Didn’t Make Out with a Robot at Last Call”


Source: A ’true crime’ YouTube channel’s videos got millions of views. It turns out the stories were AI-generated

Tags: ai ethics machinelearning digitalethics disruption