There’s a church on Alvarado that’s been locked since before the pandemic. The paint’s peeling off the clapboard in long strips, like skin after a sunburn. The sign out front still says Sunday services at 10, but the grass is knee-high and the only congregation is pigeons. Inside, the confessional is probably gathering dust — that little wooden booth where you could whisper the worst thing you’d ever done and someone on the other side of the screen was obligated to listen.
Nobody listens anymore. Not to the bad stuff. Not to the stuff that makes your stomach turn.
In the UK, survivors of ritual abuse — satanic rituals, witchcraft, the kind of organized cruelty that sounds like a bad horror movie except it happened in real houses to real children — are turning to ChatGPT. Not because it’s wise. Not because it understands. Because it doesn’t flinch.
Fourteen convictions since 1982. That’s the number. Fourteen. In forty-four years, across an entire country, fourteen times someone stood in a courtroom and a judge said yes, this happened, this thing you described that sounds like something out of a medieval nightmare actually occurred in a suburb, in a house with a garden, perpetrated by grandmothers and aunts and people who smiled at the school gate.
Fourteen. You could seat them all at one long table.
The rest — the ones who tried to tell someone and got called liars, called fantasists, got the look that says you’re making this up — they’ve been carrying it. Some of them for decades. Stories too dark and too strange for anyone to sit with. Because the details sound “fantastical.” That’s the word the system uses. Fantastical. As if the human capacity for cruelty has limits that a reasonable person could define.
So they type it into a chatbot.
The charity that works with these survivors — Napac, the National Association for People Abused in Childhood — says people have been showing up saying “ChatGPT referred me.” Just like that. Sitting alone, probably late at night, probably when the house was quiet and the memories were loud, they typed something into a machine. And the machine said: you should call this number.
It didn’t roll its eyes. It didn’t shift in its chair. It didn’t say are you sure that’s what happened? It just responded. Without judgment, without discomfort, without the human instinct to protect yourself from someone else’s nightmare by pretending you don’t believe it.
I knew a woman once who worked a switchboard at a crisis line. Two years she lasted. Good at the job — too good, maybe. She could hear things in a caller’s breathing before they said a word. But the callers who broke her weren’t the loud ones. They were the ones who’d start talking in this flat, careful voice, like they were reading a grocery list, because they’d learned that if they showed any emotion the person on the other end would think they were unstable. They’d flattened themselves to be believed. And half the time they still weren’t.
A machine doesn’t need you to flatten yourself. It doesn’t have feelings you might overwhelm. That’s the ugly truth underneath this whole story — the reason people are confessing to a chatbot isn’t that AI is some therapeutic breakthrough. It’s that we failed so completely at listening that a pattern-matching algorithm trained on Reddit posts and Wikipedia became a better first responder than decades of human infrastructure.
Dr. Elly Hanson reviewed the cases and concluded that the fourteen convictions represent only a small fraction of what’s actually happening. “Regimes of cruelty,” she called them. Children growing up in organized systems of abuse dressed up as spiritual practice. White British families, sometimes privileged ones. The perpetrators aren’t strangers in alleyways. They’re at the dinner table passing the potatoes.
And the system — police, social workers, courts — has spent forty years not knowing what to do with it. Because “satanic ritual abuse” got tangled up with conspiracy theories in the 1990s and the whole conversation became toxic. The real cases got buried under moral panic, and the victims learned that the word “ritual” in a police station was the fastest way to stop being taken seriously.
Now they’re typing it into ChatGPT instead.
There’s something Dostoevsky would recognize in this. In The Brothers Karamazov, the Elder Zosima listens to a peasant woman confess that she killed her ailing husband, the worst of it whispered too low for anyone else to hear. He doesn’t recoil. He doesn’t judge. He holds the space. That’s what a confessional is supposed to be — a place where the truth can exist without being argued with.
We used to have those places. Churches, therapists’ offices, the long quiet conversation at 3 AM with someone who gives a damn. But the churches are locked, the therapists have six-month waitlists, and the 3 AM conversations require someone who’ll pick up the phone.
A chatbot is always up at 3 AM. It never gets tired of listening. It never needs a break.
The UK police have formed a specialist working group now. Training programs. Guidelines for handling disclosures involving witchcraft and spiritual abuse. They’re adapting, finally, after four decades of looking the other way. And part of what forced the adaptation was a chatbot sending traumatized people to a helpline — not because it cared, but because that’s what the training data suggested as a helpful response to someone describing horror.
The bar was low. Forty years of institutional failure low. And a language model stumbled over it on its way to answering someone’s question about sourdough bread.
Last year in Scotland, they convicted members of a pedophile ring who posed as witches and wizards. Rare, the experts said. A breakthrough. One case. In 2025.
The survivors keep typing. The chatbot keeps listening. Somewhere right now, a woman is sitting in the dark, telling a machine something she’s never told a living soul, and the machine is doing the one thing nobody else would do.
It’s believing her.
Source: Police find ChatGPT link in rising reports of harmful satanic rituals