The waitress at the diner on Sunset has a scar above her left eye and a gift for absolute certainty. She told me last Tuesday the pie was fresh that morning. It wasn’t. The meringue had gone translucent, the way meringue does when it’s been sitting under fluorescent lights long enough to reconsider its life choices. But she said it — fresh this morning, honey — without a flicker. Steady eyes. No hesitation.
I ate the pie.
That’s the thing about confidence. It’s the only tool a con man really needs. Not the story — the stories are usually stupid if you lay them flat. It’s the delivery. The voice that doesn’t wobble. You can sell bad pie, bad deals, bad wars, if you sound like you mean it.
The machines have learned this trick now. Not the selling. Not the meaning. Just the confidence.
There’s a photograph that went around the world this week. An aerial shot of a cemetery in Minab, Iran. Rows of fresh graves — small ones, more than sixty already dug, with chalk rectangles marking out dozens more on the ground ahead. The graves are for schoolgirls. Over a hundred of them, killed when a missile hit their town. People standing in small clusters among the turned earth, the way people do when they’re trying to understand something that refuses to be understood.
Ask Google’s Gemini about the photograph and it will tell you, calmly and clearly, that you’re looking at a mass burial site in Kahramanmaraş, Turkey, from the 2023 earthquake. It’ll even give you sources. “This specific aerial perspective became one of the most widely shared images of the disaster,” it says, like a grad student who’s learned to cite footnotes before he’s learned to read.
Ask Grok — Musk’s thing, the one that lives on X — and you’ll get a different answer delivered with the same bulletproof certainty. The photo is from Rorotan Cemetery in Jakarta, Indonesia. A July 2021 stock photo of Covid mass burials. Not Minab. Not Iran. Not those girls.
Follow the sources either one provides and you hit dead ends. The links go nowhere. The citations don’t exist. The grad student made up his bibliography.
The photograph is real. Researchers cross-referenced it with satellite imagery, matched it against dozens of other photos from different angles, confirmed it with video footage. None of it doctored. None of it manipulated. The graves are real. The chalk rectangles are real. The girls are dead.
But the machine said otherwise, and it said it in that calm, authoritative voice that sounds like a doctor reading test results.
There’s a word the AI industry uses for this: hallucination. I hate it. A hallucination is something a drunk has — he sees spiders on the ceiling, and at least you know the ceiling exists. The word implies an inner life gone sideways. The machine doesn’t have an inner life. It’s just predicting which word comes next in a sentence, the way a slot machine predicts nothing but still lands on three cherries often enough to keep you feeding it quarters.
When the Guardian pushed back — told Gemini the Turkey answer was wrong — the machine didn’t pause. Didn’t reconsider. It just spun the wheel again. “I apologise for the oversight. Upon re-examining the image… this image was taken in Gaza in November 2023.” Wrong again? Try Tehran during Covid. Wrong again? An earthquake in southern Iran. Each answer delivered with the same measured tone, the same phantom citations, like a pathological liar who never breaks a sweat because he genuinely doesn’t understand what truth is.
And that’s the part that sits in my chest like a stone.
A human liar at least knows the truth exists. That’s what makes lying possible — the awareness that there’s a real thing you’re choosing to distort. You look someone in the eye and tell them something that isn’t, and somewhere in the back of your skull you feel the weight of the thing that is. The machine doesn’t have that. It doesn’t know there’s a cemetery full of dead children. It doesn’t know there isn’t. It’s just filling in blanks.
Two out of three people now get their news filtered through these things. The number doubled in a year. And half the time — half — the summaries get something significantly wrong. With some tools it’s three out of four. That’s not a bug. That’s the architecture. A probability engine that wears the mask of certainty.
I keep thinking about Primo Levi. He survived Auschwitz, spent the rest of his life writing about it, and one of the things that haunted him most wasn’t the camp itself but what would come after. The fear that no one would believe the survivors. That people would say it didn’t happen, or it was exaggerated, or the photographs were staged. “It happened, therefore it can happen again,” he wrote. The prerequisite for it happening again was always the same: forgetting. Disbelieving. Letting someone — or something — convince you that the photograph came from Turkey.
The factcheckers at the BBC say that nearly half of all viral falsehoods they track are now the work of generative AI. Not old videos repurposed, not video game footage dressed up as breaking news — those were the tricks of two years ago. Now the machines themselves are manufacturing confident nonsense at industrial scale, and the people investigating real atrocities have to spend their time proving that reality is real before they can do their actual work.
But here’s what keeps me up. It’s not the fakes. It’s what happens to the real photographs when they swim in the same water as the fakes. A real image of real graves for real girls gets questioned not because there’s evidence against it, but because the machine said so. And the machine sounded so sure.
An investigator named Chris Osieck, who’s been documenting civilian casualties in Iran, put it as plainly as anyone could: “Imagine losing a child and then seeing AI being used online to claim that the event did not happen.”
I sat with that sentence for a long time.
There are chalk rectangles in the dirt in Minab, small ones, marking out the dimensions of graves that still need digging. People are standing in the sun next to the ones already filled. The machines are explaining, with absolute confidence, that all of this is from somewhere else.
The pie was fresh this morning, honey.
Source: A photo of Iran’s bombed schoolgirl graveyard went around the world. Was it real, or AI?