The guy at the laundromat two blocks from my place looked exactly like my landlord. Same build, same bald spot, same way of standing with his hands in his pockets like he was waiting for bad news. For a month I avoided the place because I thought he was checking up on me. Turned out he was a retired electrician named Phil who just wanted clean shirts.
I never told Phil about the resemblance. What would I say? You look like a man I owe money to? Some confusions are better left alone. They sort themselves out. You look closer, you see the differences. The walk is wrong. The voice is wrong. The eyes don’t carry the same specific disappointment.
But a machine doesn’t look closer. A machine doesn’t hear the voice or read the eyes. A machine gets its hundred percent and it’s done.
Jason Killinger was placing bets at a casino in Reno when the system flagged him. A hundred percent match, the cameras said, with some other man who’d been banned from the floor. Not ninety-eight. Not probable. A hundred. The machine was certain, and certainty is contagious.
Casino security pulled him aside. A cop named Richard Jager showed up and placed Killinger under arrest. Told him he was using a fake ID. Killinger said he had three other forms of identification in his wallet. Three. The cop didn’t want to see them. Why would he? The machine had already decided. Checking would have been an act of doubt, and doubt is the thing they’re trying to engineer out of the process.
Twelve hours. That’s how long they held an innocent man on the word of a camera. Twelve hours in a cell because software said his face was someone else’s face, and a cop decided that was enough to skip the part where you actually verify anything.
Now Killinger is suing. Not just the cop — the whole city of Reno. And his lawyers are claiming something that should make everyone sit up: this wasn’t a one-off. This wasn’t a rogue officer having a bad day. This was, they say, a widespread custom and practice involving hundreds of municipal employees making thousands of arrests in the same manner over a period of years.
Thousands.
I keep turning that word over. Thousands of people arrested because a machine said they were someone they weren’t, and nobody along the chain stopped to ask the most basic question a human being can ask: are you sure?
There was a grandmother in Fargo last year. Cops used an AI system to generate investigative leads — that’s the euphemism, “generate leads,” like the machine is some kind of digital bloodhound sniffing toward the truth. The system flagged her as the perpetrator of ATM fraud. She spent over six months in jail. Bank records later showed she was twelve hundred miles away at the time of the crime. Twelve hundred miles. You’d think distance would be an alibi even a machine could understand.
But that’s the thing about these systems. They don’t understand anything. They match patterns. They produce confidence scores that look like certainty but are really just math pretending to have opinions. And then humans — the ones who are supposed to provide the judgment, the context, the doubt — humans look at that confidence score and they relax. The hard part is done. The machine has spoken.
I worked for the postal service for years. Sorting mail. There were machines for that too, even then. OCR readers that could parse an address in milliseconds. But when the machine kicked a letter into the reject bin, you know what happened? A person looked at it. A person with eyes and hands and the basic ability to notice that the smudge the machine couldn’t read was a seven, not a one. The machine was a tool. The person was the decision.
Somewhere along the way we flipped that. The machine became the decision and the person became the rubber stamp. Officer Jager didn’t arrest Jason Killinger. The camera did. Jager just provided the handcuffs.
The lawsuit says Reno failed to train its officers on the legal use of AI facial recognition. And maybe that’s true. Maybe they needed a seminar, a PowerPoint, a quiz at the end with a passing score of seventy percent. But the problem is older than training. The problem is that we’ve built a culture that trusts output over observation. We see a number on a screen and we mistake it for truth. A hundred percent match. Ninety-nine percent confidence. These aren’t facts. They’re suggestions from a system that has never looked anyone in the eye.
If Killinger wins — and I hope the stubborn bastard does — it could set a precedent. Cities might have to admit that their shiny surveillance toys come with a cost, and that cost gets paid by the people standing in front of the cameras, not the people who bought them.
But precedents are funny things. They only matter if someone follows them. And right now the trend is running the other direction. More cameras. More algorithms. More confidence scores replacing the messy, unreliable, entirely human act of paying attention.
Phil from the laundromat died last winter. Heart attack, I heard. I found out because his machine was empty on a Tuesday and it was never empty on a Tuesday. That’s the kind of thing you notice when you’re paying attention. No algorithm would have flagged it. No camera would have caught it. Just a missing man and an empty drum, and someone who knew the difference.
Source: Man Suing City After AI Camera Flags Him For Wrongful Arrest