Ninety-Six Percent

Mar. 15, 2026

Ninety-six percent. That’s how sure the machine was that it was looking at a horse.

It was a camel.

I don’t know if you’ve ever seen a camel. They’re not subtle animals. They’ve got humps. Distinctive humps. The kind of identifying feature that a five-year-old could spot from across a parking lot. But the computer vision model — the kind of technology we’re strapping to missiles and border checkpoints and police cruisers — looked at a photograph of a Bedouin camel from a hundred years ago and said, with ninety-six percent confidence: horse.

There’s an artist in Houston right now, a Saudi woman named Nouf Aljowaysir, who took old photographs from a British explorer named Gertrude Bell — gorgeous, century-old images from across the Middle East — and fed them into AI recognition systems. She printed the results right on top of the images. Green outlines. Percentages. The machine’s best guesses.

A cluster of buildings surrounded by water: 56.98% likely to be a submarine. In the desert. In 1910.

Farmers identified as troops. Children flagged as soldiers — sometimes in the nineties. A reed hut, the kind a family sleeps in, tagged as a bunker with 96.01% certainty. The machine looked at the Middle East and saw what it was trained to see: targets.

I used to work at the post office. Sorting mail. You’d think it’s simple — read the address, put it in the right slot. But you learn that reading isn’t the same as seeing. People write addresses in every direction, in every hand, in ink that’s smeared or faded, by someone whose first language isn’t yours. You get good at it because if you screw up, someone’s mortgage payment goes to the wrong house.

That was letters. We’re talking about bombs now.

An American missile strike hit an Iranian school in the opening hours of whatever this new war is. Nearly two hundred kids and teachers. They say it was outdated intelligence, not bad AI — not yet. But “not yet” is doing a lot of heavy lifting in that sentence. The military is building toward autonomous targeting at full speed. Kill decisions made by algorithms that think a camel is a horse and a fishing village is a naval base.

At what percentage does the math justify the massacre?

Somewhere else in the exhibition, there’s a sculpture by Trevor Paglen. He recreated something from the 1960s called the “standard head” — a model built by a researcher named Woody Bledsoe, who also worked on nuclear weapons, because of course he did. Bledsoe wanted to build facial recognition software. He needed a baseline. A normal human face.

So he used his own. White. Male. Young. The default.

I’ve worked enough jobs with enough people to know that everyone thinks they’re the standard. The foreman at the warehouse thought a real worker was a guy who looked like him — broad, white, crew cut. The landlord on Alvarado thought a reliable tenant was a quiet Asian woman because that’s what she was. We all look in the mirror and see normal. See default. See the center of the universe looking back.

The difference is, when I decided my neighbor didn’t look trustworthy, nobody died. When Bledsoe’s mirror became the template for facial recognition — funded by the CIA, refined over thirty years at the University of Texas, baked into systems used by police and immigration and God knows who else — the consequences scaled. Your cousin unlocks your phone because the machine can’t tell you apart. An ICE agent gets a match that isn’t a match. Someone gets detained, deported, vanished. Not because they did anything wrong, but because their face deviated too far from a dead man’s idea of normal.

What I keep coming back to is the confidence scores. The decimal points. 96.01%. The machine doesn’t hedge. Doesn’t say “I think” or “maybe.” It delivers its wrong answer with the calm authority of a surgeon, down to the hundredth of a percent, as if precision and accuracy are the same thing.

They’re not. I can tell you with great precision that the sun will rise at 6:47 AM tomorrow. Precise and dead wrong if I’m on the wrong continent. Precision is just confidence holding a ruler. It doesn’t mean you’re measuring the right thing.
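That confident-to-the-hundredth number is usually just a softmax: a function that turns a model’s raw scores into something that looks like probabilities. Here’s a minimal sketch of how that works — the labels echo the essay, but the scores are invented for illustration, not taken from any real model:

```python
import math

def softmax(logits):
    """Turn raw model scores into a probability-looking distribution."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for a photo of a camel.
labels = ["horse", "camel", "submarine"]
logits = [5.0, 1.8, -2.0]

probs = softmax(logits)
best = max(range(len(labels)), key=lambda i: probs[i])

print(labels[best], f"{probs[best]:.2%}")  # → horse 96.00%
```

The two decimal places come free from the arithmetic. Nothing in that function measures whether the answer is right — it only measures how much the model prefers one wrong answer over the others.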

A general sees 95% on a screen and sees a green light. That’s all anyone in that chain ever wanted — not accuracy, not truth, just a number high enough to sleep at night. A number to point to when the cameras roll and someone has to explain why a school full of kids looked like a military compound. Nobody’s accountable. Blame the algorithm. Abdicate responsibility. The machine said so, sir. Ninety-six percent, sir. With a decimal point and everything.

The artists in Houston are doing what artists have always done — standing in front of the stampede and saying look. Look at what the machine sees. Look at what it doesn’t. The curator said something that stuck with me: “Photography has always been mediated. It has always been manipulated. But it came from a known reference point. At this point, we cannot assume that anymore.”

We can’t assume the photograph is real. Can’t assume the face is who the machine says it is. Can’t assume the target is a target. And yet assumption is all we’ve got left, because the technology moves faster than the questions, and the questions move faster than the answers, and the answers don’t matter because the contracts are signed and the drones are already in the air.

A camel. A horse. A child. A soldier.

The machine can’t tell the difference. And the people who built it don’t have to.


Source: Artists Lay Bare The Dangers And Biases Of Artificial Intelligence

Tags: ai ethics aisafety algorithms culture humanaiinteraction