The Eyes You Paid For

Mar. 8, 2026

The security camera at the corner store points at the register, not at you. That’s the deal. You walk in, buy your cigarettes, the camera watches the money change hands. Nobody’s watching you scratch yourself in the parking lot. Nobody cares.

But seven million people bought a different deal last year. They put cameras on their faces and pointed them at everything — their kitchens, their bathrooms, their bedrooms. And somewhere in Nairobi, a contractor making a few dollars an hour watched it all.

“Designed for privacy,” the ads said. “Controlled by you.” Meta put that in writing, ran campaigns around it, built the whole pitch on the idea that these glasses were safe, private, yours. Then they piped the footage to a subcontractor in Kenya where workers reviewed it — nudity, sex, people on the toilet. The face-blurring they promised didn’t always work. So some twenty-three-year-old in an open-plan office watched a stranger in New Jersey take a shower, and that was Tuesday.

They call it “improving people’s experience.”

I’ve been alive long enough to know that when a corporation says “improving your experience,” they mean improving theirs. When they say “privacy,” they mean they’ve hired lawyers who can redefine the word until it includes selling you out. When they say “you’re in control,” they mean you clicked “agree” on a document longer than most novels, and that was consent enough for whatever comes next.

Seven million units. Seven million people walking around filming friends, strangers, lovers, children — all of it flowing into what the lawsuit calls “a data pipeline for review.” You can’t opt out. That’s the part they don’t advertise. You bought the thing, you put it on, and now your footage feeds the machine whether you like it or not. The only real choice was at the register, and you used it to say yes.

There’s something almost elegant about the con. Not elegant like art. Elegant like watching a pickpocket work a crowd. Tell people the camera is for them. Make it fashionable — partner with Luxottica, make it look like Ray-Bans, not a surveillance rig bolted to your skull. Package the violation as a feature. “Hey Meta, what am I looking at?” You’re looking at the future, pal. And the future is looking right back.

I keep thinking about those workers in Nairobi. Not the plaintiffs who’ll get their settlement checks and feel vindicated. The workers. The ones reviewing thousands of hours of other people’s lives — eating, arguing, undressing, sitting on the toilet at two in the morning staring at the floor. What does that do to you after six months? After a year? You become a witness to intimacy you never asked for, a voyeur by paycheck. And the people you’re watching would be horrified to know you exist. That’s the arrangement. They see everything. We pretend they’re not there. Same way we pretend the glasses are private.

Orwell got it wrong. He imagined the government would put cameras in your house. He never considered you’d buy one yourself, put it on like a pair of sunglasses, and post an unboxing video. The telescreen in 1984 was mandatory. Meta’s version is aspirational. You don’t fear it — you want it. You stand in line. You tell your friends.

Someone called it “luxury surveillance.” That phrase hurts because it’s accurate. Surveillance as a tier you upgrade to. First-class violation of your own dignity. A developer even built an app to detect when smart glasses are nearby, which means we now need software to warn us that someone’s recording us with a device sold as a lifestyle product. We need a digital alarm because the human one stopped working years ago.

The same company spent two decades training us to share everything — every meal, every vacation, every breakup, every baby — and then handed us a camera we never take off. The muscles of privacy atrophied so gradually we didn’t notice until we couldn’t flex them. Sharing stopped being a choice and became a reflex. And once it’s reflexive, you don’t question the glasses. You put them on because of course you do. Someone who shares everything doesn’t flinch at one more camera. Even when it’s pointed inward.

The lawsuit will do what lawsuits do. Meta will settle or stall. The terms of service will grow three paragraphs longer. Nobody will read them. And seven million people will keep wearing the glasses because you don’t uninstall a habit.

I don’t own a pair. Not because I’m principled. I just don’t trust anything that sits that close to my eyes and reports to a server farm. Maybe that makes me old. Maybe it makes me the last guy at the bar who pays cash and flinches when someone’s sunglasses light up.

The corner store camera knows its job. It points at the register and stays there.

Your glasses don’t point anywhere. They point everywhere. And you’re the last one who gets to know what they saw.
Source: Meta sued over AI smart glasses’ privacy concerns, after workers reviewed nudity, sex, and other footage

Tags: ai ethics automation humanaiinteraction culture aisafety