Tomorrow's tech news, today's hangover.


Feb. 9, 2026

The Foundation Is Made of Ghosts



The dentist’s waiting room had a television mounted in the corner, muted, captions on. Some morning show. A woman in a blazer was talking about the future of AI. The captions couldn’t keep up — words kept disappearing mid-sentence, leaving gaps where meaning should have been.

I’ve been thinking about ghosts.

Not the kind that rattle chains or haunt old houses. The kind that sit in villages in Jharkhand, India, balancing laptops on mud slabs built into their walls, watching videos of women being pinned down by groups of men. Eight hundred of them a day. The videos, not the women. Though maybe the women too. Who’s counting?

Her name is Monsumi Murmu. She’s twenty-six. She moderates content for a global technology company. That’s what they call it. Moderating. Like she’s adjusting the thermostat.

What she actually does is watch the worst things humans do to each other — the rapes, the murders, the abuse — so that the machines can learn what bad looks like. So that when you ask ChatGPT for meal prep ideas, it doesn’t start describing how to dismember a body.

You’re welcome.

“The first few months, I couldn’t sleep,” she says. “I would close my eyes and still see the screen loading.”

The images followed her into dreams: fatal accidents, family members dying, sexual violence she couldn’t stop or escape. Her mother would wake and sit with her on those nights. I try to imagine that — a mother who doesn’t understand the internet, sitting in the dark with a daughter who can’t unsee what the internet showed her.

Now? “By the end, you don’t feel disturbed — you feel blank.”

Blank.

That’s the goal, apparently. The machine needs to understand horror, and the only way to teach it is to feed it human horror, processed through human eyes, labeled by human hands. The cheapest human eyes and hands belong to women in rural India who need two hundred sixty pounds a month and can’t afford to ask questions.

There’s another woman, Raina Singh, twenty-four. She wanted to be a teacher. Took a data annotation job instead — the pay was decent, the description was vague. She figured she’d flag some spam, identify some scams, save up for teaching.

Then six months in, they moved her to a new project. Child sexual abuse material.

When she complained to her manager, he told her: “This is God’s work — you’re keeping children safe.”

God’s work.

Let that sit for a moment. A young woman in a bedroom in Bareilly, watching videos of children being abused, for two hundred sixty pounds a month, and her manager invokes the divine. This is God’s work. The company won’t pay for therapy, won’t warn you what you’re signing up for, won’t even acknowledge the work is hard enough to hurt you — but God, apparently, is on the clock.

The next project was categorizing pornography. Hour after hour after hour.

“The idea of sex started to disgust me,” she says. “Sometimes, when I’m with my partner, I feel like a stranger in my own body. I want closeness, but my mind keeps pulling away.”

The companies, when asked, say the work isn’t demanding enough to require mental healthcare. Only two of the eight companies asked provided any psychological support. These are the same companies whose CEOs give keynotes about AI safety, who publish responsible AI frameworks, who hire ethics teams to write mission statements about values and impact.

The values stop at the edge of the spreadsheet.

I keep thinking about the phrase “ghost workers.” That’s what researchers call them. Invisible labor. The hands behind the curtain. The people who make the magic trick work while the audience applauds the magician.

Seventy thousand of them in India by 2021. Probably more now. Eighty percent from rural or marginalized backgrounds — Dalit, Adivasi, first-generation graduates trying to climb out of agricultural labor and mining. For them, watching other people’s nightmares on a laptop is a step up.

And they can’t talk about it. NDAs everywhere. Strict non-disclosure agreements that bar them from telling their families, their friends, anyone what they spend their days watching. Murmu hid the truth because she feared being forced into marriage. The silence protects the company. The silence eats her alive.

There’s a researcher named Milagros Miceli who says content moderation belongs in the category of dangerous work, comparable to any lethal industry. Lethal. Not because the videos can reach through the screen and physically hurt you — though in a way they do, they burrow into your dreams, they hollow you out until blank is the best you can hope for — but because the psychological damage is real and lasting and nobody’s counting the bodies.

“There may be moderators who escape psychological harm,” she says. “But I’ve yet to see evidence of that.”

We talk a lot about what AI might do to us in the future. The superintelligence. The robots. The paperclip maximizer that accidentally destroys the world. It’s all very abstract and philosophical, the kind of thing you can discuss over wine at a conference without getting your hands dirty.

But the damage is already here. It’s just happening to people we don’t see. Women on verandas in Jharkhand, finding places where the mobile signal holds. Graduates in bedrooms in Bareilly, learning things about human nature they never asked to know.

The AI safety crowd worries about alignment — making sure the machine does what we want. They worry about value loading and corrigibility and all these fancy words for keeping the genie in the bottle.

Nobody’s asking who got burned pulling the genie out in the first place.

Murmu says she goes for long walks in the forest now. She sits under the open sky and tries to notice the quiet around her. She collects mineral stones. She paints geometric patterns on the walls of her house.

“I don’t know if it really fixes anything,” she says. “But I feel a little better.”

Four months left on her contract. The specter of unemployment worries her more than the work itself. That’s the trap: the horror is bearable because the alternative is worse. You can endure anything if you’re afraid enough of nothing.

The television in the dentist’s office is still on. Someone’s talking about the future. Captions keep disappearing, leaving white space where words should be. I suppose that’s fitting.

Somewhere in India, right now, a woman is watching a video of something that will follow her into her sleep tonight. She’s doing it so your chatbot understands context. So some billionaire can stand on stage in San Francisco and talk about building a better world.

The world is being built. The foundation is made of ghosts.


Source: The Guardian
