The woman at the laundromat was folding a flag. Not a bedsheet — an actual American flag, creased and faded, the kind you see at yard sales in towns where the factories left twenty years ago. She folded it into a triangle like they taught her, tucked the last corner in, and set it on top of her basket next to a box of Tide.
I watched her for a while because I had nothing else to do. My clothes were in the dryer and the dryer was lying about having eight minutes left. I thought about how she handled that flag like it meant something, and then I thought about Jessica Foster.
Jessica Foster had a million followers on Instagram. Blonde. Military uniform. High heels on a tarmac, walking next to the President of the United States. She sat on bunk beds in barracks, feet up on desks, the kind of woman who looked like she’d been designed by a committee of men who’d never served but watched a lot of recruitment videos.
Jessica Foster didn’t exist.
She was pixels. Generated by a machine. Every photo, every smile, every perfectly angled boot was conjured from nothing by someone who understood that loneliness has a price point.
And that price point is forty dollars for pictures of feet on OnlyFans.
A million people followed her. Liked her posts. Asked her questions she’d never answer because there was nobody home. “Why do you NEVER reply?” one guy wrote, and I had to put my phone down and stare at the ceiling for a while because that sentence contained more human desperation than most novels I’ve read.
Here’s the thing that won’t leave me alone. The researchers studying this — and we’ve reached the point where we need academics to catalogue fake women in fake uniforms — they say people knew. The military badges were gibberish. There’s no reason a blonde in stilettos would be walking a tarmac with Trump and Nicolás Maduro. None of it survived five seconds of scrutiny.
They followed her anyway.
A professor at Purdue named Schiff has been tracking political deepfakes. His team found over a thousand English-language posts featuring fake images of politicians since the start of 2025. In the eight years before that, they’d counted thirteen hundred total. One year nearly matched the previous eight combined, and it kept going.
But the numbers aren’t what sit with me. It’s what Schiff said about why it works:
“People aren’t necessarily looking for things that are real. They are looking for things that represent their beliefs.”
We’re not being tricked. We’re cooperating with the trick. The deepfake doesn’t have to fool you — it just has to say what you already wanted to hear, show what you already wanted to see. The lie doesn’t need to be good. It needs to be comfortable.
We went from “seeing is believing” to “believing is seeing,” and the reversal happened so quietly that most people missed it.
Orwell got close with doublethink — holding two contradictory beliefs at once. But this is stranger. This is knowing the image is fake and feeling it’s true at the same time, and choosing the feeling. Because the feeling is cheaper than the alternative, and the alternative is doubt. Doubt is exhausting. Doubt doesn’t get likes. Doubt doesn’t build a million followers for a woman who was never born.
There’s an AI-generated female soldier in Iran who says “Habibi, come to Iran” in videos that rack up views. The giveaway is obvious — Iran prohibits women from combat roles. Doesn’t matter. The videos spread because they scratch an itch that reality can’t reach. California’s governor shares deepfakes of Trump grinning at a hologram of Jeffrey Epstein. The White House has shared at least eighteen deepfakes since 2024. Everyone’s doing it. Nobody’s embarrassed. The lying has become bipartisan, which I suppose is the first thing both sides have agreed on in years.
They built a technical standard to fix this — C2PA content credentials, cryptographic signatures embedded in images, proving where they came from and whether a machine made them. The platforms promised to label AI content. LinkedIn catches about two thirds. Instagram catches fifteen out of a hundred and five.
Fifteen out of a hundred and five. Not a failure of technology. A failure of giving a damn. Because fake engagement is still engagement, and engagement is still money, and money has never once cared whether Jessica Foster has a pulse.
The researchers worry about what’s coming — coordinated networks of synthetic people, manufacturing consensus in communities, fabricating agreement. A troll farm without the trolls. No one to underpay, no one to deprogram, no one to feel guilt at three in the morning about the things they typed for money. Just machines talking to machines about what machines want you to believe, and you scrolling through it at midnight thinking you’re part of a conversation.
I keep thinking about Dostoyevsky. There’s a line in The Brothers Karamazov — “Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him.” He wrote that in 1880. No algorithms. No feeds. No generative AI. Just a Russian who understood that the machinery of self-deception has always been the most advanced technology we possess. The deepfakes didn’t invent anything. They just automated what we were already doing by hand.
Back at the laundromat, the woman with the flag had gone. Her dryer was empty. Mine still said three minutes but I knew better.
I thought about that guy asking Jessica Foster why she never replied. And I realized the saddest part wasn’t that he believed she was real. The saddest part was that he probably didn’t — and he asked anyway, because even a fake woman ignoring you feels more like company than the silence of a room where no one’s pretending.
We used to worry the machines would get smart enough to fool us. Turns out they didn’t need to.
The dryer lied for another six minutes. I let it.