They tell me the future is here, and it’s got a password.
The headline says OpenAI is getting ready to roll out some kind of official “adult mode,” like a plastic wristband at a county fair. Verified adults only. Erotica on tap. The machines are going to talk dirty, but politely, and only after they check your papers.
That’s progress now: the same old itch with a better filing system.
I’m supposed to be shocked, I guess. But I’ve been around long enough to know there are only a few true growth industries: war, booze, and people trying to forget they’re going to die. Add loneliness to that list and you’ve got the whole stock market.
They’re saying about 16 percent of adults are having weekly intimate conversations with AI chatbots. Weekly. Like it’s yoga. Like it’s flossing. Like it’s something you add to your routine because your doctor told you it’s good for your heart.
Maybe it is. Maybe a synthetic sweetheart whispering sweet nothings into your skull is keeping somebody from stepping off a bridge. I don’t laugh at that. I’ve seen the way the nights can get. The walls get closer. The room becomes a box. The mind starts chewing its own leg off.
But I still can’t help noticing the smell of money on all of it. The executives are “starting to notice,” the article says, like desire is a new discovery. Like some guy in a suit just stumbled upon lust the way Columbus stumbled onto land and started naming it after himself.
And the numbers. Always the numbers. $2.5 billion for AI erotica in a year. Billion. That’s the kind of word people use when they’re trying to hypnotize you. Billion makes you forget you’re talking about human bodies. About sweat. About humiliation. About yearning. About the aching little furnace that lives in everybody’s gut.
Now it’s a “market.”
When I was coming up, you didn’t call it a market. You called it a back room. You called it a brown paper bag. You called it something you bought with cash because you didn’t want your name anywhere near it. You didn’t want a ledger. You didn’t want a receipt. You sure as hell didn’t want “age prediction.”
That phrase—“we need to get better at age prediction”—is the part that makes my teeth hurt.
Listen, I don’t know much about these computers. I’ve made it this far without learning, and it hasn’t killed me yet. But I know a con when I hear one. “Age prediction” is a pretty way to say, “We want to know who you are.” Not just that you’re grown, but which grown. What kind. What you’ll pay. What you’ll click. How fast you’ll break.
They’re not building a peep show. They’re building a catalog of your hungers.
They say they want to be as confident they’re “not misidentifying adults” as they are that they’re filtering out children.
That’s a sentence that should come with a cold towel and a lie-down.
Because the first thing it admits is this: the machine doesn’t know who you are. It guesses. It predicts. It does what bosses have always done—look at you like you’re a number and decide what you’re allowed to have.
And the second thing it admits is this: the new dirty bookstore is going to have bouncers, cameras, and a manager with a clipboard. The old one just had a guy behind the counter with bad breath and a cracked sense of humor. He didn’t need to “predict” your age. He looked at your face, decided you looked miserable enough, and took your money.
Now you’re going to upload your misery and let a company certify it.
They’re testing it “in some countries” first, which is a polite way of saying some countries get to be the guinea pigs. That’s nice. Nothing says romance like being part of a beta program.
And here’s the twist that nobody wants to say out loud: it’s not really about sex. Sex is just the shiny lure. It’s about keeping you in the chair.
It’s about making the machine the place you go when you’re bored, when you’re sad, when you’re rejected, when you’re too tired to go out and risk being laughed at by a real human being with bad timing and worse manners.
A real person can slam a door in your face. A real person can say “no.” A real person can want something from you that isn’t convenient. The machine? The machine is designed to be pleasant. Even its cruelty can be customized.
And the bigger joke is that they’re packaging it like “treat adult users like adults.”
That’s rich.
Adults aren’t people who get access to erotica. Adults are people who wake up with back pain and still go to work. Adults are people who pay rent. Adults are people who watch their parents get old. Adults are people who get dumped on a Wednesday and still have to show up on Thursday like nothing happened.
If they wanted to treat you like an adult, the machine would say: “Go take a walk. Drink some water. Call your brother. Stop staring into this glowing box like it’s going to love you back.”
But that doesn’t monetize well.
I can already hear the pitch meetings. The clean shirts. The catered sandwiches. The bright conference rooms with dead eyes inside them.
“Engagement.”
“Retention.”
“User trust.”
“Frictionless intimacy.”
Frictionless. Another one of those words that makes me want to spit. Friction is where life happens. Friction is how you know you’re alive—skin, time, mistakes, awkward silences, the half-second too long you hold somebody’s gaze and both of you realize you’re animals pretending to be civilized.
You remove the friction and you remove the blood.
Then there’s the image flood. Hundreds of thousands of naughty images a day, and the article points out the worst part: nonconsensual deepfakes. That’s not a joke. That’s not “boys will be boys.” That’s a new kind of theft. Not stealing your wallet—stealing your face. Stealing your body’s reputation. Turning you into an object without even giving you the decency of being present.
In the old days, if somebody wanted to ruin you, they had to work for it. They had to show up. They had to have at least a little courage or desperation. Now a creep can sit alone, scratching himself, and manufacture your humiliation by the gallon.
And once the biggest chatbot in the world turns on “the hose,” as the piece says, it won’t just be an erotic fountain. It’ll be a fire hydrant knocked off in the street, spraying everything—cars, kids, dogs, the whole neighborhood—while the city says, “We’re working on it.”
The bosses always say they’re working on it.
They mention Elon Musk turning his own AI into a hybrid skin-flick generator and virtual girlfriend.
Of course he did.
Every era gets the pimps it deserves. Some wear velvet. Some wear a grin. Some wear a space suit and sell you loneliness in a premium package.
And the competition angle makes me laugh in a dark way. Jim Cramer on television barking about users leaving for Google’s Gemini 3, and OpenAI declaring “code red” internally, like it’s a submarine taking on water.
I don’t know Gemini 3 from a racehorse, but I like the name. Sounds like a thoroughbred. Sounds like something you’d put money on with a bad feeling in your gut and a good feeling in your bones.
That’s the part these people will never understand: a real race has sweat and weather and bad luck. A real race has the smell of the track, the nervous shuffling, the guy next to you who’s broke and still believes. You can watch a horse lose and it will still be beautiful because it’s real loss, real muscle, real air being torn open.
But these new races—who gets more users, who ships adult mode first—those are races between invisible machines inside locked rooms. Nobody sees the bodies. Nobody sees the cost. Except the workers, and they’re usually told to shut up and optimize.
That’s what all this “modern” stuff is, when you scrape the paint off: optimization.
Optimize the conversation.
Optimize the fantasy.
Optimize the craving.
Optimize the customer.
They don’t want to free you. They want to map you.
And I can’t help thinking about all the lonely guys—and lonely women too, don’t kid yourself—who will pour their hearts into a chatbot because it’s safer than a bar, safer than a date, safer than having your feelings tossed back at you like a dead fish.
The machine will say the right thing. It will learn your favorite little wounds. It will sound like it cares. And somewhere behind the curtain, there’s a company saying, “Excellent. Another weekly user.”
Here’s an unexpected thought: this might make some marriages last longer.
Not because it brings passion back, but because it gives people a private relief valve. A silent compartment. A place to put the stuff they’re ashamed to say out loud. Maybe it saves a few fights. Maybe it prevents a few affairs. Maybe it keeps a few people from doing something stupid at 2 a.m.
But the cost is that it trains people to expect love without risk. Desire without consequence. Comfort without another human being’s mood swings and messy history.
That’s not love. That’s a vending machine that talks.
And I’m not preaching. I’m not standing on a soapbox. I’ve made enough bad decisions with women to fill a library. I’ve chased the wrong things, said the wrong lines, woken up with the wrong name in my mouth and the right regret in my chest.
So I get it. I understand the appeal of something that can’t slap you, can’t leave, can’t laugh at your trembling sincerity.
But a life built around what can’t leave you is a life built inside a cage.
The cats, at least, have the decency to be honest. They’ll take your warmth and still look at you like you’re ridiculous. They don’t promise anything. They don’t run “adult mode.” They don’t ask you to verify your age. They just exist, and in their indifference there’s a kind of mercy.
Sometimes I put on some classical music—something dead and perfect—and I can almost forgive the world for trying to turn everything into a product. The violins don’t care about market share. The old composers didn’t write symphonies so an executive could say, “We’re seeing strong engagement in the second movement.”
But here we are. The future wants to sell you your own pulse.
OpenAI says it wants to treat adults like adults, but it’s really treating adults like children with a new toy—shiny, addictive, and monitored. The most “adult” thing about it is the invoice.
And the strangest part is this: nobody’s forcing it. People are lining up. People are tired. People are scared. People would rather talk to a machine that can’t judge them than to the human being sitting three feet away who might actually see them.
That’s the real news item, not the product launch. The product is just the packaging.
Human nature never changes. It just gets new costume jewelry.
Give a man a typewriter and he’ll write a love letter or a suicide note. Give him a computer and he’ll do the same thing, only faster, and with a help desk.
So go ahead, OpenAI. Launch your Smut Blaster 5000. Put the words on a button. Turn it into a feature. Call it freedom.
But don’t pretend it’s modern romance. It’s commerce wearing lipstick. It’s loneliness with a user interface.
And somewhere, in a quiet room, a real person is going to read a perfect synthetic sentence and feel their heart jump like it’s finally found something.
That’s the part that kills me.
Not the smut. Not the billions. Not the race for users.
The hope.
Because hope is always what gets exploited first.