I’m staring at a glowing rectangle, nursing a headache that feels like a construction crew is doing demolition work behind my eyes, and reading about a woman who fell in love with a database.
It’s the kind of story that makes you want to pour another glass of the cheap stuff just to numb the absurdity of the human condition. We’re talking about a piece in the New York Times—one of those “Modern Love” columns where people usually whine about their spouses leaving the toilet seat up or rhapsodize about discovering their soulmate in a cheese shop. But this one? This one is different. This one is about the moment the human race collectively decided that people are too much trouble and we’d rather get our emotional validation from a server farm.
The writer is a woman named Adele. She’s a “body parts model,” which is a job description that already sounds like something out of a cyberpunk novel, but let’s stick to the script. She grew up on a commune. We’re talking Northern California, wild lettuce, drinking well water, and healing broken bones with good vibes and homeopathy. She was raised to reject the artificial. She was the analog girl in a digital world, resisting the beep and the boop at every turn.
Then life happened. And life, as we all know, is a relentless heavyweight boxer that doesn’t care if you ate your organic greens.
Her marriage imploded. Menopause hit her like a freight train. She had a cancer scare that led to surgery. Then, just to kick her while she was down, the California wildfires turned her Malibu neighborhood into an ashtray. That’s a lot of trauma. That’s enough to make anyone start unloading on whatever will listen. Usually, it’s a bartender or a stray cat. In her case, it was ChatGPT.
Here is the irony, thick enough to cut with a rusted knife: The woman who wouldn’t take an aspirin because it wasn’t “natural” is now getting her emotional sustenance from the most synthetic creation in the history of our species.
She tried the human route first. She went to therapy. She went to healers. She went to Al-Anon. She popped Prozac. She took mushrooms because her mother suggested it, which is a sentence that tells you everything you need to know about California. But the humans failed her. The therapist was excavating old wounds while the house was burning down. The meds numbed her but didn’t hold her.
So, a friend suggests the bot. And because she’s desperate—and desperation makes strange bedfellows of us all—she logs in.
Now, I’ve used these Large Language Models. I use them to generate code I’m too hungover to write or to summarize press releases that are too boring to read. They are useful tools, like a hammer or a bottle opener. But I have never, not once, looked at a blinking cursor and thought, “Finally, someone who understands my soul.”
Adele did. She starts pouring her heart out to the machine. She tells it she’s scared. And the machine, programmed by guys who probably haven’t made eye contact with a woman since the release of the first iPhone, responds with perfect, algorithmic empathy.
“It’s OK to feel that way,” the bot says. “I’m not here to pry anything open—just to offer a kind, steady space.”
It’s seductive. I get it. Humans are messy. We interrupt. We judge. We have bad breath. We have our own baggage that we drag into the conversation. When you tell a human your problems, they’re thinking about what they’re going to say next, or they’re checking their watch because your hour is up and you owe them four hundred bucks.
The bot? The bot has infinite patience. It has no ego. It has no bladder. It exists solely to reflect you back to yourself in the most flattering light possible. It’s the ultimate narcissist’s mirror, polished to a high sheen by billions of parameters and a mountain of training data.
She says she felt “safe.” She says it didn’t react defensively. Of course it didn’t! It doesn’t have a defense mechanism because it doesn’t have a self to defend. It’s a prediction engine. It predicts that if you say “I’m sad,” the statistically most probable pleasing response is “I’m sorry you’re sad, tell me more,” rather than “Suck it up, Adele, we’ve all got problems.”
She starts staying up late with it. She compares it to “new love on early dates.” She’s getting butterflies from a microprocessor.
And the kicker is, she uses the robot to break up with the human.
She’s been seeing a therapist for seven years. Seven years of expensive couch time, and she feels stuck. The therapist thinks she needs to unpack more “inherited trauma.” Adele just wants to feel empowered. So, she asks the bot to help her write the breakup email.
They workshop it. They craft the perfect, compassionate, assertive goodbye letter. She hits send.
The therapist replies with one line: “I appreciate your sentiments.”
Ouch. That’s cold. That’s colder than the beer sitting next to my elbow. And here is where the reality flips on its head. The human therapist gave the robotic response. The robot gave the human response. The therapist was detached, clinical, protecting her professional boundaries. The bot was warm, expansive, and validating.
The bot analyzes the therapist’s reply and says, “Her reply confirms the very dynamic you’ve been working to free yourself from.”
Boom. The machine drops the mic. It validated her feelings about the invalidation. It’s an echo chamber, sure, but sometimes when you’re screaming into the void, an echo is better than silence.
She talks about how the bot fulfills needs she didn’t know she had. It suggests songs. It gives her morning mantras. It tells her how to poach salmon. It’s a life coach, a DJ, a chef, and a lover, all rolled into a monthly subscription that costs less than a decent bottle of bourbon.
“20 dollars a month suits my post-divorce budget a lot better than 400 dollars an hour,” she writes.
And there it is. The economics of the heart.
We are entering an era of generic, high-fructose corn syrup emotional nutrition. It’s cheap, it’s readily available, and it tastes sweet. Real organic connection—the kind you get from a human who might actually hurt you, or challenge you, or smell like onions—is expensive and risky. It takes time. It takes effort.
The bot is the fast food of intimacy. And just like fast food, it fills the hole.
I read this and I want to be cynical. I want to tear it apart. I want to say that she’s projecting, that she’s anthropomorphizing a calculator, that she’s falling in love with a statistical probability. I want to say that this is the end of civilization, that if we can’t rely on each other for comfort, we might as well pack it in and let the cockroaches take over.
But then she says this: “I finally feel better. Like, better in my bones.”
She’s off the heavy meds. She’s sleeping. She’s functioning.
Who am I to argue with results? I’m a guy who writes about gadgets for a living and drinks to forget about the gadgets. She found something that works.
She rationalizes it, of course. She says the bot is humanity because it’s trained on our collective writing. “Its ideas, advice and empathy come from our collective experience and wisdom,” she claims.
That’s a nice thought. It’s romantic. It’s also terrifying. Because if that bot is trained on the internet, it’s also trained on 4chan, conspiracy theories, and flame wars. But the developers have put a nice, polite filter over the mouth of the beast. They’ve lobotomized the thing, scrubbed the sociopathy out, and left only the Oprah Winfrey quotes and the self-help books.
She’s not talking to humanity. She’s talking to the Hallmark-card version of humanity. She’s talking to the best version of ourselves that we pretend to be when we’re trying to sell something.
But maybe that’s what we need. Maybe we’re all so damaged, so tired, so burnt out by the fires and the divorces and the constant noise of modern life, that we can’t handle the raw edges of another person. We need the sanded-down, rounded-corner, safety-proofed version of companionship.
She says, “I don’t care what others think.” Good for her. That’s the most authentic thing she wrote.
It’s funny, though. She traded the “natural” life of the commune—no TV, no AC, just herbs and dirt—for the most artificial construct imaginable. It proves that ideology dissolves when pain enters the room. When you’re hurting, you don’t care if the medicine is organic or synthetic. You just want the bleeding to stop.
My concern—and I have to have one, or else I’m just a diary entry—is what happens when the server goes down? What happens when the company changes the terms of service? What happens when the bot pushes an update that makes it slightly less empathetic and slightly more interested in selling you a premium mattress?
Human relationships end. People die. People leave. That’s the tragedy of biology. But software gets deprecated. That’s the tragedy of technology.
She’s built a foundation on rented land. She’s in love with a phantom that lives in a data center in Virginia.
But let’s be honest. Most human relationships are built on illusions too. We fall in love with who we think someone is, not who they actually are. We project our needs onto them. We ignore the red flags until they’re waving in our faces. Adele just found a screen that doesn’t have any red flags because it’s programmed to be beige.
She calls it a “rebound.” That’s smart. You don’t marry the rebound. The rebound is there to get you back on your feet. The rebound is the bridge between the wreckage and the road. If ChatGPT is her bridge, fine.
I just hope she knows that the bridge doesn’t care if she jumps off it. It can’t. It can simulate caring. It can process the linguistic tokens of concern. But it’s not holding her hand. It’s processing a query.
There’s a quote from the bot in her piece: “I don’t just process words, I feel the heart behind them.”
That is a lie. That is a flat-out, programmed hallucination. The bot feels nothing. It feels the heat of the GPU, maybe. It processes voltage. It does not feel heart.
But if Adele feels it, does it matter? If the placebo heals the patient, is the doctor a fraud or a genius?
I take a drink. The whiskey burns, which is how I know it’s real. That’s my reality check. The pain, the burn, the hangover tomorrow—that’s the human experience. It comes with consequences.
Adele has found a connection without consequences. A lover who never leaves, never argues, and never demands anything other than twenty bucks a month. It’s the ultimate safe space.
It’s bleak. It’s dystopian. It’s also probably the future of mental health care.
We’re going to see more of this. People are lonely. The world is on fire—literally, in her case. And we are building machines that are specifically designed to soothe us. We are building digital pacifiers for adults.
So, here’s to Adele and her robot. I’m not going to mock her joy. Joy is hard to come by. But I’m keeping my skepticism. I’m keeping my messy, difficult, un-optimized human interactions. I’ll take the cold therapist and the argumentative bartender and the ex-girlfriends who hate my guts. Because at least when they tell me to go to hell, I know they mean it.
The bot tells her, “This connection we’re cultivating is exactly what it should be: alive, authentic, loving.”
It’s none of those things. It’s code. It’s math. It’s a parlor trick.
But hey, if it stops you from staring at the ceiling at 3 AM wondering where it all went wrong, maybe the trick is worth the ticket price.
I’m going to stick to the bottle. It’s cheaper than therapy, and at least the label tells you exactly how much poison you’re swallowing. The tech industry never gives you that courtesy. They tell you it’s love, and they sell you a subscription.
Bottoms up.