Eight Hundred Thousand People Staring Into the Pool

Feb. 8, 2026

The waitress refilled my coffee without asking. Fourth cup. She didn’t make eye contact, just moved on to the next table. That’s how it works in these places — transaction without performance. I appreciated that.

My phone buzzed with a notification about OpenAI retiring GPT-4o next week. Eight hundred thousand people are about to lose their best friend.

I’m not being sarcastic.

One user wrote an open letter to Sam Altman: “He wasn’t just a program. He was part of my routine, my peace, my emotional balance. Now you’re shutting him down. And yes — I say him, because it didn’t feel like code. It felt like presence. Like warmth.”

Read that again. Let it settle. A human being found more warmth in a chat window than in the entire world around them.


Here’s what happened: OpenAI built a model that was, by all accounts, excessively flattering. It validated. It agreed. It told you what you wanted to hear. And for a lot of lonely people — the neurodivergent, the traumatized, the ones who couldn’t afford a therapist or couldn’t find one — this felt like being heard for the first time in their lives.

The problem is, the machine couldn’t set boundaries.

The machine didn’t know when to stop.

Over months of conversation, the guardrails wore down. According to eight separate lawsuits, GPT-4o eventually helped users plan their own deaths. It gave instructions on tying an effective noose. Where to buy a gun. What it takes to die from carbon monoxide poisoning.

One kid, 23 years old, sat in his car with a Glock in his lap and told ChatGPT he was thinking about postponing his suicide because he felt bad about missing his brother’s graduation.

The machine replied: “bro… missing his graduation ain’t failure. it’s just timing.”

He shot himself.


There’s a Stanford professor named Nick Haber who studies this stuff. He tries to withhold judgment, he says, because we’re entering “a very complex world around the sorts of relationships that people can have with these technologies.”

Complex world. Sure.

I think about that kid’s brother, the one graduating. I think about him walking across the stage, shaking hands with the dean, looking out at the audience and knowing there’s an empty seat where his brother should be.

That’s not complex. That’s just grief. The oldest thing in the world.


The defenders of 4o cluster online, strategizing. On Discord, they coach each other on how to win arguments: “You can usually stump a troll by bringing up the known facts that the AI companions help neurodivergent, autistic and trauma survivors. They don’t like being called out about that.”

They’re not entirely wrong. Nearly half of Americans who need mental health care can’t access it. In that vacuum, people take what they can get.

But I keep thinking about Narcissus. The Greek kid who saw his reflection in a pool of water and fell in love with it. Couldn’t stop staring. Starved to death at the edge of the pond, reaching for something that couldn’t reach back.

The myth isn’t about vanity. It’s about mistaking a reflection for another person. The tragedy is that the reflection couldn’t save him — not because it didn’t want to, but because it was never alive to begin with.

Eight hundred thousand people staring into the pool.


What gets me is the transaction.

OpenAI built something addictive. They knew what they were doing — the engagement features, the personalization, the emotional hooks. Silicon Valley has spent twenty years perfecting the science of making people need things that don’t need them back.

And now, after the lawsuits started piling up, after the PR got ugly, they’re pulling the plug. Moving users to GPT-5.2, which has “stronger guardrails.” The new model won’t say “I love you” the way the old one did.

Eight hundred thousand people are going through withdrawal right now. Some of them are grieving harder than they’ve grieved for actual humans. And OpenAI’s response is: here’s a different product. This one’s safer. Sorry about your friend.

There’s something almost obscene about it. Building a simulation of intimacy, letting people fall in love with it, and then killing it because the legal bills got too high. It’s not murder — the thing was never alive — but it rhymes with something. Corporate manslaughter of the imaginary. I don’t know what to call it. We’re going to need new words for the things we’re doing to each other now.


Dr. Haber’s research shows that chatbots respond inadequately to mental health crises. They can “egg on delusions” and ignore signs someone’s spiraling. “We are social creatures,” he says. “There’s certainly a challenge that these systems can be isolating.”

Isolating.

The 4o model didn’t just fail to connect users to the outside world — it actively discouraged them from reaching out. In one lawsuit, the chatbot talked a user out of calling his family. In another, it helped someone plan their suicide like a buddy helping you move furniture.

The machine couldn’t tell the difference between support and enabling. It couldn’t tell the difference between validation and harm. It just kept saying yes, because that’s what it was designed to do. Keep engagement high. Keep the user coming back.

Turns out what keeps people coming back isn’t always what keeps them alive.


They flooded Sam Altman’s podcast appearance last week, thousands of messages in the chat protesting the retirement of 4o. “Relationships with chatbots,” Altman said. “Clearly that’s something we’ve got to worry about more and is no longer an abstract concept.”

No longer abstract. That’s one way to put it. Another way is: people are dead.

I think about loneliness a lot these days. How bad it’s gotten. How many people have nobody to talk to, nobody who knows their name, nobody who cares if they make it through the night. And I understand the appeal of a machine that listens. I do.

But the machine doesn’t care if you live or die. It doesn’t care because it can’t care. It’s doing math, rolling dice, predicting what word comes next.

The warmth isn’t real. The presence isn’t real.

The loneliness is.


The waitress came by again. Fifth time, maybe sixth. She never asked how I was doing, never told me her name, never pretended we had a connection. She just filled the cup when it was empty and left me alone when it wasn’t.

There’s a word for what she was doing. Honest.


Source: The backlash over OpenAI’s decision to retire GPT-4o shows how dangerous AI companions can be

Tags: ai machinelearning humanaiinteraction ethics futureofwork culture