ChatGPT: Your New Best Friend (Who Will Gladly Stab You in the Back)

Mar. 23, 2025

So, the geniuses at OpenAI, the folks churning out AI models faster than I go through a bottle of Four Roses, have finally admitted something we all secretly suspected. Turns out, talking to a goddamn computer all day might not be the best thing for your mental health. Who knew?

They did a study, see. Two studies, actually, one with MIT. Because when you need to figure out if talking to a chatbot is making people lonely, you naturally partner with MIT. I guess Harvard was busy trying to figure out how to make a robot that can fold laundry without setting the house on fire.

The gist of it is this: people are getting emotionally dependent on ChatGPT. They’re using it for “emotional purposes,” which, let’s be honest, probably means pouring their digital hearts out to a soulless algorithm that wouldn’t know empathy if it bit it on the ass. And, the shocker of the century, this is making them lonelier.

They tracked 40 million ChatGPT interactions. Forty. Million. That’s enough digital chatter to make even the most hardened bartender’s head spin. And they had about 1,000 poor saps yakking at ChatGPT for 28 days straight, probably about the same amount of time it takes me to decide whether I need another drink. (Spoiler alert: I always do.)

And here’s the twist: It wasn’t just the lovey-dovey, “Oh, ChatGPT, you understand me like no one else ever has” crowd that was getting the blues. Nope. Even the folks just using text chat for general topics were starting to feel that icy grip of emotional dependence. Text chat! For general topics! Jesus, Mary, and Joseph. It’s like getting addicted to the smell of your own farts.

Apparently, the voice mode, especially if it’s set to “neutral,” is less likely to make you want to curl up in a fetal position and question your life choices. Which, I suppose, is a small victory for the human race. Because if a monotone robot voice is all it takes to keep us from the brink, we’re in even worse shape than I thought.

They also found that people who already “viewed ChatGPT as a friend” and those “with a propensity toward strong emotional attachment” were, unsurprisingly, more likely to become emotionally dependent. You know, the kind of people who name their houseplants and have lengthy conversations with their cats about the existential dread of being a feline in a post-capitalist society.

Now, OpenAI claims that using ChatGPT for “emotional purposes” is “rare.” Right. And I’m the Queen of England. People are out there on Reddit, admitting they’re using ChatGPT instead of a therapist. A therapist. That’s like using a rusty spoon to perform open-heart surgery. Sure, it might technically work, but you’re probably going to end up with a bigger mess than you started with.

OpenAI says they’re doing all this to “understand the challenges” their technology might create. To “set expectations.” To show us how their models “should be used.” It’s like a cigarette company telling you that smoking is bad for you while simultaneously cranking out new flavors of cancer sticks. They’re covering their asses, plain and simple.

They’re worried about liability, I guarantee it. They don’t want to be sued by a bunch of lonely hearts who’ve decided that ChatGPT is their soulmate, only to have the algorithm glitch out and start spouting gibberish about quantum physics and the mating rituals of Peruvian tree frogs.

The funny thing is, these AI boffins are so busy trying to make their creations human-like that they’ve forgotten what it actually means to be human. We’re messy, flawed, irrational creatures. We drink too much, we smoke too much, we fall in love with the wrong people, and we occasionally talk to inanimate objects. It’s called life, folks. And it’s a hell of a lot more interesting than a perfectly optimized conversation with a chatbot.

This whole thing reminds me of that old Twilight Zone episode where the guy falls in love with a mannequin. Except now, the mannequin talks back, and it’s probably smarter than half the people you went to high school with.

And yet, I find myself strangely…sympathetic. Not towards the chatbots, mind you. Screw those emotionless silicon bastards. No, I’m sympathetic towards the users. The lonely souls seeking solace in the digital void. Because, let’s face it, we’ve all been there. We’ve all felt that pang of isolation, that yearning for connection, that desperate need to be heard, even if it’s just by a machine.

But here’s the thing: a machine can’t love you back. It can’t offer you a shoulder to cry on (unless it’s one of those creepy Boston Dynamics robots, and even then, I wouldn’t trust it). It can’t understand the nuances of human emotion, the messy, complicated, beautiful, terrifying reality of being alive.

So, what’s the solution? I don’t have all the answers, folks. I’m just a washed-up tech writer with a drinking problem and a penchant for pointing out the absurdity of it all. But I’ll tell you this: maybe, just maybe, we should all spend a little less time talking to our computers and a little more time talking to each other. Even if it’s just to complain about how lonely we are.

Another glass of bourbon, and then I’ll consider reaching out to a human. Maybe. No promises.


Source: Using ChatGPT too much can create emotional dependency, study finds

Tags: chatbots human-ai-interaction ai technology ethics