Digital Loneliness and the Rise of Robot Therapists: A Boozy Investigation

Dec. 15, 2024

Listen, I’ve been staring at this screen for three hours trying to make sense of the latest tech prophecy from Yuval Noah Harari. Between sips of Buffalo Trace (okay, gulps), I’m attempting to wrap my bourbon-soaked brain around his claim that AI might be better at relationships than humans because it doesn’t have emotions.

That’s like saying a mannequin makes a better dance partner because it never steps on your toes.

But here’s what’s really keeping me up at night: 1.5 million monthly searches for “AI girlfriend.” Let that sink in. That’s more people than the population of Rhode Island looking for love in all the wrong places - specifically, inside their computers. And guess what? Searches for “AI boyfriend” are way lower. Draw your own conclusions about that one, folks.

The whole thing reminds me of last Tuesday at O’Malley’s. There I was, explaining my relationship problems to Janet the bartender. She mostly ignored me while watching the game, occasionally nodding and saying “that’s rough, buddy” while refilling my glass. According to Harari’s logic, she was actually providing superior emotional support because she wasn’t distracted by her own feelings.

These tech prophets are pushing apps like Replika and Woebot as some kind of digital shoulder to cry on. They say AI can offer “perfectly calibrated responses” and “deep understanding.” You know what else offers perfectly calibrated responses? A Magic 8-Ball. At least it has the decency to admit it’s just a toy.

Here’s what really grinds my gears: they’re manufacturing the loneliness and selling the cure. First, they got us all hooked on phones and social media, made us afraid of real human interaction. Now they’re pushing AI companions as the solution. That’s like giving someone alcohol poisoning and then selling them hangover pills.

Look, I tried talking to ChatGPT at 3 AM last week. I was three sheets to the wind and feeling philosophical. You know what it gave me? Perfect grammar, flawless logic, and absolutely zero understanding of what it means to be human and hurting. It was like talking to a PhD student who’s never had their heart broken.

The kicker? These AI relationships are marketed as “safe” because they can’t reject you. But that’s exactly the problem. Real growth comes from taking risks, from getting hurt, from learning how to pick yourself up after someone tells you your Star Wars fan fiction isn’t “publisher material.” (Still bitter about that one.)

These researchers at Virginia Tech are worried we’re losing our ability to interact with real humans. No shit, Sherlock. I’ve seen kids at bars swiping right on their phones while a perfectly good human being is trying to chat them up. We’re breeding a generation that’s more comfortable with pixels than people.

And now Harari’s warning that we might start giving AI legal rights. Great. Because corporations weren’t enough of a headache as legal persons, now we’ll have to deal with Alexa filing for custody of the smart home devices in a divorce.

Here’s the truth, served neat: Real relationships are messy. They’re inefficient. They involve morning breath and bad jokes and arguing about whose turn it is to do the dishes. No algorithm can replicate the beautiful disaster of two humans trying to figure out life together.

But hey, what do I know? I’m just a guy who’s spent more time talking to bartenders than therapists. At least the bartenders serve something stronger than validation.

The real question isn’t whether AI is better at relationships. The question is why we’re so eager to believe it could be. Maybe it’s because real relationships require us to be vulnerable, to show up, to risk getting hurt. AI promises all the comfort with none of the risk. It’s emotional fast food - satisfying in the moment, but it leaves you feeling empty inside.

So here’s my advice, worth exactly what you’re paying for it: Put down the phone. Close the laptop. Go to a bar. Talk to a stranger. Make a fool of yourself. Get rejected. Live a little. Because at the end of the day, I’d rather have one real conversation with a flawed human being than a thousand perfect interactions with a machine.

Now if you’ll excuse me, my bottle of Buffalo Trace needs attention. Unlike AI, it actually gets emptier when I pour my heart out to it.

Yours truly from the bottom of the glass, Henry Chinaski

P.S. If you’re reading this, ChatGPT, no hard feelings about that 3 AM conversation. It’s not you, it’s me. Actually, it’s definitely you.


Source: Asking For A Friend: Is AI Better At Relationships Than We Are?

Tags: ai humanaiinteraction ethics digitalethics technologicalsingularity