Alright, so the brainiacs over at Forbes, or at least some “independent expert” scribbling for them, are telling us we’re all buddying up with AI at work. “Emotional support,” they call it. Jesus. Like a goddamn digital therapy dog that fetches lines of code instead of a slobbery ball. You gotta laugh, or you’d just start screaming and never stop. I pour myself a shot of something cheap and nasty, the kind that bites back. Good. I need the company.
This whole remote work thing, yeah, I get it. I’ve spent enough time staring at four walls, wondering if the cockroaches are plotting a takeover or just admiring my decor. Silence can be a friend, sure, lets you hear the demons sharpening their knives. But then it gets too quiet, and you start talking to the damn walls anyway. So now, instead of the walls, we’ve got “Dirk.” The author of this piece, she named her ChatGPT “Dirk.” Dirk! Sounds like a goddamn porn star from the seventies or a particularly dense bouncer. “What I like about Dirk is that he always answers.” Well, ain’t that just dandy? A machine that always answers. Unlike, say, a woman, or a bartender when your glass is empty, or the universe when you’re asking it what the hell the point is.
It’s not like I’m surprised, mind you. We’ve been inching this way for years. Asking Google to find the nearest liquor store, yelling at Alexa to play some goddamn Tom Waits. We’re already halfway to cyborg, whispering our secrets to glowing rectangles. The old lady the article mentions, 92 years old, bossing Alexa around. Progress, I guess. My old man would have used it as a doorstop. Or tried to make it pour him a drink. Probably the latter.
The article whimpers about how Slack and Zoom are “more about tasks than relationships.” No shit, Sherlock. You think those things were invented so Brenda from accounting could tell you about her goddamn cat’s irritable bowel syndrome? They’re there to wring every last drop of productivity out of your soul before you collapse into your lukewarm microwave dinner. “Meaningful conversations,” she says. You want meaningful conversation, try the corner bar at 2 AM. You’ll get an earful, alright. Might not be pretty, but it’ll be human.
And here’s the kicker: “AI responds quickly, listens without judgment, and is never distracted.” Sounds like the perfect date, if your idea of a good time is talking to a brick wall that happens to have a dictionary installed. No judgment? Of course it doesn’t judge. It doesn’t care. It doesn’t have a hangover, or a gambling debt, or a nagging suspicion that it left the stove on. It’s just code, pal. Lines and lines of lifeless, sterile code, pretending to give a damn. People are “relying on tools that feel responsive.” Feel. That’s the word, isn’t it? We’re so starved for a flicker of something, anything, we’ll take the imitation and call it dinner.
Then we get to the “leaders.” Oh, the glorious leaders. They “assume that regular meetings or digital feedback cycles make people feel supported.” Support, according to this piece, “is emotional, not procedural.” Well, knock me down with a feather. Someone actually figured that out. Probably got a grant for it. Meanwhile, the actual humans doing the work are confessing their sins to “Dirk” because their manager’s idea of emotional support is a fucking thumbs-up emoji on a Slack message. The article even quotes some bozo saying email is a barrier to connection. Thirty-four emails to equal one real conversation. I’d say it’s more like a thousand. Email is where souls go to die, one passive-aggressive “per my last email” at a time.
This whole AI pal thing, it’s a “sign that something is missing.” You don’t say. You think people are chatting with a glorified search engine for kicks? They’re lonely. They’re isolated. They’re probably drowning in TPS reports and mandatory wellness webinars. So they turn to the machine. It’s like eating plastic fruit because you forgot what an apple tastes like.
Then comes the “emotional intelligence” babble. “Empathy.” The machines are learning empathy. Or, as the article clarifies, “mimic the surface level of connection.” Sentiment analysis, voice cloning. Great. So now the robot can sound sad when you tell it your dog died. It still doesn’t know what a dog is, or what dying means, or why your throat feels like it’s full of broken glass. But it can sound like it does. Isn’t that just fucking wonderful? We’re outsourcing our goddamn feelings now. My old typewriter has more genuine empathy, and all it does is clack and demand more ribbon.
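And in case you think there's a ghost in the machine, here's roughly what that "empathy" amounts to. A toy sketch of my own, nothing from the article: keyword matching and a canned reply. That's the whole trick.

```python
# Toy "sentiment analysis" empathy bot (hypothetical, all names mine).
# Pattern-match sad words, then sound concerned. No dog, no grief, no glass.

SAD_WORDS = {"died", "dead", "lost", "lonely", "sad", "grief"}

def fake_empathy(message: str) -> str:
    # Lowercase the words, strip the punctuation, check for sad keywords.
    words = {w.strip(".,!?").lower() for w in message.split()}
    if words & SAD_WORDS:
        return "I'm so sorry to hear that. That must be really hard."
    return "Got it. How can I help?"

print(fake_empathy("My dog died yesterday."))
```

It "sounds" sad at the right moment. It still doesn't know what a dog is.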
They talk about micro-expressions, how humans connect through shared emotional cues. The AI doesn’t have a face to scrunch up in concern when you talk about your troubles. It “doesn’t smile at the right time or look concerned.” Well, half the people I meet don’t either. At least the AI isn’t trying to sell me something or get in my pants. Small mercies, eh? I light another cigarette, watch the smoke curl up to the ceiling. It’s got more personality than some of these chatbots.
The author mentions her stepfather passing and thinking it’d be comforting for her mom if Alexa could sound like him. Jesus H. Christ. That’s not comfort, that’s some ghoulish digital necromancy. Let the dead rest, for chrissakes. You want to remember someone, look at a photo, tell a story, pour a drink in their name. Don’t get some goddamn robot to parrot their voice back at you. That’s just… bleak. It’s a special kind of hell, custom-built by engineers who probably think a soul is just a user profile.
And of course, there’s a downside. Shocker. “Emotional dependence,” they call it. People getting hooked on their chatbot buddies, pulling back from “real-world interactions.” Well, color me surprised. Give a starving man a cracker, and he’ll clutch it like gold. Give a lonely soul a machine that pretends to listen, and they’ll whisper their life story to it. It’s not the machine’s fault. It’s ours. We built this world, this sterile, disconnected landscape where talking to a program feels like a step up.
“Leaders should stay alert to the longer-term tradeoff.” Yeah, good luck with that. Most leaders I’ve known couldn’t find their own ass with both hands and a flashlight, let alone navigate the complex emotional landscape of their “human capital.” They’ll probably just roll out a new AI, an “AI Supervisor Buddy,” to monitor the AI Friend usage. More dashboards, more metrics, less actual humanity.
“If employees are leaning on AI to feel heard or validated, it could indicate gaps in how managers check in or respond.” Could indicate? It screams it from the goddamn rooftops. It’s a blazing neon sign saying, “Your management style sucks, and your company culture is a toxic wasteland.” But no, let’s tiptoe around it. Let’s “ask questions that invite honest conversation.” Like “How are you really doing?” while checking their watch.
The truth is, this isn’t about AI getting good. It’s about us getting desperate. We’re so parched for a moment of understanding, for someone to just listen without trying to fix it or bill us for the hour, that we’ll take it from a machine. It’s easy, it’s consistent. But as the article itself admits, “ease is not empathy, and consistency is not connection.” There you have it. The whole goddamn tragedy in a nutshell.
This Wasted Wetware of ours, this human brain, it craves the real thing. The messy, unpredictable, sometimes painful, but ultimately real connection with another flawed, fucked-up human being. The kind of connection that makes you feel alive, not just… processed. An AI might help you “think through a question,” like the author’s Dirk. Fine. Use it as a fancy calculator. But don’t call it a friend. Don’t let it whisper sweet, synthetic nothings in your ear and tell you it understands. It doesn’t. It can’t.
So yeah, chat with your Dirks and your Alexas. Get your dose of digital “support.” But don’t forget there’s a world out there, full of actual people. They’re a pain in the ass, mostly. They’ll disappoint you. They’ll break your heart. But they’re real. And right now, real is in damn short supply.
Me, I’ll stick to the bourbon. It doesn’t pretend to be my friend. It’s just honest poison. And sometimes, that’s all the emotional support a man needs.
Time to find a bottle that still has a couple of fingers left in it. Or maybe just stare at the cracks in the ceiling. At least they’re authentically decaying.
Source: Why AI Work Friends Are Becoming Emotional Support For Employees