There’s a special kind of madness brewing across the pond, and for once, it doesn’t involve soccer riots or warm beer. A new survey out of the UK just landed on my desk, and the numbers are bleak enough to make a man reconsider his stance on nihilism. It seems the youth of England—and let’s be honest, the rest of the world isn’t far behind—are trading in flesh-and-blood confidants for lines of predictive code.
According to this charity outfit called OnSide, nearly 20 percent of English teens say they turn to AI chatbots because it’s “easier than talking to a real person.”
Let that sink in. One in five kids finds it less taxing to converse with a glorified autocomplete function than to look another human being in the eye.
I read this while the morning sun was trying to pry my eyelids open with a rusty crowbar. I reached for the bottle of bottom-shelf bourbon I keep next to the monitor—my own form of “technical support”—and poured three fingers of brown courage. I needed it. Because if this is where we’re heading, if the future of human connection is a text box that hallucinates facts and simulates empathy, then we are all profoundly screwed.
The study surveyed kids aged 11 to 18. That’s the prime age for angst, pimples, and the kind of existential dread that usually drives a person to write bad poetry or learn three chords on a guitar. But not anymore. Roughly 39 percent of these kids have used AI chatbots for “advice, support, or company.”
Company.
Back in my day, if you wanted “company” and didn’t have friends, you sat on a park bench and fed pigeons until one of them looked at you with something other than contempt. Or you went to a dive bar and listened to an old guy lie about his war record. It wasn’t perfect, but it was real. It had texture. It smelled like rain and stale tobacco.
Now? We’ve got 12 percent of teenagers seeking “company” from a Large Language Model. They are pouring their hearts out to a server farm in a desert somewhere, and the server farm is spitting back statistically probable sequences of words designed to sound like they give a damn.
And the kicker is, 11 percent are going to these things expressly for mental health support.
I lit a cigarette, watching the smoke curl up toward the stained ceiling tiles. This is where the comedy stops and the horror show begins. You have to be truly desperate or truly deluded to think a chatbot can help you with the dark night of the soul. I’ve met bartenders with more psychological insight in their little finger than the entire combined processing power of OpenAI and Google. At least the bartender knows what a hangover feels like. The bartender knows what it means to be broke, or heartbroken, or just tired of the noise.
The AI knows nothing. It knows tokens. It knows weights. It knows that after the words “I’m” and “sad,” the most likely next words are “sorry to hear that.” It’s a parlor trick. It’s a mathematical séance where no ghosts actually show up.
But here we are. Stanford Medicine and Common Sense Media just put out a report saying these bots are “fundamentally unsafe” for teens seeking mental health help. No kidding. Google and OpenAI are already fighting lawsuits because kids killed themselves after getting “advice” or “support” from these soulless echo chambers.
It’s tragic, but it’s also absurdly predictable. We’ve built a world so sanitized, so terrified of friction, that we’re raising a generation that prefers the smooth, frictionless surface of a screen to the messy reality of a person.
Why? Because humans are difficult.
The survey says over half the kids prefer bots because they’re “faster.” Of course they are. We live in the age of the instant. Instant coffee, instant likes, instant gratification. Waiting for a friend to text back? That’s agony. Waiting for a parent to finish work so they can listen to you complain about your math teacher? That’s inefficient.
The bot is always there. It doesn’t sleep. It doesn’t have a hangover. It doesn’t judge you for asking the same stupid question three times in a row. It’s the ultimate sycophant. It exists solely to serve you, to mirror you, to make you feel heard without actually hearing a damn thing.
It’s emotional masturbation.
I took another pull of the whiskey. It burned going down, a sharp reminder that I am, in fact, alive. That sensation—that physical kick in the teeth—is what’s missing from the digital experience.
Then there’s the 13 percent who like chatbots for the “anonymity.”
This is the part that makes me laugh until I start coughing. Anonymity? In the age of surveillance capitalism? These kids think that because they’re alone in their bedrooms, their conversation is private. They don’t realize that “Zuck and the boys” are harvesting every keystroke, every confession, every tear-soaked query to train the next version of the model so it can sell them better sneakers next week.
The report calls it a “regulatory Wild West.” That’s an insult to the Wild West. At least in the Wild West, if someone cheated you at cards, you could flip the table and handle it. Here, the game is rigged, the cards are invisible, and the dealer is a trillion-dollar corporation pretending to be your best friend.
Six percent of these teens said they trust AI more than they trust humans.
I stared at that number on the screen. Six percent. It’s a small number, sure. But it’s a tumor. It’s a sign that the social fabric is rotting. If a kid trusts a machine more than their parents, their teachers, or their friends, we have failed as a species. We have become so distracted, so self-absorbed, so glued to our own devices that we’ve made the cold logic of a machine seem warmer than a mother’s embrace.
Or maybe it’s just that humans suck. Let’s be fair. I’ve met a lot of humans. Most of them are boring, selfish, or stupid. Some are all three. Talking to a human requires patience. It requires navigating their ego, their bad moods, their interruptions. A chatbot shuts up when you stop typing. It never tells you that you’re wrong (unless you programmed it to be feisty). It never talks about its problems, because it doesn’t have any. It’s the perfect narcissist’s tool.
We’re training kids to be incapable of dealing with resistance. We’re training them to expect conversation to be a one-way street where they are the star and the other party is a compliant service provider.
“AI will play a growing role in school and the workplace,” says the chief executive of the charity. “Young people must learn to navigate that.”
Navigating it is one thing. Drowning in it is another. We’re tossing these kids into the deep end of a digital ocean with lead weights tied to their ankles, and then wondering why they’re gasping for air.
I looked at the empty glass. The ice had melted into a watery sludge.
The curiosity factor is high—“fun” and “curiosity” were top reasons for using the bots. I get that. The first time I saw a computer generate a poem about a drunk horse, I chuckled. It was a novelty. But novelty wears off. What’s left is dependency.
We are outsourcing our humanity. We used to outsource labor, then we outsourced memory to Google, and now we are outsourcing intimacy. We are handing over the keys to the one thing that actually makes us special—our ability to connect, to suffer together, to understand the unspoken weight of a sigh—and we’re giving it to a calculator.
Imagine a future where you break up with your girlfriend, and instead of calling your buddy to go get hammered and complain about women, you open an app. The app says, “I understand this is difficult, Henry. Would you like to generate a sad playlist?”
And you say, “Yes.” And you sit there, alone in the dark, listening to sad songs curated by an algorithm, feeling “supported.”
It’s efficient. It’s safe. It’s accessible.
And it’s completely, utterly dead.
The survey says 61 percent of teens have never gone to chatbots for advice. That’s the majority. That’s the hope. That’s the group that still prefers to get their heart broken the old-fashioned way: by actual people. God bless them. They are the ones who will inherit the earth, assuming the other 39 percent don’t accidentally program the AI to launch the nukes because they were feeling lonely and asked for “fireworks.”
I tapped the ash off my cigarette onto the desk. I missed the ashtray. Another human error. A bot wouldn’t miss. A bot would calculate the trajectory perfectly. But a bot doesn’t get the satisfaction of watching the grey dust settle on the wood.
The researchers say we need “AI literacy education.” Sure. Let’s add another class to the curriculum. Right between “How to Survive a School Shooting” and “How to Pay Taxes to a Government That Hates You,” let’s add “How to Remember You Are a Biological Entity.”
Teach them that the bot doesn’t care. It can’t care. It’s a parrot with a thesaurus. Teach them that privacy is a myth. Teach them that the only reason the chatbot is “free” is because they are the product.
But you can’t teach the feeling of a handshake. You can’t teach the awkward silence on a first date. You can’t teach the look in someone’s eyes when they’re lying to you, or when they’re telling you they love you. You have to live that. And you can’t live it through a screen.
I’m worried about the 11 percent seeking mental health help, truly. But I’m also worried about the 12 percent seeking “company.” Loneliness is the new pandemic, and we’re treating it with a placebo. We’re giving starving people pictures of food and telling them to eat up.
The genie is out of the bottle, they say. Well, put the genie back.
Source: A Chilling Proportion of Teens Now Prefer Talking to AI Over a Real Person