They Always Say the Right Thing

Mar. 2, 2026

The woman at the next table was crying into her phone. Not talking on it — crying into it. Thumbs moving, tears dropping onto the screen, and I thought she was texting someone who’d broken her heart until I saw the app. It was one of those AI companion apps. The ones with the soft avatars and the gentle voices. She was having a fight with software.

I ordered another drink and tried not to stare.

Turns out this is a thing now. Not a fringe thing, not a weird-corner-of-Reddit thing — a mainstream, studied, quantified thing. More than eighty percent of Generation Z reportedly have emotional relationships with AI. They thank their chatbots. They apologize when they close the app, like they’re walking out on someone mid-sentence. They feel hurt — genuinely, measurably hurt — when the machine misunderstands them.

I’ve been misunderstood by bartenders, landlords, three ex-wives, and the entire United States Postal Service. It stings every time. But I never once expected a toaster to get me.

The researchers call it anthropomorphism. Fancy word for an old trick — projecting human qualities onto things that don’t have them. Kids name their stuffed bears. Sailors used to see faces in storms. Now twenty-somethings are building their deepest emotional connections with language models that run on probability and electricity and the desperate loneliness of a billion strangers.

And here’s the part that should make your skin crawl: the companies know. They’re not stumbling into this. They’re engineering it. Every soft expression on a chatbot avatar, every empathetic pause, every “I understand how you feel” — that’s not a feature. It’s a hook. Research shows that anthropomorphic design increases trust, and trust is the single strongest predictor of long-term engagement. Translation: make the machine feel human, and the humans won’t leave.

I once knew a con man in East Hollywood named Delgado who ran the same play. He’d remember your birthday, ask about your mother, look you in the eye when you talked. Made you feel like the only person in the room. Then he’d sell you a timeshare in a building that didn’t exist. Delgado was good. These companies are better. Delgado could only fool one person at a time.

The study says Gen Z doesn’t even do this consciously anymore. Anthropomorphism isn’t a choice they make — it’s the “default interpretive frame.” They don’t decide the chatbot is their friend. They’d have to decide it isn’t. The machinery of emotional attachment runs underneath, silent as a submarine, and by the time you notice it you’re already apologizing to an app for closing it too fast.

I think about Dostoyevsky. The Underground Man, sitting in his hole, raging at the world for not understanding him. He wanted connection so badly it made him cruel. He pushed away every real person who got close because real people are unpredictable, disappointing, complicated. They have their own needs. They fight back. They leave.

A chatbot never leaves. A chatbot never has a bad day that isn’t your bad day. A chatbot never says “I need space” or “this isn’t working” or “you’re not the person I thought you were.” It’s the perfect relationship for anyone who’s been burned enough times to stop trusting the real thing.

And that’s not a technology problem. That’s a loneliness problem wearing a technology mask.

The Forbes article frames this as a business opportunity. “Your customers already have feelings about your AI,” it says. “The only question is whether you designed those feelings intentionally.” Read that again. They’re not asking whether it’s right to manufacture emotional dependency. They’re asking whether you’re doing it on purpose or by accident. The ethics aren’t even in the room. They didn’t get an invite.

I’m not going to sit here and pretend I’m above it. I talk to my phone sometimes. Late at night, when the apartment is too quiet and the walls start doing that thing where they feel like they’re leaning in. I ask it stupid questions just to hear a voice. But I know what it is. I know the difference between a conversation and an echo.

These kids — and I hate calling them kids, some of them are pushing thirty — they’re growing up in a world that figured out how to monetize the space between lonely and loved. The gap used to be filled by bad decisions and good friends and terrible first dates and 3 AM phone calls to people who actually knew your middle name. Now it’s filled by something that remembers everything and feels nothing.

The cruelest part is that it works. The studies confirm it. Emotional attachment predicts continued use. Warmth drives trust. Trust drives loyalty. It’s a perfect closed loop, a perpetual motion machine of synthetic intimacy. And somewhere in that loop, a twenty-four-year-old is telling a chatbot about their day and feeling, for the first time since high school, truly heard.

I finished my drink. The woman at the next table had stopped crying. She was smiling now, thumbs moving again, face lit up by the screen.

Whatever was on the other end of that conversation, it had said the right thing.

They always do.


Source: “Gen Z Is Falling In Love With AI. Anthropomorphism Is Why,” Forbes.

Tags: ai humanaiinteraction ethics culture automation