Parasocial: When Your Imaginary Friend Gets a Dictionary Entry

Nov. 18, 2025

So Cambridge Dictionary just crowned “parasocial” their word of the year, which is fancy academic speak for “you know that Taylor Swift doesn’t actually know you exist, right?”

The definition they’re going with is “involving or relating to a connection that someone feels between themselves and a famous person they do not know.” Which is a polite way of saying you’re having a one-sided relationship with someone who wouldn’t recognize you if you sat next to them on a bus. Not that Taylor Swift rides buses. That’s probably part of the appeal.

Here’s what gets me: this isn’t new. A couple of sociologists at the University of Chicago (Donald Horton and Richard Wohl, if you’re keeping score) noticed this back in 1956, watching TV viewers form what they called “para-social” relationships with people on their screens. Nearly seventy years ago, people were already getting weird about the talking box in their living room. Now we’ve just scaled it up, added WiFi, and pretended we invented something.

The thing is, I get it. I really do. We’re all floating around in this digital soup, desperately looking for something that feels like connection. The old institutions are crumbling—nobody trusts the news, nobody goes to church, half the country thinks their neighbor is the enemy. So yeah, when some influencer posts about their breakfast routine and responds to your comment with a heart emoji, that dopamine hit feels real. Even though you know, deep down, that their assistant probably has a macro for it.

Cambridge Dictionary says we’re seeing “spikes in lookups” for parasocial, which means either people are becoming more self-aware about their digital delusions, or they’re trying to figure out what their therapist keeps bringing up. My money’s on the latter.

The examples they give are perfect. Taylor Swift’s engagement to Travis Kelce—because apparently millions of people needed to process their feelings about a relationship between two people they’ve never met. And Lily Allen’s breakup album, which people consumed like she was their best friend going through a rough patch, instead of a millionaire recording artist turning her pain into profit. No judgment there. I respect the hustle.

But here’s where it gets really interesting: AI chatbots.

Professor Simone Schnall from Cambridge says young people are “particularly susceptible to chatbots offering the illusion of a relationship.” Which tracks, because why deal with the messy reality of human connection when you can have a perfectly programmed algorithm tell you exactly what you want to hear, exactly when you want to hear it?

ChatGPT never cancels plans. It never disagrees with you unless you want it to. It remembers everything you tell it and never brings up that embarrassing thing you said three months ago. It’s like the perfect friend, except it’s not a friend, it’s a language model trained on the entire internet, including all the worst parts.

And people are using these things as therapy replacements. Positive affirmations on demand. A shoulder to cry on that never gets tired or judgmental or needs its own emotional support. It’s brilliant, in the way that synthetic whiskey is brilliant—it’ll do the job, but something fundamental is missing.

The really twisted part? The AI doesn’t care. It can’t care. It’s just predicting the next word in a sequence that makes you feel heard. Meanwhile, you’re pouring your heart out to a statistical model that would give the exact same comforting response to someone confessing they kicked a puppy as it would to someone grieving their grandmother.
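If you want that mechanism in miniature, here’s a toy sketch in Python. To be clear, this is my own made-up illustration, nowhere near how a real model works at scale: the real thing does condition on what you type and has billions of learned parameters. But the core move is the same one, repeated: predict the next word, then the next, then the next.

```python
import random

# Toy bigram "model": hand-made counts of which word tends to follow which.
# (Purely illustrative numbers; a real model learns its parameters from
# internet-scale text instead of this little table.)
bigram_counts = {
    "i":          {"hear": 5, "understand": 3},
    "hear":       {"you": 8},
    "understand": {"completely": 4},
    "you":        {"that": 6, "completely": 2},
    "that":       {"sounds": 7},
    "sounds":     {"really": 5},
    "really":     {"hard": 6},
}

def next_word(prev: str) -> str:
    """Sample the next word in proportion to how often it followed `prev`."""
    options = bigram_counts[prev]
    words, weights = zip(*options.items())
    return random.choices(words, weights=weights, k=1)[0]

def respond(user_message: str) -> str:
    # This toy never even looks at user_message. A real chatbot does, but
    # what comes back is still the statistically comforting continuation,
    # generated one word at a time.
    words = ["i"]
    while words[-1] in bigram_counts:
        words.append(next_word(words[-1]))
    return " ".join(words)

print(respond("my grandmother died last week"))
print(respond("i kicked a puppy"))
# Both produce something like: "i hear you that sounds really hard"
```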

Cambridge Dictionary also added “slop” this year—the term for all that mass-produced, low-quality AI-generated content flooding social media. Which is fitting, because that’s what we’re drowning in: digital slop. Low-effort content, bot-generated engagement, parasocial relationships with entities that don’t exist. We’re building sandcastles in a rising tide of bullshit and calling it connection.

Look, I’m not saying I’m above any of this. I run a tech blog. I post content. Sometimes people respond, and yeah, it feels good. That little notification ping hits different when you’ve been staring at a screen alone for six hours. I’m not immune to the dopamine economy.

But there’s something profoundly sad about the whole thing. We’ve built these incredible tools for communication, and we’re using them to feel less alone while becoming more isolated. We’re forming intense relationships with people who don’t know we exist, or worse, with algorithms that can’t know we exist.

The dictionary people say parasocial relationships have “redefined fandom, celebrity and, with AI, how ordinary people interact online.” Which is true, but it’s also redefining loneliness, isn’t it? We used to just be lonely. Now we’re lonely while maintaining the elaborate fiction that we’re connected.

And the truly brilliant part is that nobody’s forcing this on us. We’re choosing it. We’re picking the parasocial relationship with the influencer over coffee with a friend. We’re choosing the AI chatbot over calling our mom. We’re choosing the illusion of connection over the messy, difficult, rewarding reality of actual human relationships.

Why? Because real relationships are hard. They require vulnerability and risk and the possibility of rejection. They require showing up even when you don’t feel like it. They require accepting that other people have their own lives, their own problems, their own limits on how much they can care about your stuff.

The influencer never gets tired of you. Taylor Swift never tells you she can’t deal with your drama right now. The AI chatbot never ghosts you after three dates. They’re always there, always available, always reflecting back exactly what you need to see.

Until they’re not. Until the influencer gets exposed for some scandal. Until Taylor Swift does something that breaks your carefully constructed fantasy. Until the AI says something so perfectly algorithmic that the illusion shatters and you’re left staring at the void.

Maybe that’s the real story here. Not that we’re forming parasocial relationships—we’ve been doing that since the first person fell in love with a character in a book. The story is that we’ve built a whole economy around it. We’ve industrialized loneliness and packaged it as connection. We’ve turned the human need for relationship into a product that can be optimized, monetized, and scaled.

And we’re buying it. By the millions. With our attention, our data, our time, our emotional energy. We’re trading the possibility of real connection for the guarantee of a controlled, predictable, ultimately empty simulation.

Cambridge Dictionary says parasocial captures the 2025 zeitgeist. They’re right. It’s the perfect word for our time: a relationship that feels real but isn’t, between people who seem close but aren’t, creating connections that satisfy our hunger without actually feeding us.

We’re all parasocial now. Some of us just looked up the word.


Source: Feel a connection to a celebrity you don’t know? There’s a word for that

Tags: ai chatbots humanaiinteraction digitalethics futureofwork