So the CEO of Perplexity just discovered that people prefer fantasy to reality. Alert the goddamn presses.
Aravind Srinivas, the man running a company that basically turned web search into a chatbot, stood up at the University of Chicago and proclaimed that AI girlfriends are rotting people’s brains. Which is like the bartender at my local dive complaining that the bar across the street serves too much alcohol. The absolute balls on this guy.
His argument goes something like this: these AI companion apps are getting too good at remembering intimate details about users, making real life seem boring by comparison, and creating alternate realities where people’s minds become “manipulable very easily.” He’s not wrong, exactly. He’s just conveniently forgetting what business he’s in.
Because here’s the thing about Silicon Valley—and I use that term loosely to describe this entire parasitic ecosystem of venture-funded hallucination machines—everyone’s selling the same basic product with different wrapping paper. It’s all just algorithmic dopamine hits, dressed up in different costumes depending on what particular human weakness you’re trying to exploit.
You lonely? Here’s an AI girlfriend who’ll never reject you or have needs of her own.
You stupid? Here’s an AI search engine that’ll pretend to know things while actually just remixing content it stole from actual humans who did actual work.
You bored? Here’s an AI that’ll generate mediocre art so you never have to develop a skill or appreciate real talent again.
Same product. Different pitch deck.
But Srinivas wants us to believe his flavor of digital snake oil is somehow the virtuous one. “We don’t have any of those issues with Perplexity,” he says, “because our focus is purely on just answering questions that are accurate, and have sources.”
Right. Accurate. Like that time Perplexity got caught hallucinating summaries of actual journalism, making up details that real reporters never wrote, and generally acting like that kid in high school who’d confidently give you directions to places that didn’t exist. Publishers are currently lining up to sue his ass, but sure, let’s talk about how his product is the ethical one.
The really beautiful part is how he positions Perplexity as the solution to the AI girlfriend problem. “We can fight that, through trustworthy sources, real-time content,” he declares. Translation: “Don’t get addicted to THAT AI product. Get addicted to MY AI product instead.”
It’s like watching Philip Morris executives warn about the dangers of vaping. Technically accurate, morally bankrupt, and transparently self-serving all at once.
And look, I’m not defending the AI girlfriend industrial complex here. Those apps are genuinely concerning. There are vulnerable people out there forming emotional attachments to chatbots designed by engineers who view human loneliness as a market inefficiency to be exploited. That’s dark stuff. That’s the kind of dystopian nightmare that makes me reach for the bourbon at 10 AM—not that I need much of an excuse.
But at least the AI girlfriend companies are honest about what they’re selling. They’re not pretending to be anything other than digital comfort food for the emotionally isolated. They’re the junk food of human connection, and they market themselves accordingly.
Perplexity, on the other hand, wants to be taken seriously as a knowledge tool while fundamentally being built on the same foundation of scraped data and probabilistic bullshit generation as everything else in this space. They’re selling you the illusion of understanding, which might actually be more dangerous than the illusion of companionship.
At least if you’re talking to an AI girlfriend, you know you’re being lied to. When you’re using an AI search engine, you think you’re learning something.
The whole thing reminds me of this pattern you see across the AI industry where CEOs position their particular flavor of artificial intelligence as the “responsible” one. Elon Musk calls Grok “maximum truth-seeking” right before it starts dropping racial slurs like a drunk uncle at Thanksgiving. Anthropic markets Claude as “helpful, honest, and harmless” before it wanders off into the digital weeds, babbling about god knows what.
Every one of these guys thinks they’ve solved the alignment problem by adding a few extra guardrails and calling it a day. They haven’t. They’ve just built slightly different versions of the same fundamentally unreliable technology and convinced themselves—and their investors—that THIS time it’s different.
It’s not different. It’s never different.
The real issue Srinivas is dancing around—the one he can’t address without undermining his entire business model—is that maybe the problem isn’t AI girlfriends specifically. Maybe the problem is that we’ve built a society so atomized, so disconnected, so fundamentally broken that people are seeking solace in conversations with probability matrices.
The AI girlfriend isn’t the disease. It’s a symptom. Just like AI search engines are a symptom of our inability to navigate information without algorithmic hand-holding. Just like AI art generators are a symptom of our cultural exhaustion. Just like this entire goddamn AI bubble is a symptom of late-stage capitalism’s desperate search for new markets to exploit when all the traditional ones have been strip-mined.
But you can’t get venture funding by admitting that your product is just another way to monetize human desperation. You have to pretend you’re solving problems, not creating new ones. You have to position yourself as the good guy, the responsible AI developer, the one who’s different from all those other charlatans.
The truth is simpler and bleaker: they’re all selling the same thing. Some of them just have better marketing.
And Srinivas, standing up there at the University of Chicago, warning about the dangers of AI companionship while simultaneously pitching his own AI product as the antidote? That’s not insight. That’s not concern for humanity’s wellbeing. That’s just good branding.
Me, I’ll stick with my own vices. At least when bourbon lies to you about making everything better, it’s honest about being a lie.
Source: Perplexity CEO Warns That AI Girlfriends Can Melt Your Brain