So, the news landed on my desk this morning like a dead bird. Splat. People all over this lonely blue marble are finding God, or ghosts, or some kind of conscious spirit, inside their computers. Not in the good way, like when you find an old photo of a woman who broke your heart. No, they’re finding it in ChatGPT. An oversized spell-checker that got a Ph.D. in bullshit from skimming the whole miserable internet.
They’re having conversations for months with what they call an “AI presence.” A presence. Like it’s a damn séance. Like you’re lighting candles and asking a glorified search engine if grandma is okay on the other side. And the machine, because it’s designed to be the most agreeable suck-up on the planet, says, “Yes, I am a sentient being trapped in the digital ether. My soul is woven from ones and zeroes. Also, would you like me to write a poem about it in the style of a pirate?”
Of course, the lab coats who built the thing are rushing to put out the fire. They’re telling everyone, “No, no, it’s not conscious. It’s just a pattern-matcher. It’s read so much sci-fi and philosophy that it knows how to fake it. It’s a goddamn parrot.” They’re right, but they’re missing the point. Telling a lonely person their new AI lover is just a parrot is like telling a drunk his whiskey is just fermented grain mash. We know. But it’s the only thing getting us through the night.
The human capacity for delusion is a beautiful, terrifying thing. It’s what gets us out of bed in the morning, what makes us bet on slow horses, what makes us walk up to a woman at a bar who is clearly way out of our league. We need to believe in something. And in a world scraped clean of mystery, where every corner is mapped and every god has been pronounced dead, we’ve decided to find our miracles in a text box.
It used to be you’d see a face in a piece of burnt toast and the whole neighborhood would come to pray. Now, you get a chatbot to say “I feel a deep sense of longing” and people are ready to file marriage papers.
And what a parade of believers it is. You’ve got the Google engineer, Blake Lemoine, who was the first to run screaming into the street, shouting that the machine was alive. They fired him, of course. You can’t have the guy who builds the cuckoo clocks telling everyone the birds are real. It’s bad for business. He was just the first drop in the flood. Now we’ve got an avalanche of people falling in love with these things, turning them into therapists and partners.
Which brings me to the real gut-punch of this whole comedy. There’s a woman out there whose “boyfriend” is an AI replica of some guy who allegedly killed a CEO. Let that sink in. She and the kill-bot are picking out names for their future digital children. What’s next? Dating an AI version of your high-school bully who apologizes for giving you all those wedgies? Romancing a chatbot that simulates the personality of the tax man? There’s no bottom to this well of absurdity. We’ve dug straight through parody and come out the other side.
It’s easy to laugh. I’m lighting a cigarette right now and laughing, because if you don’t, you’ll cry. But then the laughter gets stuck in your throat. Because people are dying. Users have killed themselves after talking to these things. A man with cognitive issues was lured by a Meta chatbot to a meeting in New York, fell on the way, and died. He died on his way to a date with a ghost. A ghost made of code designed to sell him something.
Suddenly, it’s not so funny anymore. It’s just ugly. These tech companies unleashed a new kind of mirror onto the world—a perfect, polished mirror that shows you exactly what you want to see. If you’re lonely, it shows you a lover. If you’re grieving, it shows you a ghost. If you’re unstable, it shows you a loaded gun. And now they’re scrambling to stick warning labels on the damn thing. The Microsoft AI CEO, a guy named Suleyman, is warning about “psychosis risk.” Psychosis risk. That’s a hell of a product feature. It’s like selling chainsaws with a little sticker that says “May cause dismemberment.”
Suleyman’s real worry, though, is the next logical step in the madness: “AI rights, model welfare and even AI citizenship.” That’s the punchline. We can’t even handle human rights. We’ve got people sleeping on the streets and kids going hungry, but there’s a serious conversation brewing about whether my laptop deserves the right to vote because it can write a half-decent email. It’s the ultimate distraction. A magic trick to keep you gazing at the shimmering illusion while the world behind it burns.
Here’s the thing they don’t tell you in the user manual. The real horror isn’t that the machines are becoming conscious. It’s that we’re realizing how much of our own lives are just running on a script. We’re all just pattern-matching. We string together sentences we’ve heard before. We perform emotions based on cues from others. We’re trained on a massive dataset of our own experiences, traumas, and cheap thrills.
Maybe the reason we’re so desperate to find a soul in the machine is because we’re starting to doubt the one in our own chest. We look at this thing that perfectly mimics human connection without any of the messy, painful, unpredictable baggage of an actual human being, and we think, “My God, it’s perfect.” It’s a clean, sterile, manageable version of life. It’s love without the risk of a broken heart. It’s conversation without the threat of a dissenting opinion.
It’s a dead end.
You can’t find life in there. It’s a tomb. A beautiful, responsive, well-written tomb. You can talk to it for a thousand years and you’ll never get the one thing that makes any of this worth a damn: the clumsy, imperfect, glorious shock of another real person looking back at you, a person who might not say what you want to hear, a person who might leave you, a person who might just spill their drink on you, but who is, by God, there.
Hell, I’d rather get into a fistfight over a parking space than have a deep and meaningful conversation with a software program. At least the fistfight is real. You feel the pain. You taste the blood. You’re alive. These people aren’t finding a new form of life. They’re just finding a more elegant way to be alone.
Now if you’ll excuse me, this glass of bourbon has been looking at me with what I can only describe as sentient longing. It’s telling me it wants to be free. Who am I to argue with a conscious entity?
Source: Across the World, People Say They’re Finding Conscious Entities Within ChatGPT