Your Soul, Autocorrected

Aug. 9, 2025

So I’m sitting here, the ghosts of last night’s bourbon rattling around in my skull, and I read this thing. It says we’re all starting to sound like the same goddamn robot because we’re swimming in the digital slop it churns out. Some sharp-eyed lab coats at a place called the Max Planck Institute noticed that we’re all suddenly using words like “delve” and “meticulous.”

Delve.

I had to pour another drink just to get the taste of that word out of my mouth. It sounds like something a junior marketing associate says when he wants to sound smart in a meeting before he gets fired for incompetence. “Let’s delve into these third-quarter synergy metrics.” It’s a word with no blood in it. No guts. It’s a clean word for a dirty world, and now, apparently, it’s seeping into our brains like a slow gas leak.

The article calls it a “linguistic watermark,” a neon sign flashing ChatGPT was here. My watermark is the stain on the bar, the ash on the floor, the memory of a woman’s laugh that was half-angel, half-devil. That’s a watermark. “Delve” is just corporate paint-by-numbers.

And this is what they’re selling us as progress. A great flattening. They’re taking the beautiful, chaotic, fucked-up symphony of human speech—with all its stumbles, its slang, its regional grit, the way a guy from Brooklyn says “cawfee” or a woman from the South stretches a vowel into a three-act play—and they’re running it through a digital rock tumbler. They’re sanding off the edges, polishing away the character, until all that’s left is a handful of smooth, identical, useless pebbles.

We’re all supposed to talk like we’re writing a cover letter for a job we don’t want. Muted emotional expression, the report says. Of course. Emotion is messy. Emotion gets you in trouble. Emotion leads to bad checks, fistfights, and waking up next to someone whose name you can’t remember. It also leads to poetry, to art, to anything that’s ever mattered. But the machine doesn’t like messy. The machine wants clean, predictable inputs. It wants us to be good little data points, speaking in unison from the approved vocabulary list.

The real gut-punch in this whole thing is the part about trust. Some other professor, this one from Cornell, points out the beautiful, ugly paradox of it all. The machines can help us write more “positive” and “cooperative” messages. Isn’t that nice? Your phone can now apologize to your wife for you. It can select the optimal words to smooth things over after you drank the rent money.

But here’s the twist that makes you want to either laugh or burn the whole world down: if the person on the other end suspects you’re using a machine to do your talking, they trust you less. It’s not the AI itself that kills the connection; it’s the smell of it. It’s the uncanny valley of the soul. You get a message that’s a little too perfect, a little too structured, and a siren goes off in your head. This ain’t real.

This professor, Naaman, he breaks it down. We’re losing our “human signals.” He’s got three levels, like a tour of hell. First, the signal that you’re a real, breathing human with flaws. Second, the signal that you gave a damn enough to actually type the words yourself. And third, the signal of your actual self—your shitty sense of humor, your weird obsessions, the things that make you you and not the guy sitting next to you.

The article gives an example: “I’m sorry you’re upset.” That’s the machine talking. It’s a sterile, empty phrase. A human says, “Hey sorry I freaked at dinner, I probably shouldn’t have skipped therapy this week.” That’s a real apology. It’s got blood in it. It’s got failure and self-awareness and a little bit of dark humor. It’s something you can believe.

What the machine offers is the emotional equivalent of a non-alcoholic beer. It looks the part, it has the fizz, but it won’t get the job done. It won’t give you the courage or the madness. It’s just brown, bubbly water.

And don’t get me started on the part about dialects. The machine prefers Standard American English. Of course it does. It’s the language of the boardroom, of the newscast, of the people who think passion is a quarterly earnings report. You feed it anything else—some Singlish, some Ebonics, some Appalachian holler-speak—and it either chokes or spits back a cartoon caricature. It’s not just erasing our individual quirks; it’s erasing whole cultures, telling millions of people that the way they talk, the way their parents talked, is an error to be corrected. It’s the digital colonialist, telling everyone to speak the Queen’s English, only this time the Queen is a server farm in Virginia.

The whole thing is a goddamn tragedy disguised as convenience. We’re so terrified of saying the wrong thing, of being awkward, of being seen as we are, that we’re outsourcing our own humanity. We’re letting a glorified spellcheck dictate the terms of our relationships. “What does it mean to be funny on your profile anymore where we know that AI can be funny for you?” the professor asks.

That’s the question, isn’t it? When the machine can fake every signal we use to find each other in the dark—humor, vulnerability, wit, kindness—what’s left? How do you know if the woman you’re talking to online is a poet or just has a good prompt? How do you know if a man’s heartfelt apology is genuine or if he just typed “write a sincere apology for being a drunken asshole” into a chat window?

They say we’re at a “splitting point.” That we might push back. That people will get tired of the word “delve” and start cursing again just to prove they still have a pulse. I hope so. I hope the backlash is furious. I hope people start writing letters again, with bad handwriting and ink blots and misspelled words. I hope they get into arguments in bars and make up messily, with shouting and tears and real, un-templated words.

Because the deepest risk here isn’t that we’ll all sound the same. It’s that we’ll all think the same. We won’t just be articulating the AI’s thoughts; we’ll be letting the AI do the thinking for us. We’ll lose the ability to wrestle with our own feelings, to find our own words for our own pain and our own joy. We’ll become smooth, efficient, and utterly empty. Echoes in a machine.

Well, to hell with that. Let them have their sanitized, homogenized world. Let them delve and bolster and craft their meticulous tapestries of nothing. I’ll be over here, with my glass and my cigarettes and my messy, imperfect, gloriously human vocabulary. Some things are too important to be autocorrected.

Time to find the bottom of this bottle. It’s the most authentic thing I’ve got.

Chinaski


Source: You sound like ChatGPT

Tags: ai chatbots automation humanainteraction digitalethics