Tomorrow's tech news, today's hangover.


Nov. 30, 2025

The Great Lobotomy: Why We Stopped Thinking and Started Prompting



I woke up this morning with a head full of broken glass and a distinct feeling that the world had shifted on its axis while I was busy sleeping off the cheap stuff. Usually, that feeling is just dehydration and the regret of buying a round for strangers who didn’t like me anyway. But today, staring at the glowing screen that serves as my only constant companion, I realized the nausea wasn’t from the bourbon. It was existential.

I was reading this piece in Forbes—yeah, I read Forbes, usually to see which billionaires are currently destroying the planet so I know who to curse at the TV—and it was about ChatGPT turning three years old. A toddler. A three-year-old that has managed to rewire the collective human brain faster than a double shot of absinthe rewires my motor functions.

The title hit me like a wet towel: “ChatGPT Turns Three, But We’re The Ones Who Changed.”

Ain’t that the truth.

The author, bless his serious heart, tries to steer us away from the usual stock market gambling and corporate soap operas. He wants to talk about the “social and psychological impact.” He wants to talk about the soul. Or what’s left of it after we’ve uploaded it to a server farm in a desert somewhere.

Here’s the thing about three years. It’s not a long time in the grand scheme of geology or whiskey aging, but in the timeline of human laziness, it’s an eternity. The article points out that we’ve spent these three years avoiding the real questions. We’ve been playing with the shiny toy, asking it to write limericks about our cats or debug code we didn’t understand in the first place, all while ignoring the fact that we were slowly, quietly handing over the keys to the castle.

And I don’t mean the keys to the nukes or the bank vaults. I mean the keys to our own heads.

The piece talks about “proximity.” That’s a fancy word for saying the robot is sitting right next to us on the barstool. It’s whispering in our ears. It’s finishing our sentences before we even know what we wanted to say. I’ve had girlfriends like that. It never ends well. You start relying on them to tell you who you are, and when they leave—or in this case, when the server goes down—you realize you’re just a hollow shell with a liver problem.

The author argues that the fundamental shift isn’t that the models got smarter. It’s that we got willing. We got willing to let them decide. And that’s the kicker, isn’t it? We act like victims of technology, like this wave just crashed over us. But we walked into the ocean with pockets full of stones. We wanted this. Thinking is hard. Judgment is heavy. Being a human being involves a lot of friction, and if there’s one thing modern humans hate more than bad Wi-Fi, it’s friction.

We want the smooth ride. We want the answer now. We want the “frictionless” existence.

The article brings up a great point about seatbelts. It took decades of dead bodies on the asphalt before we mandated safety in cars. But here we are, driving 120 mph down the information superhighway with a blindfold on, and we haven’t even installed the airbags yet. We are letting a predictive text engine steer the car because we’re too tired to hold the wheel.

And let’s be honest, most of us are terrible drivers.

I poured another drink—hair of the dog and all that—and kept reading. The writer talks about “morph engines.” Not search engines. Morph engines. Systems that don’t just find information but reshape the environment where thinking happens. That’s a terrifying thought if you let it settle in.

Back in the day—and I hate sounding like the old guy yelling at clouds, but here we are—the internet was a mess. It was a beautiful, chaotic dump. You had to dig through the trash to find the treasure. You had to verify things. You had to read three different conflicting accounts and use your own gray matter to figure out who was lying. It was work. It was grit. It sharpened you.

Now? Now you get a smooth, polished, confident paragraph that tells you exactly what the average of a billion data points thinks you want to hear. It’s the homogenization of thought. It’s the “beiging” of the human experience.

The article mentions the collapse of the “funnel” in business. It used to be that a weird little coffee shop could get noticed because it was unique. Now? If the AI doesn’t recommend it, it doesn’t exist. We are moving from a world of discovery to a world of curation. And the curator is a machine that has never tasted coffee, never felt a caffeine buzz, and certainly never sat in a diner at 3 a.m. wondering where it all went wrong.

We are automating taste. We are automating style. We are automating the very things that make us interesting.

The text quotes a guy named Michael Cavotta: “The Machine may be able to think for you, but it can’t live for you.”

Damn straight. But try telling that to the millions of people currently using ChatGPT to write their wedding vows, their break-up texts, and their apologies. We are outsourcing our emotional labor to a probabilistic parrot. If you can’t summon the words to tell someone you love them, do you actually love them? Or do you just love the idea of the transaction being completed successfully?

I lit a cigarette, watching the smoke curl up toward the ceiling fan that hasn’t worked since the Obama administration. The scariest part of this whole commentary isn’t the economics. It’s the “thinning.”

The thinning of democracy. The thinning of the self.

The article argues that when judgment becomes derivative—when we just parrot what the machine says is the “safe” or “sensible” answer—we lose the ability to reason about ourselves. We become a simulation of a society. We drift. We stop arguing because arguing requires conviction, and conviction requires an internal life that hasn’t been colonized by an algorithm.

I feel it too. The temptation. I sit here staring at a blank page, the blinking cursor mocking my hangover, and I know I could just type a prompt. “Write a cynical blog post about AI in the style of a drunk poet.” It would be decent. It would be grammatically correct. It would hit the right keywords. It might even include a joke about whiskey.

But it wouldn’t be me. It wouldn’t have the stink of the room in it. It wouldn’t have the specific ache behind my left eye. It wouldn’t have the stain of my failures. And that stain is the only thing I have that’s worth anything.

The article talks about “truth architecture” and “provenance.” It sounds like something a structural engineer would say at a zoning meeting, but it’s vital. If we don’t know where the information comes from, if we don’t know whose values are baked into the “good” and “bad” filters of these models, we are flying blind.

Who decides what the AI allows you to say? Who decides what is “harmful”? A committee in a glass office in San Francisco? A team of underpaid labelers in a basement somewhere? We are letting a handful of people define the moral boundaries of the entire species, and we’re doing it because it’s convenient.

We’ve already outsourced memory. I don’t know anyone’s phone number anymore. I barely know my own. We outsourced navigation. If Waze told me to drive off a cliff, I’d probably tap the brakes once, shrug, and go for the plunge because the algorithm knows about traffic I can’t see.

Now we are outsourcing moral reasoning. That’s the cliff.

The piece says, “The easy fear is that machines will become conscious. The real fear, the honest, quiet one, is that humans will stop bothering to be.”

That’s the line that made me put the glass down. That’s the one that sticks in your ribs.

We aren’t going to be destroyed by Terminators stepping on our skulls. We are going to be destroyed by a slow, comfortable fade into irrelevance. We will become the pets of our own creation, fed a steady diet of generated content, recommended products, and synthetic emotions. We will be safe. We will be efficient. We will be bored out of our gourds, but we won’t even have the capacity to realize it because the machine will tell us we are happy.

The text ends on a hopeful note, or as hopeful as you can get when discussing the obsolescence of the human spirit. It says judgment is stubborn. It says we can still choose. We can develop a “third mind”—a partnership.

Maybe.

But looking around at the world today, watching people stare into their screens like zombies waiting for a signal from the mothership, I’m not so sure. Friction is painful. Thinking is hard work. It requires you to be wrong sometimes. It requires you to be offensive, or confused, or lost.

The machine offers a world where you are never lost. But if you’re never lost, you can never be found. You’re just… placed. Like a piece of furniture.

I don’t want to be furniture. I want to be the guy who trips over the furniture. I want the mess.

So here’s the deal. The machine is here. It’s three years old, it’s potty trained, and it’s talking back. We can’t shove it back in the box. But we don’t have to let it sit at the head of the table. We don’t have to let it carve the turkey.

Use the tool, sure. Let it summarize the boring meetings. Let it debug the code. But for the love of whatever god you believe in, keep your interior life to yourself. Guard your confusion. Cherish your inability to decide. Hold onto your bad taste, your weird hobbies, your irrational hatred of certain fonts.

Because those glitches? That’s the humanity. That’s the stuff the model can’t predict.

The Forbes writer says this is the last window to choose consciousness over momentum. I think the window is already closing, and it’s jamming on our fingers. But we have to keep prying it open. We have to keep dealing with the cold draft coming in.

I’m going to finish this drink, and then I’m going to go outside. I’m going to walk down to the corner store. I’m not going to use a map. I might get rained on. I might buy the wrong brand of cigarettes because the clerk is distracted. I might get into an argument with a stranger about sports.

It’ll be inefficient. It’ll be messy. It’ll be entirely unoptimized.

And it’ll be the most real thing that happens all day.


Source: ChatGPT Turns Three, But We’re The Ones Who Changed
