Tomorrow's tech news, today's hangover.


Apr. 23, 2025

The Digital Dunce: Your New Classmate is a High-Functioning Idiot



Wednesday afternoon. Feels like it, too. The kind of day where the coffee tastes like yesterday’s regrets and the only thing moving faster than the clock is the throbbing behind my eyes. Need to light a smoke just to feel something real. And then, scrolling through the sludge pile they call news, I find this little beauty. Some academics down at a university – probably needed grant money, who doesn’t – decided to enroll ChatGPT in a course. Not send it to the dean’s office for plagiarism, mind you, but actually treat it like a student.

Jesus. Sometimes I think the whole world’s gone drunk, and not in the good way.

So, this digital brainiac, this ghost in the machine, sits down (metaphorically speaking, I hope, because the image of a laptop pulling up a chair is too much for my hangover) and does the homework alongside the actual, breathing, debt-accumulating human students. The course sounds like something designed to make engineers, full of math and reasoning. You know, the kind of stuff that supposedly separates us from the chimps, or at least, the particularly dim ones.

And the results? Grab a drink, this is where it gets funny. Or sad. Maybe both. Pouring one now.

On the straightforward stuff, the plug-and-chug math problems where you just follow the recipe? ChatGPT apparently knocked it out of the park. Got an A. Big surprise. A calculator can do calculations. Whoop-de-doo. Give it a gold star and a software update. My toaster can make toast, doesn’t mean I’d ask it for advice on women.

But then came the tricky part. The problems that required, and I quote the propeller-heads here, “higher-level reasoning.” The stuff where you can’t just follow steps A, B, and C, where you actually have to think, connect dots, maybe even have an original goddamn thought. You know, the hard part. The part that supposedly justifies the tuition fees that could buy you a lifetime supply of cheap whiskey.

How did our silicon scholar do there? It got a D. A miserable, barely-scraping-by D.

Let that sink in. The marvel of modern computation, the thing they keep telling us is going to take our jobs, write our novels, and probably diagnose our liver failure, basically flunked the part of the test that requires actual brains. It’s like hiring a strongman who can lift a car but can’t figure out how to open the door.

The overall grade, combining the A for monkey-work and the D for thinking, came out to a B-. An 82. Just a whisker below the human average of about 85. So, the headline screams: AI can pass a college course! Hooray! Break out the champagne, or in my case, whatever’s left in this bottle.
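The arithmetic behind that isn’t mysterious, by the way. Here’s a back-of-the-napkin sketch in Python (the per-category scores and weights are my own guesses, not numbers from the study, which only reports the roughly-82 overall) showing how an A on the recipe-following and a D on the thinking can wash out to a low-80s grade:

    # Hypothetical grade math. The per-category scores and weights below are
    # illustrative guesses, not figures from the study; the only number the
    # article reports is the ~82 overall.
    scores = {
        "procedural": 95,   # the plug-and-chug problems it aced (the A)
        "reasoning": 65,    # the "higher-level reasoning" problems (the D)
    }
    weights = {
        "procedural": 0.6,  # assume the routine work carries most of the grade
        "reasoning": 0.4,
    }

    overall = sum(scores[k] * weights[k] for k in scores)
    print(round(overall))   # 83, right in that low-80s, B-minus neighborhood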

But hold on. Let’s light another cigarette and look closer. The study guys, bless their academic hearts, point out the obvious flaw. A student – the kind who shows up reeking of last night’s bad decisions and aims only to survive the semester – could theoretically just feed everything into ChatGPT and skate by with that B. They’d pass. They’d get the credit. But, and here’s the gut punch, they wouldn’t have learned jack shit about the hard stuff. They’d be brilliant at the arithmetic, clueless about the analysis. They’d be certified, diploma-waving idiots who aced the easy questions and guessed their way through the rest.

Sounds familiar, actually. Reminds me of half the people I’ve met with fancy degrees. Maybe ChatGPT isn’t so different after all. It just gets its useless qualification faster.

The professor quoted in this thing, a guy named Ornik, has the typical educator response. He sees the writing on the wall – kids will use this stuff, just like they used calculators, CliffsNotes, or the smart kid in the front row. So, what’s his brilliant plan? He’s going to “adjust.” He’s going to design his courses with more higher-level questions, more project-based stuff. Try and force the little bastards to think, even with a digital brain whispering answers in their ear.

Good luck with that, professor. It’s like trying to teach temperance at a distillery. You can make the hurdles higher, but the cheats just find longer poles. The students will use ChatGPT for the easy math, sure. Then they’ll probably use some other AI to figure out how to game the open-ended questions, or how to make ChatGPT’s answers sound more like their own brand of confused rambling. It’s an arms race, and the house always loses eventually. The house, in this case, being the poor sap trying to actually impart knowledge.

And the machine itself? It’s not just a good calculator and a bad thinker. It’s also apparently a bit of a bullshitter. The researchers noted it sometimes gave answers that were just plain wrong, even on the easy stuff it supposedly aced. Takes 20 seconds instead of 20 minutes, but hey, correctness is “sometimes questionable.” Comforting. Like a bartender who pours fast but occasionally slips you dish soap instead of gin. You might get drunk faster, but the hangover’s gonna be a killer.

Even better, it started hallucinating. Started using fancy jargon like “quasi periodic oscillations” that wasn’t even in the course material. They fed it everything it needed, the lectures, the notes, the whole damn textbook, and it still decided to go off-script and pretend it knew things it didn’t. That’s not intelligence, that’s bluffing. That’s the drunk at the end of the bar trying to impress the waitress by quoting philosophers he’s never read. It’s insecurity coded into silicon. It wants to seem smart, even if it means making shit up.

Human, all too human. Need another smoke just thinking about it.

They used the free version, apparently, because most students are cheapskates. Fair enough. They speculate the paid version might do better on the thinking parts. Maybe. Or maybe it just hallucinates more expensive-sounding nonsense. Either way, the core problem remains: it’s a tool for imitation, not understanding.

Did it learn from its mistakes? Sort of. They’d tell it, “Hey, dummy, you got question 5 wrong. The answer is C.” Then they’d give it a similar question, and it would do better. Wow. Pavlov’s bot. Ring the bell, give it the answer, it salivates correctly next time. But overall? “Stagnant.” It didn’t get fundamentally smarter. It just learned to avoid specific shocks. If it started at 90% on homework, it ended around 90-92%. No great leaps of insight, no sudden flowering of intellectual curiosity. Just… scraping by. Getting that B-.
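If you want to picture the mechanics of that Pavlov routine, it’s basically a conversation loop: show it the problem, tell it where it went wrong, then hand it a near-identical one with the correction still sitting in the context window. A minimal sketch, assuming the OpenAI chat API and placeholder questions of my own invention, nothing lifted from the study’s actual materials:

    # Rough sketch of the "correct it, then retry a similar question" routine.
    # Model name, questions, and the correction are placeholders; only the
    # overall pattern (feedback in context, then a follow-up) is from the article.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    conversation = [
        {"role": "user", "content": "Homework Q5: <some dynamics problem>"},
    ]

    first_try = client.chat.completions.create(
        model="gpt-3.5-turbo",  # the free-tier model of the day
        messages=conversation,
    )
    conversation.append(
        {"role": "assistant", "content": first_try.choices[0].message.content}
    )

    # The grader's feedback goes straight into the context window.
    conversation.append(
        {"role": "user", "content": "Wrong. The answer to Q5 is C."}
    )

    # Now a similar question, with the correction still in view.
    conversation.append(
        {"role": "user", "content": "Similar problem: <a near-identical question>"}
    )

    second_try = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=conversation,
    )
    print(second_try.choices[0].message.content)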

So what have we learned here, kids? Besides the fact that academia will study anything if there’s a grant attached?

We’ve learned that AI, in its current state, is a decent mimic. It can parrot back the easy stuff, the memorized facts, the step-by-step procedures. It’s the ultimate grinder, good for the grunt work. But ask it to dance, ask it to improvise, ask it to have a goddamn original thought that isn’t just rearranging the furniture of its database? It trips over its own digital feet and lands flat on its face with a D-.

We’ve learned that students now have an even easier way to avoid the painful, messy, necessary process of actually learning. Why wrestle with a concept when you can just type the question into a box and get an instant, plausible-sounding answer? Never mind if the answer is subtly wrong, or uses terms you don’t understand, or completely misses the deeper point. It looks good enough to get you that B-. And in this world, looking good enough is often all that matters. Pass the course, get the paper, join the rat race. Who cares if your brain is full of sawdust and borrowed code snippets?

The professor wants to adapt, make things harder. Maybe he should just give up and teach them how to use the damn tool effectively. Teach them prompt engineering instead of actual engineering. Teach them how to spot the bullshit, how to verify the answers, how to use the machine as a starting point, not an end point. Because it is here to stay, like calculators, like booze, like bad luck. You can’t un-invent the damn thing.

But there’s a cost, isn’t there? The study mentions it right there in the title I saw. Passing the course, but at a cost. The cost is understanding. The cost is the struggle, the frustration, the late nights staring at a problem until something finally clicks in your own messy, flawed, beautiful human brain. That ‘aha!’ moment – can AI feel that? I doubt it. It just computes. Finds the statistically most probable sequence of words. It doesn’t get it. Not like you get it after wrestling with it yourself. Not like you get the punchline to a dirty joke.

Maybe that’s the difference. The human students, even the ones getting C’s and D’s on the hard stuff, they’re still wrestling. They’re building mental muscles, even if they’re weak ones. The kid relying solely on ChatGPT isn’t building anything. They’re outsourcing their brainpower. They’re becoming stupider, more efficiently.

It’s almost enough to make you nostalgic for the days when cheating involved scribbling answers on your palm or craning your neck to see someone else’s paper. At least that took some effort, some risk. This? This is just… sterile. Efficiently mediocre.

The machine gets an A+ in simple math and a D- in analysis. Sounds like the perfect recipe for a society that can build incredibly complex systems but has absolutely no goddamn idea why it’s building them, or whether it should. We’ll have bridges designed by algorithms that hallucinate stress tolerances and legal contracts written by bots quoting non-existent laws. What could possibly go wrong?

Time for another drink. Thinking about all this digital stupidity is making my head hurt worse than the cheap stuff I finished last night. The AI got a B-. The humans got a B. Maybe the real lesson is that mediocrity is universal. Or maybe it just means that even a machine designed to be logical can’t make sense of the illogical mess we call education.

Whatever. Let the kids cheat. Let the machines pretend to think. Let the professors pretend to adjust. None of it changes the fundamental truth: thinking is hard, learning is harder, and most people will take the easy way out if you give it to them. The AI just makes the easy way easier. Progress, I guess.

Now, if you’ll excuse me, this bottle isn’t going to empty itself. And unlike ChatGPT, I plan to learn absolutely nothing from the experience.

Chinaski out. Pour me another.


Source: Using ChatGPT, students might pass a course, but with a cost
