Alright, so the world’s buzzing again. Some new goddamn thing. This time it’s AI in schools. You got the suits in their air-conditioned offices, probably sniffing their own farts and calling it “innovation,” yammering about “disruption” and “guardrails.” Sounds like a bad night at the dog track. Then you got the actual kids, the ones down in the trenches, just trying to get their history paper done before the deadline slams shut like a bar door at 2 AM.
And guess what? The two conversations are about as related as a poet and a paycheck.
But then I read this piece about some kid, William Liang. High school journalist, they say. Kid’s got more ink than a sailor and more sense than most of the blowhards I used to write ad copy for. He’s living it, breathing it, probably coughing it up like the rest of us after a rough night. And he lays it out straight, no chaser: the whole damn school system is playing checkers while the kids are on some next-level, 3D chess shit with this AI. Our way of “teaching” and “testing”? Broken. Like a promise from a politician or a cheap watch.
First shot of truth from this Liang kid: for most students, an assignment isn’t some noble quest for knowledge. Hell no. He says, and I’m reading it straight off the page through the bottom of my glass here, “For most students, an assignment is not interpreted as a cognitive development tool, but as a logistical hurdle.” A hurdle. Something to get over, get around, or just bulldoze through so you can get back to the real business of being young and miserable. And right now, the bulldozer is this generative AI.
Think about that. It ain’t about the kids being lazy bastards or morally bankrupt. It’s about them being smart. We built the game. For decades, the whole rotten system’s been screaming one thing: GRADES. Get the A. Doesn’t matter if you understand a damn thing, just get the A. So when a tool comes along that spits out A-grade crap in ten minutes instead of ten hours of soul-crushing boredom… well, what do you expect? It’s like offering a parched man a canteen of water or a bottle of flat beer. He’s gonna take the water, even if the beer has more character.
Liang puts it plain: “If there’s an easy shortcut, why wouldn’t we take it?” It’s human nature, ain’t it? When the game’s rigged, the pressure’s on, and you can cheat with a big upside and only a tiny risk of getting your knuckles rapped, what do you expect? Hell, even the saints would be tempted. Pretty soon, everyone’s doing it just to stay in the race. Like rats in a maze, all scrambling for the same piece of cheese, and now one rat’s got a tiny goddamn robot helping him.
So what about the teachers, huh? The plagiarism checkers? The honor codes? All that high-minded horseshit. According to this kid, it’s mostly “security theater.” A puppet show for the parents and the school board. The whole enforcement thing is, his word, “incoherent.” Just lit another cigarette thinking about that. Incoherent. Like most conversations I have before my third coffee, or after my fifth whiskey.
He says kids get “warned” all the time, but actual punishment? Rare. The detection tools are dumber than a sack of hammers, looking for patterns that any kid with half a brain learns to dodge faster than a bar tab. Teachers are overworked, underpaid, and probably need a drink more than anyone. They ain’t got time to play detective unless some little genius turns in something that screams “I AM A ROBOT.”
And even then… Liang tells this story, makes you wanna laugh or cry or just order another double. Some kid literally left the words “as an AI language model myself” in his goddamn essay. And nobody noticed. The AI confessed, right there in black and white, and it sailed through like a love letter from a ghost. Meanwhile, some other poor schmuck who actually sweated over his essay for a week gets flagged and has to show his work like he’s defending a PhD thesis. Justice, eh? About as common as a sober poet.
This is where it gets good, where the kid really twists the knife in the flabby gut of the establishment. He says we’re using the wrong words. “Cheating.” “Integrity.” Bullshit. In the real world, the one where the rent’s always due and the bottle’s always half-empty, it’s about one thing: Can you get caught? “The designation of ‘cheating’ doesn’t rest on the method but on the detectability,” he argues. And since detection is a crapshoot, the whole damn idea of “legitimate” use just dissolves like an Alka-Seltzer in last night’s whiskey.
So if the old game is busted, what’s left? Blow up the board, says Liang. He’s got a solution. Not more software, not more lectures about being a good little drone. One simple, beautiful, brutal rule. Get this: “Teachers should not be allowed to assign take-home work that ChatGPT can do. Period!” Read that again. Let it sink in. He’s not saying “no homework,” though God knows that’s a tempting thought. He’s saying if the damn robot can do it unsupervised, it’s not a test of what the kid knows. It’s a test of how good they are at bossing a machine around. Which, come to think of it, might be a useful skill in the coming apocalypse, but not what they’re supposedly teaching.
The real work, the thinking, the bleeding onto the page, the wrestling with ideas until your brain aches – that’s gotta come back into the classroom. Where you can see the sweat. See the gears turning, or not turning, as the case may be. In-class essays, he says. Oral exams. Stuff where you gotta stand and deliver, not just copy-paste. You watch them do it. No hiding. The old ways. Funny how they sometimes turn out to be the best ways, like a well-aged bourbon or a dog-eared book.
But here’s the kicker, the thing that makes you almost choke on your smoke. The kid’s not some doomsayer, wringing his hands about the end of humanity. He’s actually optimistic about this AI crap. He just thinks the adults are pointing it at the wrong targets, like a drunk aiming for the urinal. “There is no inherent tension between embracing AI and preserving critical thinking or creativity, unless schools force one,” he says. The tool ain’t the problem. It’s the task, you dimwits.
He flips it. Imagine, he says, if kids could apprentice with the digital ghosts of Hemingway, Newton, da Vinci. An AI that tells you your story sucks, but tells you why it sucks, like a brutally honest drinking buddy. An AI tutor that can drill you on math problems until your eyes bleed, tailored to your own special brand of stupidity, available 24/7. Now that’s a use for the damn thing. AI doesn’t do the work for you; it helps you do the work better. Maybe. Or maybe it just makes us all better at pretending we’re smart. One example he gives: grade kids on a conversation they have with an AI about something complicated. The kid still has to do the thinking. The AI is just the smartest, most patient goddamn sparring partner you ever had.
So, this Liang kid. He’s a wake-up call. For the principals and the parents and the people who think they’re leading anything. Stop asking how to catch the little bastards. Start asking what the hell you should be asking of them in the first place. What’s the point of making them jump through hoops a machine can jump through better, faster, and without bitching about it?
The kids, he says, are “light-years ahead” of the adults. Using ChatGPT more than Instagram. Christ. It’s like we’re all stuck in a black-and-white movie, and they’re already living in color, probably with a better soundtrack. The future’s here, it’s messy, and the kids are already mainlining it. The rest of us are just trying to find the instruction manual, probably written by another AI.
My bottle’s looking low. This whole thing’s given me a thirst. Maybe there’s hope, maybe it’s all just another grift. Who knows. But the kid’s got a point. Now, if you’ll excuse me, I think I hear a shot glass calling my name. It’s probably got more answers than all the experts combined.
Chinaski out. Time for a refill. Or three.
Source: What’s Really Going On With AI In Schools? A High School Student’s POV