Our Robot Overlords are Drunk at the Wheel (of Justice)

May. 27, 2025

Alright, settle in, pour yourself a stiff one. Or don’t. More for me. The world’s gone collectively nuts, and the machines are just learning to ape our particular brand of insanity. You think your Monday morning is rough? Try being Frankie Johnson, stuck in an Alabama correctional facility, apparently doubling as a human pincushion, while the high-priced legal eagles hired to defend the state’s glorious penal system are off playing make-believe with a goddamn chatbot.

The headline practically writes its own punchline: “Alabama paid a law firm millions to defend its prisons. It used AI and turned in fake citations.” Millions. For this. I need another drink just thinking about it. And a cigarette. Where the hell are my smokes?

So, this poor bastard Johnson. Guy says he got turned into a shish kebab about twenty times. Once, handcuffed to a desk. Another time, in the prison yard with guards allegedly watching, maybe even egging the other guy on. Sounds like a real goddamn garden spot, that William E. Donaldson prison. Johnson, understandably, decided this wasn't the all-inclusive vacation package he'd signed up for and sued Alabama prison officials. Standard stuff, really. Violence, understaffing, corruption – the usual bingo card for a state-run hellhole.

Enter Butler Snow, the law firm. The state's go-to guys for defending its… troubled prison system. These aren't some ambulance chasers working out of a strip mall, mind you. Millions of taxpayer dollars flow into their coffers. Millions. For "experience." Specifically, William Lunsford, head of something called the "constitutional and civil rights litigation practice group." Sounds impressive, don't it? Like something you'd engrave on a gold-plated commode.

But here’s where the script flips from grim reality to outright farce. One of Lunsford’s legal prodigies, a fella named Matthew Reeves, decided to spice up some court filings with a little help from our friend, Artificial Intelligence. Specifically, ChatGPT. You know, the robot brain that can write you a sonnet about your cat or explain quantum physics like you’re five. Turns out, it’s also pretty good at just… making shit up. Legal shit, in this case. The AI coughed up a bunch of case citations to back up their arguments. Beautiful, official-looking citations. Only problem? They were phantoms. Ghosts in the legal machine. As real as my chances of winning the lottery or finding a good woman who understands the sacred bond between a man and his bourbon.

This ain’t some isolated incident, either. Some poor academic is actually trying to track these “AI hallucinations” in court documents. He’s found over a hundred globally. A hundred. Lawyers, the supposed guardians of fact and precedent, are now just typing prompts into a digital magic eight ball and hoping for the best. “Will my client get off scot-free, oh wise ChatGPT?” Reply hazy, try again, and here are some entirely fictional cases to support your bullshit.

Last year, some shyster in Florida got his license dinged for a year for pulling this stunt. Another firm in California had to cough up 30 grand. Pocket change for these operations. The judge in Johnson’s case, Anna Manasco, she’s got a bit more fire in her belly. She looked at these prior wrist-slaps and basically said, “Clearly, that didn’t fucking work, because here you are, you clowns, doing it again.” Proof positive, she called it. No argument here, Your Honor. Pour the lady a double.

The Butler Snow attorneys, caught with their digital pants down, were “effusively apologetic.” Oh, I bet they were. Like a kid caught with his hand in the cookie jar, crumbs all over his face, swearing the dog did it. They even trotted out the old “we have a policy” defense. A firm policy requiring approval to use AI for legal research. Right. And I’ve got a policy to only drink on days ending in ‘Y’. We all see how well these policies hold up under pressure.

Reeves, the guy who actually did the deed, tried to fall on his sword. "I was aware of the limitations… I did not comply… I would hope Your Honor would not punish my colleagues." Noble. Or maybe just trying to keep the whole damn firm from going under. He said he knew about ChatGPT, needed some quick backup for what he "believed to be well-established points of law." So he asked the robot, and it "immediately identified purportedly applicable citations." But in his "haste to finalize the motions and get them filed," old Matty "failed to verify." Haste. We've all been there, haven't we? Rushing to meet a deadline, cutting corners. Difference is, when I cut corners, maybe a blog post has a few extra typos. When these guys do it, justice takes a flying leap out the window, and some poor sonofabitch like Frankie Johnson twists in the wind.
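
And look, for the record, the verification step old Matty skipped is not exactly rocket surgery. Here's a rough sketch, in Python, of what checking a chatbot's citations against a real case-law database might look like – the CourtListener search endpoint and the response fields are my assumptions, not gospel, so treat it as a barstool napkin drawing rather than filing-ready software:

```python
# A napkin sketch, not legal tech: take whatever citations the model spat out
# and ask a real case-law database whether they exist before they go anywhere
# near a court filing. The CourtListener search URL and the "count" field in
# the response are assumptions made for illustration.
import requests

SEARCH_URL = "https://www.courtlistener.com/api/rest/v4/search/"  # assumed public endpoint

def citation_exists(citation: str) -> bool:
    """True only if a search for this case name turns up at least one real result."""
    resp = requests.get(SEARCH_URL, params={"q": citation}, timeout=10)
    resp.raise_for_status()
    return resp.json().get("count", 0) > 0  # assumed response shape

suspect_citations = [
    "Kelley v. City of Birmingham",  # the only real one is from 1939, about a speeding ticket
    # ...whatever else the chatbot dreamed up
]

for cite in suspect_citations:
    verdict = "exists" if citation_exists(cite) else "PHANTOM - do not file"
    print(f"{cite}: {verdict}")
```

That's the whole due-diligence step. The robot doesn't check its own homework; somebody with a bar card is supposed to.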

And what were these earth-shattering legal points they needed fake cases for? A scheduling dispute about deposing Johnson. They wanted to grill him; Johnson's lawyers said, "Hold up, you haven't given us the documents you owe us." Butler Snow, armed with their AI-forged Excalibur, countered that case law mandated Johnson be deposed expeditiously. "The Eleventh Circuit and district courts routinely authorize incarcerated depositions," they declared, listing four cases as proof. All bullshit. One of them, Kelley v City of Birmingham, sounded legit, until Johnson's lawyers pointed out the only real case with that name was from 1939 and involved a goddamn speeding ticket. A speeding ticket. The sheer, unadulterated gall. It's almost beautiful in its awfulness. It takes a special kind of talent to fuck up that spectacularly.

The judge herself had her people look for these phantom cases. Nada. Zilch. The digital cupboard was bare. Reeves, in his mea culpa to the court, admitted he was reviewing drafts from a junior colleague and just wanted to sprinkle in some citations. Like a chef adding a little parsley, only the parsley was poison.

This Damien Charlotin fellow, the academic tracking this digital plague, says he’s “seeing an acceleration.” No kidding. It’s like a race to the bottom, and everyone’s got their foot on the gas. He also notes that courts have been “remarkably lenient,” unless the lawyers try to bullshit their way out of it after getting caught. Small comfort.

And the kicker? The state of Alabama, through its Attorney General's office, was asked if they'd still stick with Butler Snow after this masterclass in incompetence. The answer? "Mr Lunsford remains the attorney general's counsel of choice." Of course he does. Why wouldn't he? They're probably all laughing about it over cigars and brandy, while Frankie Johnson is wondering which part of his anatomy is next on the stab-happy inmate's agenda.

This isn't just about one firm, either. Lunsford and Butler Snow have contracts for a slew of these civil rights cases against the Alabama Department of Corrections, including one brought by the damn Department of Justice. One contract alone was worth nearly $15 million. Fifteen million dollars for… well, for this, apparently. Some lawmakers are questioning the spending. You think?

It’s enough to make a man question the whole damn shooting match. We’re building these incredible thinking machines, these marvels of logic and data, and what’s the first thing we teach them? How to lie for lawyers. How to confabulate, to hallucinate, to pull “facts” out of their digital asses to prop up whatever argument some overpaid suit needs to make. It’s not even sophisticated evil; it’s just lazy, slipshod bullshit, amplified by circuits and code.

They talk about AI taking our jobs. Hell, at this rate, it’ll take our ability to tell truth from fiction, reality from a well-formatted lie. These geniuses are so busy trying to make machines think like humans, they forgot that a lot of humans are idiots, or charlatans, or just plain tired and sloppy. And now the machines are learning from the worst of us.

The firm’s response is “not complete yet,” Lunsford mumbled. You don’t say. I imagine there’s a lot of frantic back-pedaling, a lot of “how do we spin this clusterfuck?” meetings going on. Maybe they can get ChatGPT to write their apology letter. Just make sure someone, anyone, bothers to read it before they hit send.

It’s the sheer, breathtaking arrogance of it all. The idea that you can just conjure up legal precedent because you’re in a hurry. Because actually doing the work, the painstaking research, the thinking – that’s too much trouble. Easier to let the robot dream up some nonsense and hope no one notices. And for a while, maybe no one did. How many other briefs, how many other cases, are floating around out there, built on a foundation of digital sand?

This isn’t about the robots being evil. The robot didn’t decide to lie. It just did what it was asked, in its own flawed, artificial way. It gave the user what it thought the user wanted. The problem, as always, is us. The meat puppets holding the leash. The ones who are supposed to know better, who are paid fortunes to know better, and still choose the path of least resistance, the shortcut through the fields of bullshit.

Frankie Johnson, meanwhile, is still in that Alabama prison. Still waiting for some semblance of safety, of justice. And the people paid to ensure the system, even a flawed one, functions with some integrity are busy explaining to a federal judge why their homework was eaten by a rogue algorithm.

You can’t make this stuff up. Or, well, apparently you can, if you’re a lawyer with access to a large language model and a flexible relationship with the truth.

The judge is mulling over sanctions. Fines, mandatory classes, maybe even a suspension. Whatever she decides, it’s probably not enough. The rot runs deep. You can’t just patch a system this broken with a slap on the wrist. It needs a full goddamn teardown.

But hey, what do I know? I’m just a guy on a barstool, watching the circus through the bottom of a glass. At least my hallucinations have the decency to be interesting, and they don’t usually end up in federal court. Usually.

Time for another. The world’s not going to get any saner on its own.

Chinaski out. Now, where’s that bottle…


Source: Alabama paid a law firm millions to defend its prisons. It used AI and turned in fake citations

Tags: ai chatbots digitalethics aigovernance automationbias