I used to work at the post office. Sorting letters. You get fast at it — zip codes blur into muscle memory, your hands know where things go before your brain does. But every now and then you’d hold an envelope and feel something. A lump. A kid had mailed a rock to his grandmother. A woman had tucked a dried flower inside a birthday card. You’d feel it through the paper and for half a second you’d think about the person on the other end.
Nobody at DOGE felt anything through the paper.
They walked into the National Endowment for the Humanities with a ChatGPT prompt and a mandate to kill whatever smelled like diversity. The prompt was beautiful in its brutality: Does the following relate at all to D.E.I.? Respond factually in less than 120 characters. Begin with ‘Yes’ or ‘No.’
That’s it. That’s how you decide whether a research project lives or dies in America now. Not a panel of scholars. Not even a bored bureaucrat flipping through files with a coffee stain on his tie. A chatbot. A yes-or-no from a machine that doesn’t know what the humanities are, doesn’t care, and was trained on the internet — which, if you’ve spent five minutes on the internet, should terrify you.
They didn’t even read the grants. They pulled short summaries off the web and fed those to the bot. Summaries of summaries judged by a machine that generates summaries. It’s abstraction all the way down, like a game of telephone played in a language nobody speaks.
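If you want to see how little machinery that takes, here is a rough sketch of the kind of loop the reporting describes. It is not DOGE's code: the quoted prompt is theirs, but the model name, the file of summaries, and the OpenAI client call are my assumptions, stand-ins for whatever they actually ran.

```python
# A sketch of the triage loop the reporting describes, not DOGE's actual code.
# The prompt text is quoted from the reporting; the model name, the input file,
# and everything else here are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = (
    "Does the following relate at all to D.E.I.? "
    "Respond factually in less than 120 characters. Begin with 'Yes' or 'No.'"
)

def flag_grant(summary: str) -> bool:
    """Ask the chatbot for a yes/no verdict on one grant summary."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption; the reporting only says "ChatGPT"
        messages=[{"role": "user", "content": f"{PROMPT}\n\n{summary}"}],
    )
    answer = (response.choices[0].message.content or "").strip()
    return answer.lower().startswith("yes")  # a grant lives or dies on this prefix

# Hypothetical input: one web-scraped grant summary per line.
with open("grant_summaries.txt") as f:
    for line in f:
        summary = line.strip()
        if summary and flag_grant(summary):
            print("FLAGGED:", summary[:80])
```

Twenty-odd lines, a prompt, and a prefix check. That, roughly, is the entire apparatus the rest of this is about.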
The results were “sweeping and sometimes bizarre.” That’s the New York Times being polite. What they mean is the machine flagged a project on the history of rice cultivation as DEI because it mentioned Asia. Flagged a study of ancient Greek philosophy because the abstract included the word “diverse.” The chatbot doesn’t understand context. It pattern-matches. It sees the letters D-E-I hiding inside words like “medieval” and pulls the trigger.
And nobody checked. That’s the part that gets me. Nobody sat down and said, wait, this is a grant about fourteenth-century Italian banking — maybe the machine got it wrong. Nobody felt the lump in the envelope. They just sorted and shredded, fast as the algorithm could spit.
There’s a book I think about sometimes — Ray Bradbury’s Fahrenheit 451. Everyone remembers the firemen burning books. But the real horror isn’t the burning. It’s that by the time they start, nobody cares anymore. The culture has already rotted from the inside. People stopped reading on their own. The firemen are just the cleanup crew. Bradbury wrote that in 1953, sitting in the basement of the UCLA library, feeding dimes into a rented typewriter. He wrote a book about the death of books in a library, on a machine he rented by the hour because he couldn’t afford one. There’s a kind of poetry in that the AI will never understand, because poetry requires knowing what it costs.
What DOGE did isn’t Fahrenheit 451. It’s worse in a way. At least Bradbury’s firemen showed up. They put on uniforms, drove the truck, carried the kerosene. There was a ritual to it, an acknowledgment that what they were destroying had enough power to be dangerous. DOGE couldn’t even be bothered with ceremony. They copy-pasted. They prompted. They automated the contempt and went to lunch.
I keep thinking about the people on the other end. A professor in Wisconsin who spent three years writing that proposal. A team of archivists preserving oral histories of coal miners in Appalachia — those miners, by the way, being the exact kind of working-class Americans these politicians claim to worship every election cycle. Gone. Because a chatbot said “Yes” in under 120 characters.
The machines aren’t the problem. They never are. A hammer doesn’t know the difference between a nail and a skull. The problem is the hand holding it, and right now the hand belongs to people who think understanding the past is a luxury they can’t afford. Who see a scholar studying medieval trade routes and think what’s the ROI on that? As if every worthwhile thing in civilization came with a business case. As if the cathedral builders ran the numbers first.
The humanities are what teach you to read a situation, not just a summary. To ask why, not just what. To sit with ambiguity long enough to learn something instead of collapsing it into a binary the way that prompt did. Yes or No. Does this relate to DEI? As if an entire field of study — the field that gave us every novel, every history, every philosophy that ever made a person stop and think — could be sorted like mail.
I sorted mail for years. I know what happens to the letters nobody reads. They pile up. They yellow. Eventually someone throws them away and pretends they were never important. But the lump in the envelope was real. The rock, the dried flower, the thing someone took the time to put inside — that was real. And the fact that nobody at DOGE will ever feel it isn’t a feature of the future. It’s the whole goddamn diagnosis.
Somewhere tonight, a woman who spent six years studying how enslaved people in the antebellum South used folk songs to encode escape routes is reading a form rejection. The algorithm that killed her grant doesn’t know what an escape route is. Doesn’t know what a song costs. Doesn’t know what it means to encode your freedom in a melody because writing it down would get you killed.
But it knows how to say No in under 120 characters. And that was enough.
Source: DOGE used ChatGPT to gut the National Endowment for the Humanities