The neighbor’s dog got loose again last Tuesday. Big stupid thing, part lab, part whatever was available. It ran straight into traffic on Fifth and just stood there, right in the middle of the lane, while a delivery truck locked its brakes and laid on the horn.
The dog didn’t move. Not because it was brave. Because it didn’t understand what was coming.
I thought about that dog when I read about Anthropic telling the White House to go to hell.
Not those words exactly. They used the polished corporate version — “no amount of intimidation or punishment will shift our opposition.” But strip away the legal language and what you’ve got is a company standing in the middle of traffic, telling a truck to stop.
The truck, in this case, being the President of the United States, who decided that every federal agency should immediately stop using Anthropic’s technology. All of it. Overnight. Because the company behind Claude — the AI assistant half of Washington was quietly using to draft memos and summarize briefings — refused to let the military use it for mass domestic surveillance and fully autonomous weapons.
A company that builds artificial intelligence said no to building killing machines. And the government’s response wasn’t “let’s talk about this.” It was punishment. Immediate, total, scorched earth.
There’s a guy I used to drink with, name of Morrison, who worked defense contracting in the ’80s. He told me once that the trick to surviving in that business was simple: never say no. Not to the generals, not to the budget people, not to anyone with stars on their shoulders. You say yes, you cash the check, you go home and watch television and try not to think about what the yes bought.
Morrison died of a heart attack at sixty-one. At his funeral, his ex-wife said he was a good provider. That was the best thing anyone could say about him.
Anthropic is doing something Morrison never could. They’re saying no. And I want to believe it matters, I really do. But I’ve been around long enough to know what happens to the ones who tell the government to go fuck itself.
The court challenge will be interesting. In the legal sense of the word, which means expensive and slow and ultimately decided by people who’ve never written a line of code or fired a weapon. Lawyers will argue about procurement regulations and executive authority and constitutional limits while the actual question — should we build machines that decide who lives and who dies without a human in the loop — gets buried under procedure.
That’s how they do it. They don’t debate the morality. They debate the process. By the time you’ve litigated your way to a ruling, the world has moved on and someone else has already built the thing you were trying to prevent.
There’s a passage in Céline — I think it’s Journey to the End of the Night — where Bardamu watches soldiers march toward a battle they know they can’t win. Not because they’re brave. Because the machine they’re inside of doesn’t have a reverse gear. You go forward or you get crushed by the people behind you.
That’s what the defense industry is. A machine with no reverse gear. And here’s Anthropic, trying to install one.
The White House called it a matter of national security. Of course it did. Everything is a matter of national security when you want to skip the part where someone asks questions. The phrase has been stretched so thin you could read a newspaper through it. National security is why we need cameras on every corner. National security is why the algorithm needs your messages. National security is why a machine should be allowed to decide, in a fraction of a second, whether the shape on the screen is a threat or a teenager with a backpack.
I keep coming back to the word “unfettered.” That’s what the White House wanted. Unfettered access. Not supervised. Not limited, with human oversight and someone whose job it is to say “wait.” Unfettered. No leash. No limits. Just open the door and let us in and don’t ask what we do with it.
The last time I heard someone use the word “unfettered” with that much enthusiasm, it was a real estate developer talking about zoning laws. He wanted to build condos on a wetland. The birds didn’t get a vote.
Maybe Anthropic folds eventually. Money has a way of rearranging principles. Shareholders don’t sleep well when the government is your biggest enemy instead of your biggest customer. Some board member will point to the revenue charts and say we can’t afford morality, not in this economy, not with competitors who don’t have our hang-ups about autonomous kill chains.
Or maybe they hold. Maybe this is the one time the dog in the road actually stops the truck. I wouldn’t bet my rent on it, but I’ve been wrong before. Usually about horses and women, but sometimes about the bigger things too.
What I do know is this: somewhere in a government building, someone is already talking to the next company. The one that will say yes. The one that will cash Morrison’s check and go home and watch television.
The machines don’t care who builds them. That’s supposed to be our job — the caring part. And right now we’re outsourcing it to the only ones in the room who seem willing to do it.