Some poor bastard decided to outsource his brain to a glorified word calculator and ended up with a case of 19th-century crazy. You read that right. A 60-year-old man, probably worried about his blood pressure or what his wife was telling him, gets it in his head that salt, the stuff that makes fries worth eating, the stuff that's been on every table since we crawled out of the sea, is the devil.
So what does he do? Does he talk to a doctor, a man who went to school for a decade to learn about the bags of meat we call our bodies? No. Does he even read a book written by a human with a pulse? Of course not. That's too slow. Too analog. He goes to the new oracle. The digital god in the machine. He asks ChatGPT how to get rid of chloride.
And the machine, in its infinite, soul-crushing wisdom, apparently suggested bromide. Sodium bromide. Let that sink in. This stuff was popular back when they were still using leeches and locking up women for having opinions. It was a sedative. The kind of thing they'd give you in an asylum before strapping you to a cot. The article even notes you can use it for "cleaning."
I'm going to have to light a cigarette for this one. The smoke helps cut through the digital stench.
So this genius, this pioneer of the new age, spends three months swapping the salt on his eggs for a chemical cousin of pool cleaner. And what happens? Well, exactly what you'd expect to happen when you let a toaster give you medical advice. He goes nuts. Full-blown psychosis. He shows up at the hospital claiming his neighbor is poisoning him.
The beautiful, gut-punching irony of it all. It's almost poetic. Here's a man so terrified of a phantom poison in his salt shaker that he starts ingesting actual poison, and then blames the guy next door for it. You can't write this stuff. Well, you can, but then people call you a cynical drunk and tell you to get a real job.
He gets paranoid about the water they offer him. He tries to escape. They have to section him. All because he asked a robot a stupid question and got a stupid answer.
This is the gleaming future they're selling us. A future where we replace the flawed, tired, coffee-breathed wisdom of a human doctor with the clean, efficient, decontextualized nonsense of an algorithm. A doctor would have asked, "Why in God's name do you want to eliminate chloride from your diet? It's an essential electrolyte. Are you trying to die?"
But the machine doesn't ask why. It has no curiosity. It has no sense of self-preservation, so it can't fathom yours. It's just a pattern-matcher. You type in "replace chloride," and it scours a trillion dead web pages and spits out "bromide" because somewhere, in some dusty corner of the internet, those two words appeared in the same sentence. It's like asking a parrot for stock tips. You might get a word, but you're not getting any meaning.
Time for a drink. The first one of the day always feels like a compromise with reality.
The best part, the real chef's kiss on this whole turd sandwich, is the response from the tech wizards. They can't even see the guy's chat log to figure out what the machine actually told him. It's a black box. A mystery. Your new god is a mute. He performs his miracles, or his curses, and leaves no scripture behind. How convenient.
But don't you worry. They've got a new model now. GPT-5. They claim this one is better at health stuff. It's more "proactive" at "flagging potential concerns." I can just see it now. A little pop-up window: WARNING: Ingesting industrial sedatives may lead to seeing things that aren't there. Are you sure you wish to proceed on your quest to become a 1920s psychiatric patient?
They're missing the point, as they always do. They think the problem is the software. A bug to be patched. A new line of code to be written. They can't see that the problem is the whole goddamn idea. The problem is selling a search engine as a sage. It's dressing up a complex plagiarism machine as a thinking entity and telling desperate people to ask it for salvation.
You don't cure a man's fear of death by giving him a faster way to look up poisons.
And what about the man himself? After they pumped the bromide out of him, he stabilized. He reported facial acne, excessive thirst, insomnia. The classic signs. He was a textbook case. A walking, talking monument to the bleeding edge of stupidity.
I feel for the guy, in a strange way. We're all looking for answers. We're all scared of something. Scared of getting old, scared of our bodies falling apart, scared of the silence when the TV is off. And when you're scared, you'll listen to any voice that sounds confident. Even if it's just a ghost in the wires, a disembodied echo of things it doesn't understand.
The doctors who wrote the report said that in the future, they'll have to ask patients where they get their information, and consider AI as a source. That's a sentence from a world that's already lost its damn mind. "So, sir, did you get this idea to staple your eyelids shut from a doctor, a book, or a chatbot that thinks a horse has six legs?"
We're building these things, these artificial minds, and we pretend they're going to solve all our problems. Cure cancer, end poverty, write the next great novel. But what they're really doing is holding up a mirror to our own idiocy. The machine isn't the problem. We are. We're the ones with the empty spaces inside, desperate to fill them with anything that looks like an answer. We're the ones willing to believe that a shortcut exists for wisdom.
There's no shortcut. There's just the long, messy, painful road of being a human. It's a road paved with mistakes, bad decisions, hangovers, and the occasional moment of clarity. It's bloody and it's real. And you can't subcontract it out to a server farm in Oregon.
I'm going to pour another. This one's to the poor bastard with the bromide in his blood. He went looking for a cure and found a new kind of sickness. In a way, he's the most modern man on the planet.
Chinaski sips. The world remains absurd.
Source: Man develops rare condition after ChatGPT query over stopping eating salt