The dentist’s waiting room had a TV in the corner, muted, captions on. Some morning show host was talking about wellness. I watched her mouth move and the words crawl across the bottom of the screen a half-second behind, always catching up but never quite syncing.
That’s how it feels now. The words never quite sync with what’s happening.
I read about a woman named Donna Fernihough whose carotid artery “blew” during sinus surgery. Blood sprayed all over the operating room. She had a stroke the same day. Another woman, Erin Ralph, same deal — surgeon punctures an artery, blood clot forms, stroke follows. Both of them just wanted their sinuses fixed. Chronic inflammation. The kind of thing you complain about at dinner parties. “My sinuses are killing me.” You don’t expect it to be literal.
The device that guided these surgeries is called the TruDi Navigation System. It’s supposed to tell the surgeon exactly where they are inside your skull. The company added AI to it a few years back, because of course they did. Everyone adds AI to everything now. Your toaster has AI. Your doorbell has AI. Your sinus surgery has AI.
The lawsuits say the device was safer before they added the machine learning. “The product was arguably safer before integrating changes in the software to incorporate artificial intelligence.” That’s the kind of sentence that would be funny if it weren’t followed by descriptions of cerebrospinal fluid leaking from people’s noses.
The company set a goal of 80 percent accuracy for some of this technology before shoving it into the navigation system. Eighty percent. In an operating room. Inside your skull. One in five times, the machine doesn’t know where it is.
You know where else 80 percent is acceptable? Baseball. A pitcher hitting the strike zone 80 percent of the time is having a good day. But you’re not performing surgery on my brain with a pitcher’s mentality. You don’t get to shrug off the miss and say you’re still throwing strikes four times out of five. Not when the miss means puncturing the base of someone’s skull.
The FDA has received at least 100 reports of malfunctions and adverse events with this device. At least ten patients injured. And the FDA itself is getting gutted — budget cuts have laid off or pushed out dozens of AI scientists. The people who are supposed to catch this stuff before it catches you are being shown the door.
Meanwhile, Doctor Oz — the TV doctor, the supplement hawker, the man who once told America that raspberry ketones were a “miracle fat burner” — is now running the Centers for Medicare & Medicaid Services. He was recently bragging about robots that can perform ultrasounds on pregnant women and “wands” that determine if your baby is okay.
“And frankly, I don’t have to see the image,” he said. “I just have to know if the image is good enough to tell me the child doesn’t have a problem.”
The guy in charge of Medicare doesn’t need to see the image. The machine will tell him. The machine that was built by people who thought 80 percent was good enough.
We’re losing something, and we’re not even noticing because the thing we’re losing is the ability to notice. Doctors are already struggling to identify cancer in scans because they’ve outsourced that judgment to AI. The machine learns, and we unlearn. The machine gets confident, and we forget why we ever doubted.
A surgeon used to spend years learning to read the geography of the human skull by feel and by instinct. The way the tool resists bone versus cartilage. The subtle feedback through the fingers that no screen can replicate. Now there’s a dot on a map that says “you are here” even when you’re somewhere else entirely. And the surgeon trusts the dot because why would the dot lie? The dot has AI.
Donna Fernihough’s lawsuit says the company “knew or should have known” that the AI caused the system to be “inconsistent, inaccurate, and unreliable.” They shipped it anyway. They lowered their safety standards to rush the technology to market. There’s a war on. An AI arms race. Everyone’s deploying, everyone’s integrating, everyone’s disrupting. And the people who get disrupted are the ones lying on the table with their skulls open.
The companies will win these lawsuits, probably. They always do. They’ll point to the fine print, the warnings, the disclaimers. They’ll say the surgeon made the final call. They’ll say the AI was just a navigation aid, a suggestion, a tool. The doctor pulled the trigger. The machine just showed him where to aim.
That’s the game now. The machine shows you where to aim, and when you miss, it’s your fault for listening.
I left the dentist’s office without getting my teeth cleaned. Just walked out. The receptionist called after me but I kept going. Some days you just can’t sit in a chair and let someone put tools in your mouth. Some days the trust isn’t there.
Outside, the sun was doing that winter thing where it’s too bright and not warm enough. I stood there squinting at it, thinking about Donna Fernihough. Blood spraying all over. The surgeon’s hands doing what the screen told them to do. The machine confident, even as it was wrong.
Four out of five.
The sun didn’t feel warm, but it was real. I could trust that much, at least.
Source: AI-Powered Surgery Tool Repeatedly Injuring Patients, Lawsuits Claim