My Digital Soul is a Snitch, and My Smart Toaster is Judging Me

Sep. 30, 2025

The screen glows with the kind of artificial hope that only people who’ve never had a real job can manufacture. It’s some fluff piece from Forbes, a magazine for men who iron their goddamn socks. The headline promises “Precision Mental Health,” boosted by AI. Precision. Sounds clinical. Like a bomb, or a surgeon’s knife. They want to get precise about the mess inside our heads. Good luck with that. My head is a dive bar at 3 a.m. full of ghosts arguing over the jukebox. You can’t bring a ruler in there.

The suits at some Stanford conference are patting each other on the back about it. They’re using Large Language Models—the same things that write shitty poetry and tell you how to boil an egg—to untangle the knots of human misery. The whole idea of “precision mental health” is to tailor therapy to the individual. The article admits that this should be obvious, that therapy shouldn’t be a one-size-fits-all game. No shit. I didn’t need an AI to tell me that my brand of despair isn’t the same as my neighbor’s, who cries because his prize-winning roses got a fungus. I cry because I remember everything.

But these geniuses, they’ve got a new angle. They’re not just talking, they’re scanning. They’re hooking people up to fMRIs and using AI to find “biotypes” for depression. They look at the wiring in your brain and put you in a box. You’re not just sad anymore; you’re a “cognitive biotype” or a “tension biotype.” It’s a high-tech horoscope for the clinically morose.

I can see it now. You stumble into the clinic, smelling of stale cigarettes and last night’s bourbon. They slide you into a clean white tube that hums like a UFO. A half-hour later, a doctor in a lab coat who looks like he’s never been in a fistfight holds up a colorful picture of your brain. “Congratulations, Mr. Chinaski,” he’ll say. “Your brain scan shows a 93% probability of the ‘what’s-the-goddamn-point’ biotype. Your circuits for hope have been rerouted to your liver.” Thanks, doc. The bartender could’ve told me that for a lot less. And he would’ve poured me a drink while he did it.

But it gets better. It always gets better with these people. The next trick up their sleeve is the “AI Digital Twin.” They want to create a computer simulation of you. A little Henry Chinaski living in the machine. They can poke and prod this digital ghost, throw different therapies at him, see what makes him tick without messing up the real thing.

The idea is stolen from factories, where they make digital twins of machines to see when they’ll break down. Extending that to people? It’s a special kind of beautiful madness. My digital twin… what would he be like? Would he have my bad back and my collection of parking tickets? Would he wake up with a mouth that tastes like a graveyard? Or would they give him a clean slate? Maybe my digital twin would be a morning person. Maybe he’d take up jogging and quit smoking. They’d “cure” him in the machine, and he’d be a well-adjusted, productive member of a digital society. Meanwhile, I’m still here, the original model, coughing on the smog of a Tuesday afternoon, trying to find a clean sock. Do I get a discount on the therapy since they practiced on the stunt double? What if they decide the twin is the superior model and try to delete the original? I’ve seen that movie. It doesn’t end well for the guy made of meat.

And then there’s the grand finale, the real kicker. They call it “ambient intelligence.” It’s not enough to scan your brain and build a voodoo doll of your psyche. No, they want to watch you. All the time. They want to use the cameras in your smart home, the microphone in your phone, the sensors in your watch, the goddamn smart thermostat, to build a complete profile of your mental state.

They’ll analyze the “gait of your walk” as you stumble to the kitchen for a glass of water. Is he walking with depressed shoulders? Is that an anxious shuffle? No, you bastards, I’m walking funny because I slept on the floor again and my leg is asleep. They’ll listen to the tone of my voice. They’ll probably have my smart fridge report me to the authorities for the science experiments growing in the vegetable crisper. “Subject exhibits signs of culinary nihilism and disregard for expiration dates.”

This isn’t healthcare; it’s the quiet, bloodless construction of a prison where the walls are made of data and the warden is an algorithm. All your little imperfections, your glorious human flaws, your private sadnesses—they all become data points in a chart. Another way to be measured and found wanting. I like my misery to be mine. It’s one of the few things I truly own. I don’t need a machine to catalog it for a future shareholder meeting.

The article tells this touching story about a doctor who, after giving advice to the wife of a dying man, went home and asked ChatGPT what it would have said. The AI gave a lovely, sensitive, empathetic response. The doctor was “humbled.” Humbled? I’d say he was spooked. The machine faked humanity better than he could. The key word is faked. The AI doesn’t feel a thing. It’s just a pattern-matching machine, regurgitating the most statistically “caring” words from the billions of pages it ate. It’s the ultimate psychopath—perfectly mimicking emotion without a trace of it.

And the punchline to this whole cosmic joke is that these systems are going to create a “huge demand for therapists.” The AI will flag millions of us for not being happy enough, for walking with a “sub-optimal gait,” and refer us all to a human. It’s a perpetual motion machine of manufactured illness. We build machines to tell us we’re broken, then we pay people to listen to us complain about the machines. It’s perfect.

The article ends with a quote from Eleanor Roosevelt about the future belonging to those who believe in the beauty of their dreams. Good for her. My dreams usually involve me being naked in the post office trying to sort mail that’s on fire. What these tech saviors call a dream, I call a nightmare in a clean room. They think they can file down the sharp edges of the human soul with algorithms. They think they can fix the chaos with code.

They can’t. The ache is part of the deal. The absurdity is the price of admission. You don’t cure it. You just live with it. You find a woman who laughs at your bad jokes, a bottle of something strong enough to kill the ghosts for a few hours, and a typewriter that doesn’t judge you. That’s all the precision you need.

Time for some precision of my own. Straight, no chaser.


Source: Precision Mental Health Gets Precisely Boosted Via Innovative Uses Of Advanced AI And LLMs

Tags: ai dataprivacy surveillance digitalethics humanainteraction