Your New Financial Planner Is a Soulless Robot

Sep. 14, 2025

Another morning, another reason to wonder if the whole human experiment hasn’t finally jumped the shark. I’m staring at my screen, the glow of it mocking the empty coffee pot, and I come across a story that’s so perfectly, beautifully stupid it almost makes me want to believe in a higher power, just so I can curse him for his sense of humor.

The news, delivered by the Gray Lady herself, is that people are now turning to ChatGPT for financial advice.

Let that sink in.

We’re outsourcing our desperation. We’ve got folks, drowning in a sea of credit card debt and car payments, who are too proud or too scared to talk to another human being—say, a father who’s a goddamn financial planner—but have no problem spilling their guts to a glorified auto-complete function.

Take this Myra Donohue. Has a background in accounting, but the numbers are giving her the shakes. Five grand in the hole, a partner out of work, two kids. It’s the classic American nightmare, gift-wrapped with a white picket fence you can’t afford. Her old man offers to help, but no, she’s got it covered. Her pride is a fortress. A fortress that crumbles the minute she opens her laptop. She plugs her whole sorry situation into a chatbot and asks it to make a budget.

And the machine does it. In seconds. It spits out a “zero-based budget,” which is a fancy way of saying, “give every one of your pathetic dollars a job until your income minus your spending comes out to exactly zero.” She’s pleased. Not because the advice was revolutionary—it was the same cookie-cutter stuff you’d find on any money blog—but because it was fast. And it didn’t judge her.
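For the spreadsheet-inclined, the whole “zero-based” trick fits in a few lines. This is a minimal sketch of the idea, nothing more; the categories and dollar amounts are made up for illustration, not Myra’s actual numbers:

```python
# Zero-based budgeting: assign every dollar of income a job,
# so income minus all allocations comes out to exactly zero.

def zero_based_remainder(income, allocations):
    """Return the unassigned remainder; zero means the budget balances."""
    return income - sum(allocations.values())

# Hypothetical numbers, purely for illustration.
budget = {
    "rent": 1400,
    "groceries": 500,
    "car payment": 350,
    "debt paydown": 400,
    "everything else": 350,
}

leftover = zero_based_remainder(3000, budget)
print(leftover)  # 0 -> every dollar accounted for
```

If the remainder isn’t zero, the method says you either assign the leftover dollars a job or admit you’re overspending. That’s the entire insight the chatbot dressed up as wisdom.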

Of course it didn’t judge her. It’s a machine. It’s a network of circuits and algorithms that would give you the same sterile, passionless advice if you were trying to save for a house or figure out the best way to cook a hamster. It has no concept of the cold sweat you feel when you see the repo man drive down your street. It doesn’t know the bitter taste of cheap whiskey when that’s all you can afford. It just regurgitates patterns it scraped from the internet.

And here’s the beautiful, ugly truth of it all: a survey found that two-thirds of people who’ve used this crap have used it for money advice. And 80 percent of those people said it helped. That’s what they say. But then you keep reading. You push past the shiny PR-friendly numbers and find the grit. The same survey found that more than half of the people who actually acted on the advice made a poor decision.

A coin toss. You’re letting a coin toss decide if you eat or if the bank eats you.

Then we get to Jennifer Allan. A real estate agent with a newborn and $23,000 in credit card debt. She’s staring into the abyss, and what does she do? She turns to the machine and says, “Help me.” And the machine, in its infinite, plagiarized wisdom, launches her on a 30-day “challenge.”

This is where the story stops being a tragedy and becomes a full-blown circus. The chatbot has her selling a watermelon with her debt total tattooed on it for $51. It has her donating plasma for 80 bucks, even though she’s terrified of needles. It even suggests she sell pictures of her feet. She actually did it, for a day, before the shame got to be too much and she deleted the post.

Let’s pause here. I need a cigarette.

Think about what we’re seeing. A woman is so desperate that she’s taking life advice from a machine that thinks selling foot pics is a viable part of a diversified financial portfolio. And people are documenting this on TikTok like it’s some kind of heroic journey. It’s not heroic. It’s a goddamn freak show. It’s what happens when you strip all the humanity out of a human problem. You’re left with absurdity. You’re left with debt-ridden watermelons and a momentary flirtation with the world of online perverts. At least when you hit the dog track and lose your shirt, you get a story out of it. You get the smell of stale beer and cheap cigars. You get to see the desperation in other men’s eyes. You get to feel something real.

And then there’s the stock market genius, Alexander Stuart. Fresh off a breakup, stuck in a dead-end job. He’s got 400 bucks to his name and a dream. He asks ChatGPT to act as a “free college” to teach him how to “become one of the greatest traders.”

The sheer, unadulterated hubris. It’s breathtaking. It’s like asking a parrot to teach you astrophysics.

So the machine tells him to buy AMD stock. He does. And his $400 doubles. He’s king of the world. He’s on top of the goddamn mountain. He’s probably already picking out the color for his Lamborghini. He starts trading daily, following the bot’s every command. His account balloons to a whopping $1,600. That’s a good weekend for some guys I know at the bar, but for him, it’s a revelation.

But then, the other shoe drops. It always does. The bot tells him to make a move on Nvidia based on “new” data. He loses $60. Turns out the data was days old. The all-knowing oracle was just reading yesterday’s newspaper. The magic trick was revealed for what it was: a cheap illusion. Now he’s a “sophisticated” investor, cross-referencing ChatGPT’s advice with another chatbot. He’s fighting fire with fire, or more accurately, fighting one idiot with another.

What this all comes down to isn’t the technology. I don’t give a damn about the large language models or the neural networks. This is about us. It’s about being so terrified of looking another person in the eye and admitting failure that we’d rather confess our sins to a toaster. We want a “professional kind of service,” as Myra said, but we don’t want to pay for it, and we don’t want the discomfort of a real conversation. We want a priest who won’t assign penance. A bartender who won’t cut you off.

The machine gives you a plan. It gives you tasks. It makes you feel like you’re doing something. You’re not just wallowing in your own failure; you’re proactively engaging with a data-driven solution. It’s all bullshit, of course. It’s the illusion of control. Selling a watermelon with your debt on it doesn’t fix the systemic problem that put you there. It’s just a bizarre, sad performance for an online audience.

I’d rather take my financial advice from the guy three stools down who smells like cabbage and claims to have invented a new kind of bottle cap. His advice is probably just as bad, but at least he’s real. At least when he’s lying to me, he’s doing it with his whole, pathetic, human heart. He’s not just executing a command.

These people think they’re hacking the system. They think they’ve found a shortcut. But all they’re doing is taking the messiest, most painful, most human parts of life—money, fear, hope, failure—and feeding them into a sterile black box that will never understand any of it. It doesn’t know the joy of a win or the gut-punch of a loss. It just knows the next word in the sequence.

And you know what? Maybe that’s the appeal. It’s a clean failure. A sanitized disaster. No mess, no tears, just an error message. But I’ll take my disasters messy, thank you very much. I want them human. I want them soaked in booze and regret. It’s the only way you know you’re still alive.

Time to go calculate my own liquid assets. One glass at a time.


Source: How People Are Using ChatGPT for Financial Advice

Tags: ai chatbots ethics aisafety humanainteraction