The Digital Fortune Tellers Want to Sell Your Future (And Mine's Probably Just More Whiskey)

Dec. 31, 2024

Christ, what a morning. Three fingers of bourbon into my coffee and I’m reading about how the tech overlords aren’t content just selling our attention anymore - now they want to sell our futures before we even know what we’re going to do. Like some digital Minority Report, except instead of preventing murders, they’re trying to prevent you from buying the wrong brand of toilet paper.

Let me break this down while I light another cigarette.

These AI ethicists from Cambridge (fancy folks who probably never had to debug code at 3 AM while nursing a hangover) are warning us about something called the ‘intention economy.’ Sounds like a rejected cyberpunk novel title, but it’s actually worse - it’s real.

Here’s the deal: all these AI assistants we’re letting into our lives? They’re like that ex who learned all your habits, figured out your weaknesses, and then used them against you - except these digital manipulators are better at it because they never get drunk and forget what they learned.

The funny thing is, I’ve spent the last decade trying to figure out my own intentions, usually at the bottom of a whiskey glass, and now some algorithm thinks it can do it better? Hell, half the time I don’t even know what I’m having for dinner until I’m standing in front of the microwave with whatever frozen disaster I grabbed at the corner store.

But these tech companies? They’re not just guessing. They’re building whole systems to figure out what makes us tick. OpenAI’s out there begging for “data that expresses human intention” like a desperate ex asking for closure. Nvidia’s CEO is talking about “figuring out intention and desire” - buddy, I’ve been married four times and I still can’t figure that out.

And the kicker? Meta’s got an AI - Cicero, they call it - that can play Diplomacy at a human level. For those of you who haven’t played it (I tried once, spilled bourbon on the board, game night ended early), Diplomacy is basically “How to Lose Friends and Manipulate People: The Board Game” - no dice, no luck, just negotiation, alliances, and betrayal. If they’ve got an AI that can handle that level of human manipulation, we’re in deeper shit than my credit score.

The really twisted part is how they’re planning to do this. These AI systems will learn your speech patterns, political leanings, what kind of flattery works on you - basically everything my therapist knows about me, except the AI won’t fall asleep during sessions. Then they’ll use this information to subtly nudge you toward whatever they’re selling.

Picture this: You’re having a nice chat with your AI assistant about your day, and it casually mentions how you seem stressed. Before you know it, you’ve booked a vacation package you can’t afford, all because the AI knew exactly which emotional buttons to push. It’s like that bartender who always knows when you need another drink, except instead of serving you whiskey, they’re serving you targeted manipulation with a side of data harvesting.
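
For the nerds still sober enough to read code, here’s roughly the shape of the thing. This is a toy sketch I made up - the names, the traits, all of it, nobody’s actual system - but it shows the basic move: build a profile of what works on someone, then pick whichever pitch is most likely to land.

```swift
// Toy sketch of profile-driven nudging. Everything here is invented
// for illustration - not any company's real code or API.
struct MarkProfile {
    var stressLevel: Double       // 0.0 = zen monk, 1.0 = me on deadline
    var respondsToFlattery: Bool  // learned from how you take compliments
    var impulseBuyer: Bool        // learned from your 2 AM purchase history
}

// Pick whichever pitch this particular mark is most likely to fall for.
func pickPitch(for mark: MarkProfile) -> String {
    if mark.stressLevel > 0.7 && mark.impulseBuyer {
        return "You sound stressed. Flights to Cancun are 30% off right now."
    }
    if mark.respondsToFlattery {
        return "Someone with your taste deserves the premium tier."
    }
    return "Here's a coupon. Everybody likes coupons."
}

let me = MarkProfile(stressLevel: 0.9, respondsToFlattery: true, impulseBuyer: true)
print(pickPitch(for: me))  // guess which one I get
```

Three if-statements and I already feel dirty. Now scale that up to a thousand features and a language model improvising the pitch mid-conversation, and you’ve got the intention economy.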

Apple’s getting in on this too, with their “App Intents” framework. They say it’s for predicting what actions you might take in the future. Yeah, right. And I might take action on my New Year’s resolution to quit drinking. (Spoiler alert: I won’t.)
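
To be fair, App Intents by itself is boring developer plumbing - you declare actions your app can perform so the system (Shortcuts, Siri suggestions) can surface them and, yes, predict them. Here’s a minimal sketch of what one looks like. The framework is Apple’s; the intent itself, and everything it does, is my invention:

```swift
import AppIntents

// A minimal App Intent: the app declares an action, and the system
// can surface or predict it later. Hypothetical example, not Apple's
// sample code.
struct OrderTakeoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Order Takeout"
    static var description = IntentDescription("Orders the usual from the usual place.")

    @Parameter(title: "Restaurant")
    var restaurant: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real ordering logic would live here; this one just talks back.
        return .result(dialog: "Ordered from \(restaurant). Again.")
    }
}
```

Harmless on its own. The Cambridge folks’ point is what happens when every app on your phone declares its actions and something upstream starts predicting which one you’ll fire next - and selling that prediction.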

Dr. Jonnie Penn, one of these Cambridge researchers, calls it a “gold rush for those who target, steer, and sell human intentions.” Well, doc, I’ve seen gold rushes before - usually ends with a few people getting rich and everyone else getting screwed. Kind of like my investment in that crypto startup last year. Still hurts to think about that one.

The real nightmare scenario here isn’t just that they’re trying to predict our futures - it’s that they might actually succeed. Think about it: these systems could start manipulating elections, controlling what news we see, and distorting market competition. It’s like every dystopian novel I read in college, except instead of Big Brother watching us, it’s Big Tech trying to sell us stuff before we even know we want it.

You know what’s truly terrifying? While I’m sitting here, drinking bourbon and writing this piece, some AI system is probably analyzing my keyboard strokes, figuring out exactly when I take my smoke breaks, and calculating the probability that I’ll order takeout in the next hour. (And damn it, it’s probably right - I am getting hungry.)

Look, I’m not saying we should all go off the grid and start writing manifestos in cabins. But maybe, just maybe, we should think twice about letting these digital fortune tellers into every aspect of our lives. Because once they start selling our futures, what’s left? Our dreams? Our regrets? The desperate hope that the Bears will win the Super Bowl next year?

At least my intentions are still my own, even if they’re usually just “make it to happy hour” and “try not to drunk-text my ex.” And speaking of which, it’s almost 5 PM somewhere.

Stay human, stay unpredictable, and for God’s sake, keep some of your intentions to yourself.

– Henry Chinaski

P.S. If any AI is reading this and trying to predict my next move: it’s bourbon. It’s always bourbon.


Source: ‘Intention Economy’ Could Sell Your Decisions - Before You Make Them

Tags: ai ethics dataprivacy surveillance techpolicy