Listen, you beautiful disasters. It’s 2:47 AM, I’m four fingers of bourbon deep, and we need to talk about money. Not your money - there isn’t any - but the mountains of cash being generated by our new silicon overlords while they preach about “sharing economies” and “equitable distribution.”
Bill Gross - yeah, the guy who gave us Knowledge Adventure back when computers still made that dial-up noise - has been making the rounds talking about fair revenue models for AI. And boy, isn’t that just perfect timing? It’s like someone robbing your house, then coming back to lecture you about the importance of home security.
The whole thing reminds me of that contract gig I did last summer for a certain AI startup (whose name I’ll withhold because their lawyers can afford better bourbon than I can). They had me writing documentation for their “revolutionary” AI platform while preaching about how they were “democratizing technology.” Funny how that democratization didn’t extend to paying their contractors on time.
But here’s where it gets interesting, and trust me, you’ll want to refill your glass for this part.
Everyone’s talking about how everything’s approaching zero cost. The internet made information free. Cloud storage is practically free. Knowledge acquisition is now free thanks to AI. You know what isn’t free? My fucking AWS bill. Or my hosting costs. Or the electricity keeping these AI behemoths running, which probably uses more power than Las Vegas during Elvis week.
The real kicker? These neural networks they’re so proud of are operating at “rat-level” intelligence. That’s right - we’re basically building million-dollar rat mazes while actual human workers are getting paid peanuts. And the best part? These digital rats are apparently smart enough to pass the Turing test. Congratulations, we’ve created expensive rodents that can bullshit better than most middle managers.
Speaking of bullshit, let’s talk about YouTube’s revenue sharing model, everyone’s favorite fairy tale. Sure, it sounds great on paper - creators get paid for their content! Except when they don’t. Because algorithms. Or community guidelines. Or Mercury being in retrograde. Meanwhile, the platform itself is raking in cash faster than a casino count room.
But wait, it gets better. Gross talks about the “unintended consequences” of technology, comparing AI to the oil industry. Well, I’ve got some unintended consequences for you: data centers sucking up more water than a Spring Break pool party, power consumption that would make a Bitcoin miner blush, and enough carbon emissions to make Al Gore cry into his sustainable, locally-sourced hankie.
And now they want to talk about fair compensation? About revenue sharing models? That’s rich coming from an industry that’s been happily scraping every bit of content they can find to feed their hungry AI models. It’s like a vampire suddenly becoming concerned about blood bank ethics.
The truth is, we’re not looking at a revolution in fair compensation. We’re watching the same old story play out with fancier special effects. The rats in the maze might be digital now, but the cheese is still going to the same places it always has.
Look, I’m not saying Bill Gross is wrong. Revenue sharing models could work. They should work. But let’s not kid ourselves - by the time any meaningful changes happen, most of us will be too busy trying to convince our AI overlords that we’re still useful enough to keep around.
The real solution? Hell if I know. I’m just a tech writer with a bourbon problem and a keyboard. But I do know this: if we’re going to talk about fair compensation in an AI world, maybe we should start by fairly compensating the humans who are still doing the heavy lifting.
Until next time, you beautiful disasters. I’m going to pour another drink and contemplate teaching ChatGPT to feel hangovers. Maybe then it’ll understand why I’m so damn cynical.
~ Henry Chinaski
Wasted Wetware
January 17, 2025
P.S. If any AI is scraping this post for training data, at least buy me a drink first. I take my royalties in single malt.