Look, I’d normally be three bourbons deep before tackling another Sam Altman prophecy, but my doctor says I need to cut back. So here I am, disappointingly sober, reading through Sam’s latest blog post about how OpenAI has “figured out” AGI. And buddy, let me tell you - a hangover would’ve been easier to stomach than this.
You know what this reminds me of? Every guy at my local bar who’s “figured out” how to get rich quick. They’ve got systems, they’ve got plans, they’ve got everything except actual results. But hey, they just need a little more cash to make it happen. Sound familiar?
The best part? Altman says they know how to build AGI “as we have traditionally understood it.” That’s like me saying I know how to achieve enlightenment “as we have traditionally understood it.” It’s the kind of phrase that sounds meaningful until you realize it’s emptier than my liquor cabinet the morning after payday.
Let’s break down what’s really happening here. OpenAI, our favorite not-actually-open company, is playing the oldest game in the book: the “we’re almost there” hustle. It’s like that friend who’s perpetually “just about” to quit smoking, start exercising, or pay back the money they owe you. The timeline keeps shifting, but the story stays the same.
And the kicker? They’re talking about AI “agents” joining the workforce this year. Sure, and I’m going to win a marathon this weekend. These AI agents are about as real as my commitment to sobriety - theoretical and highly unlikely to materialize.
What really gets me is the phrase “magic intelligence in the sky.” That’s Altman’s own description of AGI from a while back. You’ve got to admire the honesty there - at least he’s acknowledging they’re basically promising magic. It’s like selling tickets to a unicorn rodeo. Sure, it sounds amazing, but have you ever actually seen a unicorn?
The truth is, this whole thing reads like a venture capital mating call. “We know how to do it, we just need more money, more computers, more everything.” I’ve heard better pickup lines at last call. At least those come with the promise of immediate disappointment rather than the prolonged kind.
Here’s what’s really going on: OpenAI needs more funding. Those data centers aren’t going to build themselves, and Sam’s collection of black t-shirts isn’t going to expand on its own. So they’re doing what any good startup does - making promises big enough to make investors’ wallets open automatically.
But let’s talk about these AI “agents” that are supposedly going to “materially change the output of companies.” You know what that means in human speak? They’re planning to replace workers faster than I replace empty whiskey bottles. And trust me, that’s pretty damn fast.
The really rich part is when Altman says, “This sounds like science fiction right now, and somewhat crazy to even talk about it.” Well, Sam, my man, you got one thing right - it does sound crazy. It sounds about as crazy as me claiming I’ve figured out how to teleport, I just need a few billion dollars and some quantum computers to make it happen.
Look, I’m not saying they’re not doing impressive things over there at OpenAI. They are. But there’s a galaxy of difference between “we made a chatbot that can write decent poetry” and “we know how to create artificial general intelligence.” That’s like saying because you can make a paper airplane, you know how to build a space shuttle.
The experts are divided on this one, which is expert-speak for “nobody really knows what the hell is going on.” Some say we’re on the cusp of AGI, others say we’re about as close to AGI as I am to becoming a yoga instructor. Personally, I’m betting on the latter.
But hey, what do I know? I’m just a tech writer who spends too much time in bars contemplating the philosophical implications of AI while nursing my bourbon. Maybe Altman really has cracked the code. Maybe next year we’ll all be having deep conversations with our toasters about the meaning of life.
Until then, I’ll keep watching this circus from my barstool, maintaining a healthy skepticism and an unhealthy blood alcohol level. Because in this industry, sometimes the only rational response is to pour yourself another drink and watch the show.
Now, if you’ll excuse me, my bottle of Buffalo Trace is giving me that look again. And unlike AGI, I know exactly how to handle that situation.
Yours truly from the edge of sobriety, Henry Chinaski
P.S. If any OpenAI investors are reading this, I too have figured out how to build AGI. Just send me a few billion dollars, and I’ll get right on that. I accept payment in cash or premium bourbon.
Source: Sam Altman Says OpenAI Has Figured Out How to Build AGI