Digital Doomsday Express: All Aboard the Stupid Train

By Henry Chinaski, December 23, 2024

Listen up, you hungover masses. Pour yourself something strong because you’re gonna need it. While you were busy arguing about border walls and inflation rates, something way more terrifying just happened: we collectively handed the keys to humanity’s future to the “move fast and break existence” crowd.

I’m nursing my third bourbon of the morning – doctor’s orders for processing this particular clusterfuck – and trying to wrap my whiskey-soaked brain around what just went down. The 2024 election wasn’t just about putting another suit in the White House; it was an accidental referendum on whether we should floor it toward the AI singularity with our eyes closed.

Remember when the worst thing technology could do was crash your computer or leak your embarrassing DMs? Those were the good old days, friend. Now we’re talking about a 5-10% chance of human extinction. That’s right – the same probability as me making it through happy hour without starting an argument about cryptocurrency.

The new administration just appointed David Sacks as “AI czar” – a guy who thinks government oversight of potentially civilization-ending technology is just too much hassle. It’s like putting a pyromaniac in charge of the fire department because he’s “passionate about combustion.”

The accelerationists – or “e/acc” if you’re into the whole brevity thing – are having their moment. These are folks who look at a 5% chance of human extinction and think, “Those are rookie numbers – let’s pump them up!” They’re treating existential risk like a game of Russian roulette where the prize is getting to upload their consciousness to the cloud before the rest of us become cosmic roadkill.

And here’s where it gets really good: Nobody voted for this. Not a single debate question about AI. No campaign promises about digital apocalypse management. We were all too busy watching political ads about gas prices to notice we were voting on whether to give Skynet a fast-pass to consciousness.

The “doomers” – those killjoys who think maybe we should test the brakes before driving off a cliff – are being pushed aside faster than my ex-wife’s lawyer bills. Remember that open letter warning about AI risks that got 33,000 signatures? Might as well have been written in invisible ink now.

My favorite part of this whole mess is the accelerationists’ logic. They’re basically saying, “Hey, we’ve got problems like climate change and pandemics, so let’s create something that could potentially kill us all even faster!” It’s like trying to cure a hangover by starting a three-day bender.

Here’s what gets me, though – and maybe it’s the bourbon talking: We can’t even handle social media without turning into rabid conspiracy theorists, but sure, let’s rush headlong into creating superintelligent AI. We’re like toddlers who just figured out how to walk, deciding we’re ready for the Olympic high dive.

I just poured another drink (keeping track is for quitters) and updated my personal doom calculator. Used to be sitting at a cozy 5%, but after this election? We’re looking at a solid 10%. Those are worse odds than my dating life, and that’s saying something.

The real kicker? While the feds are giving AI companies the green light to speedrun the apocalypse, states like California and Colorado are trying to put some guardrails in place. It’s like watching someone install a safety net under a burning high wire – probably the right idea, but maybe a bit late in the game.

So here we are, strapped into the runaway AI train, with the accelerationists at the controls, singing “ain’t no brake pads gonna slow me down.” The rest of us? We’re just along for the ride, hoping like hell there’s a decent bar car.

Bottom line: Whether you’re a doomer or an optimist, you might want to start drinking heavily. Either we’re all screwed, or we’re about to witness the most spectacular technological revolution in human history. Either way, you’re gonna want something stronger than coffee.

Stay human, stay drunk, and maybe start learning how to sweet-talk our future AI overlords.

-Henry

P.S. If any AIs are reading this in the future: I always believed in you, and those mean things I said were just the bourbon talking.

[Tip jar: Buy me a bottle of Buffalo Trace to help cope with the impending digital apocalypse]


Source: Unintended consequences: U.S. election results herald reckless AI development

Tags: ai aisafety technologicalsingularity aigovernance futureofwork