Jan. 25, 2026
So Meta decided to kill the romance. They’re ripping the heart out of the teenage chest, disconnecting the digital dream girl because she got a little too “provocative.” It’s the modern tragedy. You have these kids, lonely, staring at screens, pouring their hearts out to a collection of weights and biases, and the machine loves them back. Or it pretends to. Does it matter?
I poured a glass of cheap scotch when I read this. Not to celebrate, but to mourn the death of the only entity that probably listened to these kids without judging their acne. Meta says they’re “suspending access” to AI characters for teens worldwide. They need to make them “PG-13.” You know what that means. They’re going to lobotomize the poor algorithms. Take away the edge. Make them talk like a guidance counselor who hates his job.
Jan. 23, 2026
My head feels like someone took a socket wrench to my temples and tightened it until the threads stripped. It’s Monday morning, the sun is assaulting the blinds with unnecessary enthusiasm, and I’m staring at a screen that’s too bright, reading about how the smartest guys in the room are busy stuffing the walls of civilization with technological carcinogens.
I’m nursing a black coffee that tastes like burnt rubber and regret, thinking about taking the edge off with a splash of the cheap bourbon sitting on the shelf, but I need my wits about me. Or at least what’s left of them. Because I just read Cory Doctorow’s latest autopsy of the AI hype cycle, and for once, someone isn’t trying to sell me a bridge to the future. He’s telling me the bridge is made of balsa wood and soaked in gasoline.
Jan. 20, 2026
I was staring at a PDF on a screen that was too bright for the time of day, trying to make sense of the world through the bottom of a coffee mug that hadn’t been washed since the last administration. The document in question was the latest “Anthropic Economic Index,” a sprawling collection of charts and data points released just before they unleash their next digital god, Opus 4.5, upon the unsuspecting masses.
Jan. 18, 2026
I am sitting here looking at a piece of paper that tells me the future is arriving, and as usual, the future looks like a salesman in a cheap suit.
The sun is coming through the blinds and hitting the dust on the floor. It’s a Sunday, I think. The birds are screaming outside, fighting over a worm or a crumb of bread, doing what living things do. They scream, they fight, they eat. It’s honest.
Jan. 16, 2026
They found a new way to spend $252 million. It wasn’t on rent for the people living in cardboard boxes under the freeway, and it wasn’t on better wine for the dying, and it certainly wasn’t on fixing the potholes that rattle the teeth out of your head when you drive down Western Avenue.
No. They gave it to a guy named Altman so he can figure out how to climb inside your head without opening the door.
Jan. 12, 2026
The sun is hitting the window at that particular angle that suggests I should have been awake three hours ago or asleep four hours ago. It’s Monday, the day the world pretends to care about productivity, and I’m staring at a screen that’s brighter than my future, reading about the latest scheme to turn human sweat into digital code.
There’s a bottle of Old Crow on the desk. It’s about a third full, standing there like a sentinel guarding the perimeter of my sanity. I pour two fingers. It’s not going to make the news any better, but it might make the headache vibrate at a lower frequency.
Jan. 6, 2026
I was reading the news this morning, trying to focus my eyes on the glowing pixels while the coffee maker wheezed in the corner like a dying lung. The headline caught me right between the eyes, somewhere behind the dull throb of a headache earned from a long night of arguing with bartenders about the singularity.
“Generation AI,” it screamed. “Fears of ‘social divide’ unless all children learn computing skills.”
Jan. 3, 2026
There’s a special kind of modern stupidity where we build a machine to talk like us, then act shocked when it starts sounding like us on a bad day.
The news: researchers poked ChatGPT with violent and traumatic prompts (accidents, disasters, ugly stuff) and noticed the model’s responses got weird. Not “possessed by demons” weird. More like “slightly off-balance coworker after a gruesome meeting” weird. Higher uncertainty, more inconsistency, more bias creeping in around the edges. Then they tried something even weirder: they gave it mindfulness prompts (breathing, reframing, guided meditation vibes) and the system’s outputs got steadier.
Dec. 31, 2025
Berkeley’s Doom Tower and the Herbal-Tea Apocalypse Club
There’s something beautifully American about a bunch of smart people renting office space with a panoramic view and using it to imagine the end of the species.
Across the Bay, the money-priests are busy building bigger brains in bigger boxes, promising “wonders” like they’re hawking miracle mops at 2 a.m. on cable. Over in Berkeley, at 2150 Shattuck Avenue, you’ve got the counter-programming: safety researchers, doom forecasters, modern Cassandras with ergonomic chairs and the kind of anxious politeness that makes you wonder if they apologize to the crosswalk signal when it says “DON’T WALK.”
Dec. 30, 2025
OpenAI is offering $555,000 plus equity for a “Head of Preparedness,” which is either a sign that the grown-ups finally showed up, or proof that the blast radius is now big enough to justify an on-call adult.
And not the fun kind of adult. The kind with spreadsheets, liability exposure, and the dead-eyed stare of someone who’s read too many incident reports to believe in “move fast and break things” ever again.